NeuroPrune: A Neuro-inspired Sparse Training Algorithm for Efficient Large Language Models
NeuroPrune is a neuro-inspired topological sparse training algorithm that exploits mechanisms observed in biological neural networks, such as preferential attachment and redundant synapse pruning, to produce efficient yet performant large language models across diverse NLP tasks.
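The abstract names two biological mechanisms: pruning of redundant (weak) synapses and preferential attachment, where well-connected neurons attract new connections. A minimal sketch of how such a prune-and-regrow step might look on a sparse weight mask — the function name, matrix sizes, and degree-proportional regrowth probabilities below are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regrow(weights, mask, prune_frac=0.1):
    """One sparse-training step: drop the weakest active connections
    (redundant-synapse pruning) and regrow the same number of new ones,
    with probability proportional to each neuron's current degree
    (preferential attachment)."""
    active = np.argwhere(mask == 1)
    n_step = max(1, int(prune_frac * len(active)))
    # Prune: remove the smallest-magnitude active weights.
    mags = np.abs(weights[tuple(active.T)])
    drop = active[np.argsort(mags)[:n_step]]
    mask[tuple(drop.T)] = 0
    # Regrow: sample currently inactive edges, weighted by the
    # degrees of their endpoint neurons (+1 avoids zero probability).
    deg_out = mask.sum(axis=1) + 1
    deg_in = mask.sum(axis=0) + 1
    inactive = np.argwhere(mask == 0)
    probs = deg_out[inactive[:, 0]] * deg_in[inactive[:, 1]]
    probs = probs / probs.sum()
    grow = inactive[rng.choice(len(inactive), size=n_step, replace=False, p=probs)]
    mask[tuple(grow.T)] = 1
    return mask

W = rng.normal(size=(8, 8))
mask = (rng.random((8, 8)) < 0.3).astype(int)
before = mask.sum()
mask = prune_and_regrow(W, mask)
assert mask.sum() == before  # overall sparsity level is preserved
```

Because exactly as many edges are regrown as pruned, the network stays at a fixed sparsity level while its topology gradually rewires toward hub-like connectivity.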