Key Concepts
Research on language models can benefit from insights from emergent communication, advancing the study of language acquisition.
Summary
The abstract highlights the importance of identifying commonalities between language models and humans as a path to breakthroughs in understanding language evolution.
Neural language models have influenced linguistic theories, showcasing a thriving relationship between machine learning and linguistics.
The emergent communication literature excels at designing models that recover natural-language phenomena initially absent from the languages neural agents develop.
Key pressures identified include communicative success, efficiency, learnability, and psycho-/sociolinguistic factors.
The paper discusses mismatches between neural agents and humans regarding Zipf's law of abbreviation, the benefits of compositional structure, and group-size effects.
These mismatches are resolved by introducing inductive biases such as laziness, impatience, and population heterogeneity.
Cognitive and psycholinguistic factors affecting language evolution are explored, alongside pressures for communicative success, efficient communication, learnability, and memory constraints.
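One of the mismatches above concerns Zipf's law of abbreviation: in natural languages, more frequent words tend to be shorter, while emergent protocols often lack this pattern. As an illustrative sketch (toy corpus, not data from the paper), the law can be tested by checking for a negative rank correlation between word frequency and word length:

```python
# Illustrative check of Zipf's law of abbreviation: frequent words
# should tend to be shorter, i.e. frequency and length should be
# negatively rank-correlated. Toy corpus for demonstration only.
from collections import Counter

corpus = (
    "the cat sat on the mat and the dog ran to the cat "
    "the quick brown fox jumped over the lazy dog near the mat"
).split()

freqs = Counter(corpus)
frequencies = [count for _, count in freqs.items()]
lengths = [len(word) for word, _ in freqs.items()]

def ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

rho = spearman(frequencies, lengths)
print(f"Spearman rho (frequency vs. length): {rho:.2f}")
# A negative rho is the signature of the law of abbreviation.
```

The rank correlation is implemented by hand here to keep the sketch dependency-free; in practice `scipy.stats.spearmanr` would serve the same purpose.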
Statistics
"The unprecedented success of language models in recent years provides many opportunities to further advance our understanding of human language learning." (Bahdanau et al., 2015; Brown et al., 2020; Devlin et al., 2019; Raffel et al., 2020; Vaswani et al., 2017)
Quotations
"The emergence of new communication systems is similarly studied using deep neural network models."
"Compositional structure is considered a hallmark feature of human language."
"In general, while emergent communication simulations are tuned for communicative success by design..."