Core Concepts
Pre-trained language models (PLMs) generalize effectively to code-switched (CS) text, distinguishing CS from monolingual text at both the sentence and the token level (see the sketch below).
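To make the detection setup concrete, here is a minimal illustrative sketch of how sentence-level CS detection can be framed as binary sequence classification with a multilingual PLM. This is not the paper's actual experimental setup: the model name, label scheme, and example sentence are all assumptions. Token-level detection is analogous, with a per-token classification head (e.g. AutoModelForTokenClassification) instead of a sentence-level one.

```python
# Illustrative sketch only: sentence-level code-switching detection framed as
# binary classification with a multilingual PLM. Model, labels, and the example
# sentence are assumptions for illustration, not the paper's configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # assumed multilingual PLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Assumed labels: 0 = monolingual, 1 = code-switched
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

sentence = "I'll call you later, ¿vale?"  # hypothetical Spanish-English CS example
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# The freshly initialized classification head is random until fine-tuned on
# labeled CS/monolingual data; this only shows the task framing.
print("Predicted class:", logits.argmax(dim=-1).item())
```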
Statistics
"Our findings reveal that pre-trained language models are effective in generalising to code-switched text."
"Our results indicate that PLMs are able to distinguish between CS text at a sentence level and token level in our detection experiment."
Quotes
"Despite its widespread use online and recent research trends in this area, research in code-switching presents unique challenges."
"Our findings seem to indicate that PLMs are surprisingly good at generalising across CS text."