Core Concepts
Pre-trained transformer models excel in emotion classification tasks, with twitter-roberta-base achieving 92% accuracy despite limited training data.
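A minimal sketch of how such a pre-trained model can be applied, assuming the Hugging Face `transformers` library and the `cardiffnlp/twitter-roberta-base-emotion` checkpoint (the exact checkpoint name used in the source is not stated, so this is an illustrative assumption):

```python
from transformers import pipeline

# Load a pre-trained RoBERTa model fine-tuned for emotion classification
# (assumed checkpoint; swap in whichever twitter-roberta-base variant you use).
classifier = pipeline(
    "text-classification",
    model="cardiffnlp/twitter-roberta-base-emotion",
)

# Classify a sample tweet; returns a list of {"label": ..., "score": ...} dicts.
result = classifier("I just got the best news of my life!")
print(result)
```

Because the model is already pre-trained on large Twitter corpora, little task-specific data is needed to reach strong accuracy.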
Statistics
The pre-trained twitter-roberta-base model reaches 92% accuracy on the emotion classification task.
Quote
"Elements like punctuation and stopwords can still convey sentiment or emphasis and removing them might disrupt this context." - Mahdi Rezapour