Auto-Constriction Turning (ACT-MNMT) improves multilingual machine translation by mitigating off-target translation (output in the wrong language) and improving the model's understanding of the translation task.
Effective and efficient adaptation of pre-trained language models for text-centric understanding.
CoT-ER is an effective method for few-shot relation extraction based on in-context learning, outperforming other baseline methods.
Large language models can effectively perform few-shot relation extraction tasks with the CoT-ER approach, outperforming fully-supervised methods.
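To make the in-context learning setup concrete, here is a minimal sketch of building a chain-of-thought prompt for relation extraction: demonstrations pair a sentence and entity mention with an explicit reasoning step before the relation label. The demonstration format and field names are illustrative assumptions, not CoT-ER's exact template.

```python
# Hypothetical prompt builder for CoT-style few-shot relation extraction.
def build_re_prompt(demos, query_sentence, head, tail):
    parts = []
    for d in demos:
        # Each demonstration shows the reasoning before the relation label.
        parts.append(
            f"Sentence: {d['sentence']}\n"
            f"Head entity: {d['head']}  Tail entity: {d['tail']}\n"
            f"Reasoning: {d['reasoning']}\n"
            f"Relation: {d['relation']}\n"
        )
    # The query ends at "Reasoning:" so the model continues with its own
    # reasoning chain and then predicts the relation.
    parts.append(
        f"Sentence: {query_sentence}\n"
        f"Head entity: {head}  Tail entity: {tail}\n"
        f"Reasoning:"
    )
    return "\n".join(parts)

demos = [{
    "sentence": "Marie Curie was born in Warsaw.",
    "head": "Marie Curie",
    "tail": "Warsaw",
    "reasoning": "The sentence directly states where Marie Curie was born.",
    "relation": "place_of_birth",
}]
print(build_re_prompt(demos, "Alan Turing was born in London.",
                      "Alan Turing", "London"))
```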
The keyword-oriented evaluation metric ROUGE-K, which evaluates whether system-generated summaries contain the reference keywords, is important.
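As a rough illustration, a keyword-oriented score of this kind can be computed as keyword recall: the fraction of reference keywords that appear in the system summary. A minimal sketch, assuming single-word keywords, whitespace tokenization, and case-insensitive matching (assumptions; the official ROUGE-K implementation may differ in these details):

```python
def keyword_recall(summary: str, keywords: list[str]) -> float:
    """Fraction of reference keywords found in the summary.

    Sketch of a keyword-oriented metric in the spirit of ROUGE-K;
    assumes single-word keywords and whitespace tokenization.
    """
    summary_tokens = set(summary.lower().split())
    if not keywords:
        return 0.0
    hits = sum(1 for kw in keywords if kw.lower() in summary_tokens)
    return hits / len(keywords)

# Two of three keywords appear in the summary -> 0.667
print(keyword_recall("The model improves translation quality",
                     ["model", "translation", "alignment"]))
```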
Large Language Models (LLMs) substantially improve RST discourse parsing, achieving state-of-the-art results.
Emotion cause analysis that integrates text, audio, and video is important.
The unsupervised pretraining framework SFAVEL achieves state-of-the-art results in fact verification by distilling language model features.
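For intuition, feature distillation of this kind trains a small student network so its representations align with those of a frozen teacher language model. The sketch below uses a plain MSE alignment objective with made-up dimensions and random stand-in tensors; SFAVEL's actual objectives and architecture differ in detail.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumptions, not SFAVEL's configuration).
input_dim, student_dim, teacher_dim = 300, 256, 768

# Small student encoder plus a projector into the teacher's feature space.
student = nn.Sequential(nn.Linear(input_dim, student_dim), nn.ReLU(),
                        nn.Linear(student_dim, student_dim))
project = nn.Linear(student_dim, teacher_dim)
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(list(student.parameters()) + list(project.parameters()),
                       lr=1e-4)

# Stand-ins for a batch of claim inputs and the frozen LM's features.
inputs = torch.randn(32, input_dim)
teacher_feats = torch.randn(32, teacher_dim)

for _ in range(3):  # a few toy training steps
    pred = project(student(inputs))
    loss = loss_fn(pred, teacher_feats)  # pull student features toward teacher
    opt.zero_grad()
    loss.backward()
    opt.step()
```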
A new approach to evaluating Natural Language Processing systems is presented, aiming at fair and reliable performance assessment.
Large Language Models can improve zero-shot information extraction by following annotation guidelines.
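One way to operationalize this is to place the annotation guidelines directly in the prompt, so the model extracts only spans that satisfy each label's definition rather than its own prior notion of the label. The guideline text, labels, and output format below are illustrative assumptions:

```python
# Hypothetical guideline-conditioned prompt for zero-shot entity extraction.
GUIDELINES = {
    "PERSON": "Names of real people. Do not include titles such as 'Dr.'.",
    "LOCATION": "Geographic places: cities, countries, rivers.",
}

def build_ie_prompt(text: str) -> str:
    lines = ["Extract entities following these annotation guidelines:"]
    for label, rule in GUIDELINES.items():
        lines.append(f"- {label}: {rule}")
    lines.append("Return one 'LABEL: span' pair per line.")
    lines.append(f"Text: {text}")
    return "\n".join(lines)

# The PERSON guideline should steer the model to drop the title "Dr.".
print(build_ie_prompt("Dr. Ada Lovelace visited Paris in 1842."))
```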