Core Concepts
The Multilingual Prompt Translator enables efficient cross-lingual transfer for natural language inference by translating soft prompts learned in the source language into target languages.
Summary
Authors and Affiliations:
Xiaoyu Qiu, University of Science and Technology of China, Hefei, China
Yuechen Wang, University of Science and Technology of China, Hefei, China
Jiaxin Shi, Cloud BU, Huawei Technologies, Beijing, China
Wengang Zhou, University of Science and Technology of China, Hefei, China
Houqiang Li, University of Science and Technology of China, Hefei, China
Abstract:
Built on multilingual pre-trained language models (PLMs), cross-lingual transfer with prompt learning has shown promising effectiveness.
Introduction:
Fine-tuning PLMs requires large amounts of annotated data, which is scarce in low-resource languages.
Methodology:
The Multilingual Prompt Translator (MPT) translates soft prompts learned in the source language into target-language prompts, enabling efficient cross-lingual transfer (see the sketch below).
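To make the idea concrete, here is a minimal sketch of one plausible prompt-translator design: a small residual MLP that maps a soft prompt trained in the source language to a target-language prompt, conditioned on a learned language embedding. The PromptTranslator class, its bottleneck shape, and the language-embedding conditioning are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class PromptTranslator(nn.Module):
    """Illustrative sketch, not the paper's exact architecture: maps a soft
    prompt learned in the source language into a target-language prompt."""

    def __init__(self, hidden_dim: int, num_langs: int, lang_dim: int = 64):
        super().__init__()
        # One learned embedding per target language (assumed design choice).
        self.lang_emb = nn.Embedding(num_langs, lang_dim)
        # Small bottleneck MLP applied token-wise to the prompt, conditioned
        # on the target-language embedding.
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim + lang_dim, hidden_dim // 2),
            nn.ReLU(),
            nn.Linear(hidden_dim // 2, hidden_dim),
        )

    def forward(self, src_prompt: torch.Tensor, tgt_lang: torch.Tensor) -> torch.Tensor:
        # src_prompt: (prompt_len, hidden_dim); tgt_lang: scalar language id.
        lang = self.lang_emb(tgt_lang).expand(src_prompt.size(0), -1)
        # Residual connection keeps the translated prompt close to the source.
        return src_prompt + self.mlp(torch.cat([src_prompt, lang], dim=-1))

# Usage: translate a 16-token soft prompt (XLM-R base width, 768) learned on
# the source language into a prompt for target-language id 3; XNLI covers
# 15 languages, hence num_langs=15.
translator = PromptTranslator(hidden_dim=768, num_langs=15)
en_prompt = torch.randn(16, 768)   # soft prompt trained on the source language
tgt_prompt = translator(en_prompt, torch.tensor(3))
print(tgt_prompt.shape)            # torch.Size([16, 768])
```

The residual connection is one way to bias the module toward reusing what the source prompt already encodes, so only a language-specific correction needs to be learned; the actual paper may realize the translation differently.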
Experiments:
MPT significantly outperforms baselines in few-shot settings on the XNLI benchmark.
Related Work:
Prior work focuses on cross-lingual transfer that leverages multilingual PLMs and prompt learning techniques.
Stats
To efficiently transfer soft prompts, we propose a novel framework called Multilingual Prompt Translator (MPT).
MPT demonstrates superiority over baselines in few-shot settings on XNLI.
MPT yields especially large relative gains when transferring to languages that are distant from the source language.
Citations
"Based on multilingual pre-trained models, cross-lingual transfer with prompt learning has shown promising effectiveness."
"MPT is more prominent compared with vanilla prompting when transferring to languages quite distinct from the source language."