Multilingual Summarization with Parameter-Efficient Fine-Tuning: An Empirical Study of Low-Rank Adaptation
Low-Rank Adaptation (LoRA) is a competitive alternative to full fine-tuning for multilingual summarization, particularly in low-data and cross-lingual transfer scenarios.
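To make the parameter-efficiency claim concrete, the following is a minimal, dependency-free sketch of the core LoRA idea: the frozen weight matrix W is augmented with a trainable low-rank update (alpha / r) * B @ A, and at inference the update can be merged back into W. All dimensions, names, and values here are illustrative assumptions, not taken from the study; real implementations wrap transformer linear layers (e.g. in PyTorch).

```python
# Toy sketch of the LoRA low-rank update (hypothetical example; real code
# would attach B and A to frozen linear layers inside a transformer).
import random

def matmul(A, B):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def lora_delta(B, A, alpha, r):
    """Low-rank update: delta_W = (alpha / r) * B @ A."""
    scale = alpha / r
    return [[scale * x for x in row] for row in matmul(B, A)]

d, k, r = 8, 8, 2            # frozen weight is d x k; adapters have rank r
random.seed(0)
W  = [[random.gauss(0, 1) for _ in range(k)] for _ in range(d)]   # frozen
Bm = [[random.gauss(0, 1) for _ in range(r)] for _ in range(d)]   # d x r, trained
Am = [[random.gauss(0, 1) for _ in range(k)] for _ in range(r)]   # r x k, trained

delta = lora_delta(Bm, Am, alpha=4, r=r)
W_merged = [[w + dw for w, dw in zip(rw, rd)] for rw, rd in zip(W, delta)]

# Merging is exact: W_merged @ x equals W @ x + delta @ x.
x = [[random.gauss(0, 1)] for _ in range(k)]
y_merged = matmul(W_merged, x)
y_two_path = [[a[0] + b[0]] for a, b in zip(matmul(W, x), matmul(delta, x))]

# Parameter efficiency: only B and A are trained, not W.
full_params = d * k          # 64 weights updated by full fine-tuning
lora_params = r * (d + k)    # 32 weights updated by rank-2 LoRA
```

With larger, realistic dimensions (d and k in the thousands, r around 8 to 64) the trained-parameter count drops by several orders of magnitude, which is why LoRA is attractive in the low-data regimes the study considers.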