Core Concepts
PROM enhances phrase-level copying for improved abstractive summarization.
Abstract
PROM introduces a new PhRase-level cOpying Mechanism that enhances attention on n-grams, improving factuality and stability in abstractive summarization. The method adds an indicator layer that selects source tokens belonging to n-grams that can be copied verbatim, yielding significant improvements on fine-tuning benchmarks. PROM is also used in self-supervised pre-training on raw corpora, providing new general baselines on a wide range of summarization datasets. The resulting model produces summaries with higher similarity to references, better factuality with respect to the input passages, and scalability across domains.
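The indicator idea can be illustrated with a small sketch. The function below is a hypothetical stand-in for how indicator supervision might be derived (it is not the paper's exact implementation): it labels each source token with 1 if the token lies inside an n-gram (2 ≤ n ≤ `max_n`) that also appears verbatim in the reference summary, i.e. a phrase that could be copied.

```python
def phrase_copy_labels(source, summary, max_n=4):
    """Binary labels over source tokens: 1 if the token is inside an
    n-gram (2 <= n <= max_n) that also appears in the summary.
    A hypothetical sketch of indicator-layer supervision, not PROM's
    exact procedure."""
    # Collect all summary n-grams once.
    summary_ngrams = set()
    for n in range(2, max_n + 1):
        for i in range(len(summary) - n + 1):
            summary_ngrams.add(tuple(summary[i:i + n]))
    # Mark every source token covered by a matching n-gram.
    labels = [0] * len(source)
    for n in range(2, max_n + 1):
        for i in range(len(source) - n + 1):
            if tuple(source[i:i + n]) in summary_ngrams:
                for j in range(i, i + n):
                    labels[j] = 1
    return labels

source = "the model improves factuality in abstractive summarization tasks".split()
summary = "improves factuality in summarization".split()
print(phrase_copy_labels(source, summary))  # → [0, 0, 1, 1, 1, 0, 0, 0]
```

Here only the contiguous span "improves factuality in" matches a summary phrase, so just those three tokens receive a copy label; lone word overlaps like "summarization" do not, which is the phrase-level (rather than token-level) distinction.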
Stats
PROM achieves significant improvements on fine-tuning benchmarks.
PROM provides new general baselines on a wide range of summarization datasets.