The paper proposes an intuitive Prompt-tuning method for Clickbait detection via Text Summarization (PCTS). The key highlights are:
To address the large gap between news headlines and their article bodies, the authors introduce a two-stage text summarization model (SummaReranker) to generate high-quality news summaries. Both the headlines and the generated summaries are then used as inputs for the prompt-tuning model.
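The two-stage idea can be sketched as generate-then-rerank: stage one produces several candidate summaries, stage two scores them and keeps the best. The candidate generator and overlap-based scorer below are toy stand-ins for illustration only, not the paper's trained SummaReranker models.

```python
def generate_candidates(article: str, n: int = 3) -> list[str]:
    """Toy stage 1: stand in for a multi-beam decoder by taking the
    first n sentences of the article as candidate summaries."""
    sentences = [s.strip() for s in article.split(".") if s.strip()]
    return sentences[:n]

def rerank_score(candidate: str, article: str) -> float:
    """Toy stage 2: score a candidate by word overlap with the article
    (the paper would use a trained reranking model here)."""
    cand = set(candidate.lower().split())
    art = set(article.lower().split())
    return len(cand & art) / max(len(cand), 1)

def summarize(article: str) -> str:
    """Generate candidates, then return the highest-scoring one."""
    candidates = generate_candidates(article)
    return max(candidates, key=lambda c: rerank_score(c, article))
```

The design point is that reranking decouples fluency (stage one) from faithfulness to the source (stage two), so a weak scorer can still filter out poor candidates.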
Five different strategies are employed to construct an effective verbalizer for the prompt-tuning model, capturing various characteristics of the expanded words. These strategies include Concepts Retrieval, BERT Prediction, FastText Similarity, Frequency-based Selection, and Contextual Information.
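One simple way to combine several word-proposal strategies into a single verbalizer is majority voting: keep only label words that multiple strategies agree on. The sketch below assumes each strategy returns a plain word list; the voting threshold and the strategy outputs are illustrative, not the paper's exact aggregation rule.

```python
from collections import Counter

def build_verbalizer(strategy_outputs: dict[str, list[list[str]]],
                     min_votes: int = 2) -> dict[str, list[str]]:
    """For each label, keep words proposed by at least `min_votes` of
    the strategies (e.g. concept retrieval, BERT prediction, FastText
    similarity, frequency selection, contextual information)."""
    verbalizer = {}
    for label, proposals in strategy_outputs.items():
        # Count each word once per strategy, then apply the vote threshold.
        votes = Counter(w for words in proposals for w in set(words))
        verbalizer[label] = sorted(w for w, c in votes.items() if c >= min_votes)
    return verbalizer
```

For example, if three strategies propose `["shocking", "amazing"]`, `["shocking", "unbelievable"]`, and `["amazing", "shocking"]` for the clickbait label, only `shocking` and `amazing` clear a two-vote threshold.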
The prompt-tuning model transforms the clickbait detection task into a cloze-style objective, where the model predicts the masked label based on the input headlines and summaries.
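The cloze formulation can be sketched as a template with a [MASK] slot plus a verbalizer that maps label words back to class labels: the masked language model's probabilities for each label's words are summed, and the highest-scoring label wins. The template wording, verbalizer entries, and probability dictionary below are illustrative assumptions, not the paper's exact choices.

```python
# Illustrative template and verbalizer (hypothetical wording/words).
TEMPLATE = "Headline: {h} Summary: {s} This news is [MASK]."
VERBALIZER = {"clickbait": ["misleading", "exaggerated"],
              "not-clickbait": ["accurate", "factual"]}

def build_prompt(headline: str, summary: str) -> str:
    """Wrap the headline and generated summary in the cloze template."""
    return TEMPLATE.format(h=headline, s=summary)

def predict_label(mask_probs: dict[str, float]) -> str:
    """mask_probs: the masked LM's word probabilities at the [MASK]
    position. Each label's score is the sum over its verbalizer words."""
    scores = {label: sum(mask_probs.get(w, 0.0) for w in words)
              for label, words in VERBALIZER.items()}
    return max(scores, key=scores.get)
```

In a real pipeline, `mask_probs` would come from a pretrained masked LM (e.g. a fill-mask head) run on the built prompt; only the template and verbalizer parameters are tuned.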
Extensive experiments on well-known clickbait detection datasets demonstrate that the proposed PCTS method achieves state-of-the-art performance, particularly in low-data scenarios.
Ablation studies confirm the importance of text summarization in bridging the gap between headlines and content, as well as the effectiveness of the verbalizer construction strategies in improving clickbait detection.
Key insights distilled from a paper by Haoxiang Den... at arxiv.org, 04-18-2024.
https://arxiv.org/pdf/2404.11206.pdf