The paper introduces a Retrieval-Augmented Generation-based Relation Extraction (RAG4RE) approach to identify the relationship between a pair of entities in a sentence. The proposed RAG4RE approach consists of three modules: Retrieval, Data Augmentation, and Generation.
The Retrieval module sends the user's query (a sentence containing a pair of entities) to the Data Augmentation module, which extends the original query with a semantically similar sentence retrieved from the training dataset. The prompt generator then combines the user's query with the retrieved example sentence to create the final prompt, which is fed into the Generation module.
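A minimal sketch of how such a retrieval-and-augmentation step could look is shown below, assuming sentence embeddings with cosine similarity. The embedding model name, prompt wording, and helper functions are illustrative placeholders, not the authors' implementation.

```python
# Illustrative retrieval + data-augmentation sketch (not the authors' code).
# Assumes sentence embeddings with cosine similarity; the embedding model
# name ("all-MiniLM-L6-v2") and the prompt template are placeholders.
from sentence_transformers import SentenceTransformer, util

embedder = SentenceTransformer("all-MiniLM-L6-v2")

def retrieve_similar(query_sentence: str, train_sentences: list[str]) -> str:
    """Return the training sentence most similar to the query sentence."""
    query_emb = embedder.encode(query_sentence, convert_to_tensor=True)
    corpus_emb = embedder.encode(train_sentences, convert_to_tensor=True)
    best_idx = int(util.cos_sim(query_emb, corpus_emb).argmax())
    return train_sentences[best_idx]

def build_prompt(sentence: str, head: str, tail: str, example: str) -> str:
    """Combine the user's query and the retrieved example into the final prompt."""
    return (
        f"Example sentence: {example}\n"
        f"Sentence: {sentence}\n"
        f"What is the relation between '{head}' and '{tail}'? "
        "Answer with one of the predefined relation types."
    )
```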
The authors evaluate the effectiveness of their RAG4RE approach using well-established Relation Extraction benchmarks, including TACRED, TACREV, Re-TACRED, and SemEval. They integrate various Large Language Models (LLMs), such as Flan T5, Llama2, and Mistral, into their approach.
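As a rough illustration of how the Generation module can be plugged in, the snippet below feeds an augmented prompt to a seq2seq LLM via the Hugging Face pipeline API. The Flan-T5 checkpoint and decoding settings here are assumptions rather than the paper's exact configuration, and any of the listed LLMs could be substituted.

```python
# Generation-module sketch: feed the augmented prompt to an LLM and read off
# the predicted relation label. The checkpoint name and decoding settings are
# assumptions, not necessarily the configuration used in the paper.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-large")

def predict_relation(prompt: str) -> str:
    """Generate the relation label for an augmented prompt."""
    output = generator(prompt, max_new_tokens=16)
    return output[0]["generated_text"].strip()
```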
The results show that the RAG4RE approach outperforms the simple-query baseline (vanilla LLM prompting) in terms of micro F1 score on the TACRED, TACREV, and Re-TACRED datasets. The authors attribute this improvement to the integration of the relevant example sentence, which helps mitigate hallucination issues in the LLMs. However, the RAG4RE approach did not perform as well on the SemEval dataset, because its predefined relation types cannot be extracted directly from the sentence tokens.
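For reference, micro F1 in this setting is commonly computed while ignoring the negative "no_relation" label, as in the standard TACRED scorer. The sketch below follows that convention, though the paper's exact scoring script may differ.

```python
# Hedged sketch of micro F1 for relation extraction, following the common
# TACRED convention of excluding the "no_relation" label from true positives,
# false positives, and false negatives.
def micro_f1(gold: list[str], pred: list[str], negative_label: str = "no_relation") -> float:
    tp = sum(1 for g, p in zip(gold, pred) if p == g and p != negative_label)
    fp = sum(1 for g, p in zip(gold, pred) if p != negative_label and p != g)
    fn = sum(1 for g, p in zip(gold, pred) if g != negative_label and p != g)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```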
The authors also compare their RAG4RE approach with state-of-the-art Relation Extraction methods and demonstrate that it surpasses these methods on the TACRED and TACREV datasets.
Key insights extracted from the paper by Sefika Efeog... on arxiv.org (04-23-2024): https://arxiv.org/pdf/2404.13397.pdf