Retrieval-Enhanced Knowledge Editing for Improving Multi-Hop Question Answering in Language Models
This article proposes Retrieval-Augmented Editing (RAE), a framework designed to handle multi-hop questions in language model editing. RAE first retrieves the edited facts most relevant to a question using a mutual information-based retrieval strategy, then steers the language model toward the edited knowledge through in-context learning over the retrieved facts.
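The retrieve-then-edit pipeline can be sketched as follows. This is an illustrative simplification only: the actual RAE framework scores candidate facts with a mutual information-based objective computed with the language model, whereas this sketch substitutes a crude token-overlap score. The fact store, stopword list, and prompt template below are all hypothetical.

```python
import re

# Hypothetical stopword list used only to make the toy relevance score sensible.
STOPWORDS = {"the", "of", "is", "was", "in", "a", "an", "where", "who", "what"}

def content_tokens(text):
    # Lowercase, strip punctuation, and drop common stopwords.
    return {t for t in re.findall(r"\w+", text.lower()) if t not in STOPWORDS}

def overlap_score(question, fact):
    # Crude stand-in for RAE's mutual-information scoring: the fraction of
    # the fact's content words that also appear in the question.
    fact_tokens = content_tokens(fact)
    return len(content_tokens(question) & fact_tokens) / max(len(fact_tokens), 1)

def retrieve_facts(question, edited_facts, k=2):
    # Step 1: rank the edited facts by relevance and keep the top k.
    return sorted(edited_facts, key=lambda f: overlap_score(question, f),
                  reverse=True)[:k]

def build_edit_prompt(question, facts):
    # Step 2: in-context editing — prepend the retrieved edited facts so the
    # frozen language model conditions on the new knowledge when answering.
    context = "\n".join(f"Fact: {f}" for f in facts)
    return f"{context}\nQuestion: {question}\nAnswer:"

edited_facts = [
    "The CEO of Apple is Tim Cook.",
    "Tim Cook was born in Mobile, Alabama.",
    "The capital of France is Paris.",
]
question = "Where was the CEO of Apple born?"
print(build_edit_prompt(question, retrieve_facts(question, edited_facts)))
```

Note that answering this multi-hop question requires chaining two retrieved facts (CEO of Apple → Tim Cook → birthplace), which is exactly the setting where single-fact editing methods tend to fail.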