The paper proposes a two-stage data augmentation framework called CORE (COmplete and REduce) for the link prediction task.
The Complete stage addresses the incompleteness of the observed graph by incorporating highly probable missing edges, yielding a more comprehensive graph. The Reduce stage, the core of the proposed method, operates on the augmented graph produced by the Complete stage: it shrinks the edge set while preserving the edges critical to the link prediction task, thereby mitigating misleading information that is either inherent in the original graph or introduced during the Complete stage.
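The two stages can be pictured with a small, self-contained sketch. The common-neighbor heuristics, thresholds, and function names below are illustrative assumptions, not the paper's learned components, which are trained end to end under the objective described next.

```python
# Minimal sketch of the Complete-and-Reduce idea (illustrative only).
import itertools
import networkx as nx

def complete(graph: nx.Graph, add_threshold: int = 2) -> nx.Graph:
    """Complete stage: add highly probable missing edges.

    Edge likelihood is approximated here by a common-neighbor count,
    purely as a stand-in for a learned edge predictor."""
    augmented = graph.copy()
    for u, v in itertools.combinations(graph.nodes, 2):
        if not graph.has_edge(u, v):
            common = len(list(nx.common_neighbors(graph, u, v)))
            if common >= add_threshold:
                augmented.add_edge(u, v)
    return augmented

def reduce_edges(graph: nx.Graph, keep_threshold: int = 1) -> nx.Graph:
    """Reduce stage: keep only edges deemed critical for prediction.

    Edge importance is again proxied by common-neighbor support; the
    actual method learns which edges to retain."""
    reduced = graph.copy()
    for u, v in list(graph.edges):
        support = len(list(nx.common_neighbors(graph, u, v)))
        if support < keep_threshold:
            reduced.remove_edge(u, v)
    return reduced

if __name__ == "__main__":
    g = nx.karate_club_graph()
    augmented = complete(g)
    compact = reduce_edges(augmented)
    print(len(g.edges), len(augmented.edges), len(compact.edges))
```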
CORE adheres to the Information Bottleneck (IB) principle, which constrains the flow of information from the input to the output so that the learned representation is maximally compressed yet still predictive of the task at hand. This allows CORE to learn compact, predictive augmentations for link prediction models, improving their robustness and performance.
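For reference, one common formulation of the IB objective is shown below, where Z denotes the compressed (augmented) graph representation, G the input graph, Y the link labels, and β a trade-off coefficient; the exact instantiation used by CORE may differ in detail.

```latex
% Generic Information Bottleneck objective:
% keep Z predictive of Y while compressing away information about G.
\min_{p(Z \mid G)} \; \beta \, I(G; Z) \;-\; I(Z; Y)
```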
The authors also recognize that predicting different links may require distinct augmentations. To address this, they recast link prediction as subgraph link prediction, in which each target link is predicted from its own local subgraph, so augmentations can be applied to different links independently without conflicts between their preferred augmentations.
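A minimal sketch of the subgraph view follows: each candidate link is paired with its own h-hop enclosing subgraph, so an augmentation chosen for one link cannot affect the subgraph of another. The hop count and helper names are assumptions for illustration.

```python
# Recasting link prediction as subgraph link prediction (illustrative only).
import networkx as nx

def enclosing_subgraph(graph: nx.Graph, u, v, hops: int = 2) -> nx.Graph:
    """Extract the h-hop enclosing subgraph around the candidate link (u, v)."""
    nodes = set()
    for root in (u, v):
        lengths = nx.single_source_shortest_path_length(graph, root, cutoff=hops)
        nodes.update(lengths)
    return graph.subgraph(nodes).copy()

if __name__ == "__main__":
    g = nx.karate_club_graph()
    # Each candidate link is augmented on its own copy of the local subgraph,
    # independently of every other link's subgraph.
    for u, v in [(0, 33), (5, 16)]:
        sub = enclosing_subgraph(g, u, v)
        print((u, v), sub.number_of_nodes(), sub.number_of_edges())
```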
Extensive experiments on multiple benchmark datasets demonstrate the applicability and superiority of CORE over state-of-the-art methods, showcasing its potential as a leading approach for robust link prediction in graph representation learning.
Key insights distilled from the source content by Kaiwen Dong, ... on arxiv.org, 04-18-2024: https://arxiv.org/pdf/2404.11032.pdf