The paper addresses two key problems in graph neural networks (GNNs): over-squashing and over-smoothing. Over-squashing arises when information from distant nodes fails to propagate effectively because of topological bottlenecks in the graph, while over-smoothing occurs when node features converge to a non-informative limit after repeated rounds of aggregation.
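As a rough illustration of the over-smoothing mechanism (not code from the paper), the sketch below applies repeated neighbourhood averaging to random features on a small toy graph; the spread of the features across nodes collapses as more aggregation rounds are stacked. The graph and feature dimensions are arbitrary choices.

```python
import numpy as np

# Adjacency matrix of a small connected graph (4 nodes in a path).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                      # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # inverse degree matrix
P = D_inv @ A_hat                          # row-normalised propagation matrix

X = np.random.rand(4, 3)                   # random node features
for _ in range(50):                        # 50 rounds of pure aggregation
    X = P @ X

# Feature rows are now almost identical: the per-feature spread across nodes
# has collapsed, which is the signature of over-smoothing.
print("max spread per feature:", X.max(axis=0) - X.min(axis=0))
```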
The authors propose a novel approach to address these issues by drawing on an analogue of the Braess paradox: deleting edges can sometimes increase the spectral gap of a graph. They develop two algorithms, PROXYDELETE and PROXYADD, that efficiently estimate the change in the spectral gap when deleting or adding edges, respectively. These proxies can be used to rewire the graph in a way that simultaneously mitigates over-squashing and over-smoothing.
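A minimal sketch of how such a proxy can work, assuming a first-order eigenvalue-perturbation estimate on the normalized Laplacian that reuses the current Fiedler vector and ignores the degree change caused by the deletion. The function name and the exact formula are illustrative assumptions, not the paper's PROXYDELETE.

```python
import numpy as np
import networkx as nx

def spectral_gap_proxy_delete(G):
    """Score every edge by an estimated change in the spectral gap if it were deleted."""
    nodes = list(G.nodes())
    index = {n: i for i, n in enumerate(nodes)}
    L = nx.normalized_laplacian_matrix(G, nodelist=nodes).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)
    fiedler = eigvecs[:, 1]                      # eigenvector of the second-smallest eigenvalue
    deg = dict(G.degree())

    scores = {}
    for u, v in G.edges():
        fu, fv = fiedler[index[u]], fiedler[index[v]]
        # Removing the edge adds +1/sqrt(du*dv) to the off-diagonal Laplacian
        # entries, so a first-order perturbation estimate of the gap change is
        # 2 * fu * fv / sqrt(du * dv) (degree updates are ignored here).
        scores[(u, v)] = 2.0 * fu * fv / np.sqrt(deg[u] * deg[v])
    return scores

# Usage: greedily pick the edge whose deletion is estimated to increase the gap most.
G = nx.karate_club_graph()
scores = spectral_gap_proxy_delete(G)
best_edge = max(scores, key=scores.get)
print("candidate deletion:", best_edge, "estimated gap change:", scores[best_edge])
```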
The authors provide theoretical insights into why deleting edges can be more effective than adding them for addressing both problems, especially in heterophilic learning tasks, where edges tend to connect nodes of different classes. They demonstrate the effectiveness of their proposed methods on various node classification and graph classification tasks, showing consistent improvements over existing baselines.
Additionally, the authors connect their insights on spectral graph rewiring to the problem of finding graph lottery tickets, which are sparse subnetworks that can match the performance of dense GNN models. They show that their spectral gap-based pruning approach can be used to find effective lottery tickets at initialization, without the need for computationally expensive prune-train-rewind cycles.
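A hedged sketch of what pruning at initialization could look like with such scores, reusing the illustrative spectral_gap_proxy_delete function from the sketch above. The one-shot ranking and the sparsity level are assumptions for illustration, not the paper's exact procedure.

```python
import networkx as nx

def prune_graph_at_init(G, sparsity=0.2):
    """Delete the top `sparsity` fraction of edges ranked by the deletion proxy."""
    scores = spectral_gap_proxy_delete(G)             # defined in the sketch above
    ranked = sorted(scores, key=scores.get, reverse=True)
    n_drop = int(sparsity * G.number_of_edges())
    H = G.copy()
    H.remove_edges_from(ranked[:n_drop])
    return H                                          # train the GNN on the pruned graph directly

H = prune_graph_at_init(nx.karate_club_graph(), sparsity=0.2)
print(H.number_of_edges(), "edges remain out of",
      nx.karate_club_graph().number_of_edges())
```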
Key insights extracted from arxiv.org (04-09-2024), by Adarsh Jamad...
Source: https://arxiv.org/pdf/2404.04612.pdf