Key Concepts
Non-backtracking graph neural networks (NBA-GNNs) address the message redundancy of conventional message-passing graph neural networks by preventing a message from immediately flowing back to the node it just came from.
Summary
The paper proposes a non-backtracking graph neural network (NBA-GNN) to address the redundancy issue in conventional message-passing graph neural networks (GNNs).
Key insights:
- Conventional GNNs suffer from backtracking, where a message flows back along the edge it just traversed and revisits the node it came from. This inflates the number of message flows exponentially with depth and makes the GNN insensitive to the information carried by individual walks.
- NBA-GNN associates a hidden feature with each transition between a pair of adjacent vertices (i.e., each directed edge) and updates it using only non-backtracking transitions, so a message never flows straight back to the node it just came from (see the sketch after this list).
- The authors provide a sensitivity analysis showing that NBA-GNN alleviates the over-squashing issue in GNNs by improving the Jacobian-based upper bound on how sensitive a node's representation is to distant inputs (see the bound after this list).
- NBA-GNN is shown to be more expressive than conventional GNNs, with the ability to recover communities in sparse stochastic block models with average degrees as low as ω(1) and n^{o(1)}.
- Empirical evaluations demonstrate that NBA-GNN achieves state-of-the-art performance on the Long-Range Graph Benchmark and consistently improves over conventional GNNs on transductive node classification tasks.
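For concreteness, here is a minimal sketch of one non-backtracking layer, assuming sum aggregation, a shared linear map per layer, and a ReLU nonlinearity. The names (`nba_layer`, `W_self`, `W_msg`) are illustrative and not from the paper, which uses general learnable update and aggregation functions.

```python
import numpy as np

def nba_layer(edges, h, W_self, W_msg):
    """One non-backtracking update over directed-edge hidden states.

    edges : list of directed edges (u, v); both orientations of every
            undirected edge must appear.
    h     : dict (u, v) -> np.ndarray, the state of the message
            currently in transit from u to v.
    """
    # Index incoming directed edges by their head vertex.
    incoming = {}
    for (u, v) in edges:
        incoming.setdefault(v, []).append(u)

    h_new = {}
    for (u, v) in edges:
        # Aggregate the states of messages (w -> u), EXCLUDING w == v:
        # this is the non-backtracking constraint, so no message flows
        # straight back along the edge it arrived on.
        agg = np.zeros_like(h[(u, v)])
        for w in incoming.get(u, []):
            if w != v:
                agg = agg + h[(w, u)]
        h_new[(u, v)] = np.maximum(0.0, W_self @ h[(u, v)] + W_msg @ agg)
    return h_new

# Tiny usage example on the path graph 0 - 1 - 2.
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
dim = 4
h = {e: rng.standard_normal(dim) for e in edges}
W_self = rng.standard_normal((dim, dim))
W_msg = rng.standard_normal((dim, dim))
h = nba_layer(edges, h, W_self, W_msg)
# A node readout (not shown) would pool the edge states incident to a vertex.
```

On the edge (1, 2) above, the update aggregates the state of (0, 1) but skips (2, 1), which is exactly the backtracking transition a conventional GNN would include.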
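The Jacobian-based measure referenced above is commonly stated as follows; this is a hedged paraphrase of the standard sensitivity bound with constants abstracted, not the paper's exact statement:

```latex
% Sensitivity of node v's depth-L feature to node u's input, bounded
% via powers of the (normalized) adjacency matrix \hat{A}:
\[
\left\lVert \frac{\partial h_v^{(L)}}{\partial x_u} \right\rVert
\;\le\; (c_\sigma c_w)^{L} \, \bigl(\hat{A}^{L}\bigr)_{vu},
\]
% where c_\sigma and c_w bound the derivatives of the activation and
% the weight matrices. NBA-GNN's analysis replaces powers of \hat{A}
% with powers of the non-backtracking operator B on directed edges,
%   B_{(u -> v), (v -> w)} = 1  iff  w != u  (and 0 otherwise),
% whose entries decay more slowly on sparse graphs, yielding a larger
% (less pessimistic) upper bound on long-range sensitivity.
```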
Statistics
The number of message flows in conventional GNNs grows exponentially with the number of message-passing updates, at a rate proportional to the vertex degrees (see the count below).
NBA-GNN reduces this redundancy by forbidding a message from immediately returning to the node it arrived from.
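As a concrete count (our illustration, not a statistic from the paper): from a vertex in a d-regular graph there are d^L walks of length L, but only d(d-1)^(L-1) non-backtracking ones. For example, with d = 4 and L = 6:

```latex
\[
d^{L} = 4^{6} = 4096
\qquad \text{vs.} \qquad
d\,(d-1)^{L-1} = 4 \cdot 3^{5} = 972,
\]
% and the ratio between the two, (d/(d-1))^{L-1}, itself grows
% exponentially in L, so the share of redundant backtracking walks
% increases with depth.
```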
Quotes
"Since the message-passing iteratively aggregates the information, the GNN inevitably encounters an exponential surge in the number of message flows, proportionate to the vertex degrees."
"Reducing the redundancy by simply considering non-backtracking walks would benefit the message-passing updates to recognize each walk's information better."