Hector is a novel two-level intermediate representation and code generation framework that systematically addresses the performance and programming challenges of implementing relational graph neural networks (RGNNs) on GPU architectures.
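For context, the defining feature of RGNN computation is relation-typed message passing: each edge type applies its own weight matrix, and this irregularity is what makes efficient GPU code generation difficult. The NumPy sketch below shows one RGCN-style layer purely as an illustration of that workload; it is not Hector's API, and all names are illustrative.

```python
# Minimal NumPy sketch of one relational-GNN (RGCN-style) layer: each relation
# type r has its own weight matrix W_rel[r].  Illustrative only, not Hector's API.
import numpy as np

def rgcn_layer(h, edges, W_rel, W_self):
    """h: [N, d] node features; edges: iterable of (src, dst, rel) triples;
    W_rel: [R, d, d_out] per-relation weights; W_self: [d, d_out] self-loop weights."""
    out = h @ W_self                                   # self-loop term
    deg = np.full((h.shape[0], W_rel.shape[0]), 1e-9)  # per-relation in-degree
    for _, dst, r in edges:
        deg[dst, r] += 1.0
    for src, dst, r in edges:
        out[dst] += (h[src] @ W_rel[r]) / deg[dst, r]  # relation-specific, degree-normalized message
    return np.maximum(out, 0.0)                        # ReLU nonlinearity
```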
This paper presents Graph Neural Network models that incorporate two first-order partial differential equations (PDEs): the advection equation and the Burgers equation. These models effectively mitigate the over-smoothing problem in GNNs while achieving performance comparable to models based on higher-order PDEs.
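For reference, the continuous forms of the two equations are

\[
\partial_t u + \mathbf{v}\cdot\nabla u = 0 \quad\text{(advection)},
\qquad
\partial_t u + u\,\partial_x u = 0 \quad\text{(inviscid Burgers)}.
\]

A graph analogue, sketched here under the assumption of learned nonnegative edge velocities \(v_{ij}\) (the paper's exact discretization is not reproduced), replaces spatial derivatives with edge differences:

\[
\frac{d x_i}{d t} \;=\; \sum_{j \in \mathcal{N}(i)} \bigl( v_{ji}\, x_j - v_{ij}\, x_i \bigr),
\]

with a Burgers-type variant obtained by letting the velocities depend on the node features themselves.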
Aggregation-diffusion equations on graphs can exhibit metastable behavior, leading to clustered node representations that mitigate over-smoothing in graph neural networks.
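For reference, a standard continuum form of an aggregation-diffusion equation is

\[
\partial_t \rho \;=\; \nu\,\Delta \rho^{m} \;+\; \nabla\!\cdot\!\bigl(\rho\,\nabla (W \ast \rho)\bigr),
\]

where the first term is (possibly nonlinear, \(m \ge 1\)) diffusion and the second is nonlocal attraction through an interaction kernel \(W\); metastable clustered states arise from the competition between the two. The graph discretization used in the cited work is not reproduced here.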
DepWiGNN introduces a Depth-Wise Graph Neural Network for multi-hop spatial reasoning in text: it propagates information over the depth dimension of the graph to capture long dependencies effectively.
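To illustrate the depth-wise idea, here is a minimal sketch of aggregating features along one multi-hop path in a single pass, rather than stacking k message-passing layers for a k-hop dependency. The function, its arguments, and the pooling choice are illustrative assumptions, not the DepWiGNN implementation.

```python
# Minimal sketch of depth-wise (path-wise) aggregation.  Names and the pooling
# operator are illustrative; this is not the DepWiGNN implementation.
import networkx as nx
import numpy as np

def depthwise_aggregate(graph: nx.Graph, feats: dict, src, dst):
    """Collect node features along one multi-hop path from src to dst (the
    "depth" dimension), so a k-hop dependency is captured in a single pass
    rather than through k stacked message-passing layers."""
    path = nx.shortest_path(graph, src, dst)          # one chain of hops
    path_feats = np.stack([feats[n] for n in path])   # [path_len, d]
    return path_feats.sum(axis=0)                     # simple order-invariant pooling
```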