Core Concepts
Polynormer is a polynomial-expressive graph transformer with linear complexity that outperforms state-of-the-art models on a wide range of homophilic, heterophilic, and large-scale graph datasets.
Abstract
Introduction to Graph Transformers: Discusses the limitations of traditional GNNs and the emergence of graph transformers.
Polynomial-Expressive Model: Introduces Polynormer, a novel graph transformer with high polynomial expressivity.
Base Model and Attention Mechanisms: Details the base model and the integration of local and global attention mechanisms.
Experimental Results: Highlights the superior performance of Polynormer on homophilic and heterophilic graphs, as well as large-scale datasets.
Ablation Analysis: Compares different attention schemes in Polynormer and their impact on performance.
Visualization: Illustrates which nodes Polynormer considers important through heatmaps.
Conclusions and Acknowledgments: Summarizes the key findings and acknowledges the support received.
Stats
Typical graph transformer (GT) models have at least quadratic complexity in the number of nodes, so they struggle to scale to large graphs.
Polynormer outperforms state-of-the-art GNN and GT baselines on most datasets.
Polynormer achieves an accuracy improvement of up to 4.06% on large graphs.
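The complexity gap above can be made concrete. Standard softmax attention materializes an n×n score matrix, which is what makes typical GTs quadratic in the node count; linear-attention variants instead apply a positive feature map φ to queries and keys and reassociate the matrix product so the n×n matrix is never formed. The sketch below uses an elu+1 feature map, a common choice in linear-transformer work; it is an illustration of the linear-complexity idea, not Polynormer's actual attention, which additionally combines local and global attention.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: forms an (n, n) score matrix -> O(n^2) time and memory.
    scores = Q @ K.T                                   # (n, n)
    scores = np.exp(scores - scores.max(axis=1, keepdims=True))
    return (scores / scores.sum(axis=1, keepdims=True)) @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Kernelized attention: compute phi(Q) @ (phi(K).T @ V) instead of
    # (phi(Q) @ phi(K).T) @ V. The (d, d) intermediate replaces the (n, n)
    # score matrix, giving O(n * d^2) cost -- linear in the number of nodes n.
    phi = lambda X: np.where(X > 0, X + 1.0, np.exp(X))  # elu(x) + 1, always positive
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                                      # (d, d)
    Z = Qf @ Kf.sum(axis=0)                            # (n,) per-row normalizer
    return (Qf @ KV) / (Z[:, None] + eps)
```

Reassociating the product changes only the order of operations, so the result matches the explicit kernelized form exactly; the savings come purely from never instantiating the n×n matrix.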
Quotes
"Polynormer is built upon a novel base model that learns a high-degree polynomial on input features."
"Our extensive experiment results show that Polynormer outperforms state-of-the-art GNN and GT baselines on most datasets."