Efficient and Scalable Graph Transformers with Anchor-based Attention Architecture
The proposed AnchorGT architecture improves the scalability of graph Transformers through a novel anchor-based attention mechanism, while preserving a global receptive field and the model's expressive power.
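To make the idea concrete, below is a minimal sketch of how anchor-based attention can reduce cost: instead of every node attending to all n nodes (O(n^2)), each node attends only to a small set of k anchor nodes that mediate global information (O(nk)). This is an illustrative assumption, not the paper's implementation; the function name `anchor_attention`, the random anchor selection, and the projection matrices are all hypothetical placeholders (the actual method may select anchors in a structured, graph-aware way).

```python
# Minimal sketch of anchor-based attention (illustrative, not the paper's code).
# Assumption: anchors are a pre-selected subset of node indices, and every
# node attends to the anchor set only, giving O(n * k) cost instead of O(n^2).
import torch
import torch.nn.functional as F


def anchor_attention(x, anchor_idx, w_q, w_k, w_v):
    """Each of the n nodes attends to the k anchor nodes.

    x:           (n, d) node features
    anchor_idx:  (k,)   indices of anchor nodes (hypothetical selection step)
    w_q/w_k/w_v: (d, d) query/key/value projection matrices
    """
    q = x @ w_q                                # (n, d) queries for all nodes
    k = x[anchor_idx] @ w_k                    # (k, d) keys only for anchors
    v = x[anchor_idx] @ w_v                    # (k, d) values only for anchors
    scores = q @ k.T / k.shape[-1] ** 0.5      # (n, k) scaled dot-product logits
    attn = F.softmax(scores, dim=-1)           # normalize over the anchor set
    return attn @ v                            # (n, d) anchor-mediated global context


# Tiny usage example with random data.
n, d, num_anchors = 1000, 64, 32
x = torch.randn(n, d)
anchor_idx = torch.randperm(n)[:num_anchors]   # random anchors (assumption)
w_q, w_k, w_v = (torch.randn(d, d) * d ** -0.5 for _ in range(3))
out = anchor_attention(x, anchor_idx, w_q, w_k, w_v)
print(out.shape)  # torch.Size([1000, 64])
```

Because every node still reaches every other node through the shared anchors, the sketch retains a global receptive field in the sense described above, while attention is computed over only k anchor columns per node.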