Chebyshev Approximation in Graph Neural Networks Revisited
Core Concepts
ChebNetII improves spectral graph convolutions by addressing over-fitting and the Runge phenomenon, outperforming existing methods.
Abstract
Designing spectral convolutional networks on graphs is challenging. ChebNet, one of the earliest attempts, approximates spectral graph convolutions with truncated Chebyshev polynomial expansions. The paper revisits ChebNet and traces its performance issues to illegal coefficients that lead to over-fitting. It then proposes ChebNetII, a new GNN model based on Chebyshev interpolation, which improves approximation quality, mitigates the Runge phenomenon, and enhances scalability. Experimental studies demonstrate ChebNetII's superior performance in both semi-supervised and fully supervised node classification tasks.
Abstract:
Designing spectral convolutional networks is challenging.
ChebNet uses Chebyshev polynomials for graph convolutions.
ChebNetII addresses over-fitting and the Runge phenomenon.
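ChebNet's spectral filter is a truncated Chebyshev expansion h(L̂)x = Σₖ wₖ Tₖ(L̂)x, evaluated with the three-term recurrence Tₖ(z) = 2z·Tₖ₋₁(z) − Tₖ₋₂(z). A minimal numpy sketch (the function name `chebyshev_filter` is illustrative, not from the paper), assuming the Laplacian has already been rescaled to L̂ = 2L/λ_max − I so its eigenvalues lie in [−1, 1]:

```python
import numpy as np

def chebyshev_filter(L_hat, x, coeffs):
    """Apply h(L_hat) x = sum_k w_k T_k(L_hat) x via the Chebyshev
    recurrence T_k(z) = 2 z T_{k-1}(z) - T_{k-2}(z).
    L_hat: Laplacian rescaled so its spectrum lies in [-1, 1]."""
    t_prev = x                      # T_0(L_hat) x = x
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        t_curr = L_hat @ x          # T_1(L_hat) x = L_hat x
        out = out + coeffs[1] * t_curr
        for w in coeffs[2:]:
            t_next = 2 * (L_hat @ t_curr) - t_prev
            out = out + w * t_next
            t_prev, t_curr = t_curr, t_next
    return out
```

Because each step only needs a sparse matrix-vector product, the filter is applied in O(K·|E|) time without ever eigendecomposing the Laplacian, which is the original motivation for ChebNet.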
Introduction:
Graph neural networks have gained attention for various tasks.
Spatial-based vs. spectral-based GNNs are discussed.
Predetermined vs. learnable graph convolutions are explained.
Revisiting ChebNet:
The issue of approximating spectral graph convolutions with Chebyshev polynomials is revisited.
ChebNet's inferior performance is traced to learned Chebyshev coefficients that are "illegal", i.e. violate the convergence conditions of Chebyshev expansions, which causes over-fitting.
ChebNetII is introduced as a new GNN model based on Chebyshev interpolation.
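The key idea of Chebyshev interpolation is to learn the filter values γⱼ at the Chebyshev nodes xⱼ = cos((j + ½)π/(K + 1)) and recover the expansion coefficients from them, rather than learning the coefficients directly. A sketch of this standard reparameterization (the helper name `cheb_interp_coeffs` is my own; this is the textbook interpolation formula, not code from the paper):

```python
import numpy as np

def cheb_interp_coeffs(gammas):
    """Map filter values gamma_j at the Chebyshev nodes
    x_j = cos((j + 1/2) * pi / (K + 1)) to the coefficients w_k
    of a degree-K Chebyshev expansion."""
    K = len(gammas) - 1
    j = np.arange(K + 1)
    nodes = np.cos((j + 0.5) * np.pi / (K + 1))
    w = np.empty(K + 1)
    for k in range(K + 1):
        T_k = np.cos(k * np.arccos(nodes))     # T_k(x_j)
        w[k] = (2.0 / (K + 1)) * np.sum(gammas * T_k)
    w[0] /= 2.0                                # the k = 0 term is halved
    return w
```

Interpolating at Chebyshev nodes (rather than equispaced points) is what suppresses the Runge phenomenon, and constraining the coefficients to this form rules out the illegal coefficients that make plain ChebNet over-fit.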
Data Extraction:
GCN simplifies ChebNet by keeping only the first two Chebyshev polynomials.
GPR-GNN and BernNet outperform ChebNet using the Monomial and Bernstein bases, respectively.
Quotes from "Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited":
"ChebyBase has the worst performance despite theoretical approximation ability."
"Why is ChebNet’s filter inferior to that of GPR-GNN and BernNet?"
"ChebyBase/k outperforms other methods with simple penalty constraints."
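As noted under Data Extraction, GCN truncates the Chebyshev expansion at K = 1 and applies the renormalization trick, collapsing the filter into a single propagation step Â = D̃^{-1/2}(A + I)D̃^{-1/2}. A minimal dense sketch (the name `gcn_propagation` is illustrative; real implementations use sparse matrices):

```python
import numpy as np

def gcn_propagation(A, X):
    """One GCN propagation step obtained from the K = 1 Chebyshev
    truncation plus the renormalization trick:
    A_hat = D_tilde^{-1/2} (A + I) D_tilde^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])          # add self-loops
    d = A_tilde.sum(axis=1)                   # degrees of A + I
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt
    return A_hat @ X
```

This fixed, predetermined filter is what the paper contrasts with learnable polynomial filters such as ChebNet, GPR-GNN, BernNet, and ChebNetII.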