Core Concept
As graph size grows, graph neural network outputs on random graphs converge to constant functions, which places an upper bound on their expressive power.
Key Claims
Graph neural network outputs converge to constant functions on random graphs as the number of nodes grows.
Probabilistic classifiers converge to constant outputs as graph size increases.
Convergence is validated empirically across different model initializations.
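A minimal numerical sketch of this concentration effect, not the original experiment: the model (one mean-aggregation layer with fixed random weights, mean-pooled sigmoid readout), the G(n, p) graph parameters, and all function names here are illustrative assumptions. The spread of the output over independent random graphs shrinks as n grows, consistent with convergence to a constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative one-layer GNN with fixed random weights, shared across all
# graphs so that only the input graph varies between trials.
d = 8
W = rng.normal(size=(d, d))
w_out = rng.normal(size=d)

def gnn_output(n, p=0.5):
    """Mean-aggregation layer on a G(n, p) random graph, mean-pooled sigmoid readout."""
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                # symmetric, no self-loops
    X = rng.normal(size=(n, d))                # i.i.d. node features
    deg = A.sum(axis=1, keepdims=True) + 1e-9  # guard against isolated nodes
    H = np.tanh((A @ X / deg) @ W)             # mean-aggregated messages
    s = H.mean(axis=0) @ w_out                 # graph-level logit
    return 1.0 / (1.0 + np.exp(-s))            # probabilistic output in (0, 1)

def spread(n, trials=100):
    """Standard deviation of the output over independent random graphs of size n."""
    return float(np.std([gnn_output(n) for _ in range(trials)]))

# Outputs over large random graphs cluster around a single constant value,
# so the spread at n = 500 is far smaller than at n = 10.
print(spread(10), spread(500))
```

The key mechanism is that mean aggregation over a G(n, p) graph averages roughly np i.i.d. neighbor features, so each node's aggregated message concentrates around the population mean as n grows, and the graph-level readout inherits that concentration.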
Quotes
"Graph neural network outputs converge to constant functions on random graphs."
"Probabilistic classifiers converge to constant outputs as graph size increases."