The author establishes a correspondence between graph neural networks (GNNs) and arithmetic circuits, characterizing the computational power of GNNs. By studying GNNs as devices that compute over the real numbers, they clarify the underlying computational model.
The author demonstrates that the outputs of graph neural networks converge to constant functions on random graphs, limiting their expressive power.
The Weisfeiler-Leman (WL) test is commonly used to measure the expressive power of graph neural networks, but this approach has significant limitations that are often overlooked.
Graph Neural Networks (GNNs) have the same expressive power as the Weisfeiler-Leman (WL) test, and they are universal approximators modulo the constraints enforced by the WL/unfolding equivalence, for both attributed static graphs and dynamic graphs.
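To make the WL equivalence above concrete, here is a minimal sketch of 1-dimensional WL colour refinement (the variant message-passing GNNs are bounded by). The function names and the choice of test graphs are illustrative, not taken from any of the cited works; the example uses the classic pair of 2-regular graphs — a 6-cycle versus two disjoint triangles — which 1-WL, and hence any standard message-passing GNN, cannot distinguish.

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman colour refinement (sketch).
    adj: dict mapping each node to a list of its neighbours."""
    colors = {v: 0 for v in adj}  # start with a uniform colouring
    for _ in range(rounds):
        # New colour = (own colour, multiset of neighbour colours)
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integer colours
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    # The colour histogram is the graph invariant 1-WL computes
    return Counter(colors.values())

# Two non-isomorphic 2-regular graphs on 6 nodes:
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}  # one 6-cycle
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}       # two 3-cycles

# 1-WL assigns both graphs identical colour histograms, so it cannot
# tell them apart -- and neither can a message-passing GNN.
print(wl_refine(c6) == wl_refine(two_triangles))  # True
```

Every node in both graphs has degree 2, so refinement never splits the initial colour class; this is exactly the kind of constraint the "universal approximation modulo WL equivalence" results are stated up to.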