Core Concepts
Graph Neural Networks and arithmetic circuits have an exact correspondence in computational power.
Abstract
This article explores the computational power of Graph Neural Networks (GNNs) in comparison with arithmetic circuits over the real numbers, and relates GNN expressivity to Boolean threshold circuits. The focus is not on the complexity of training but on what a trained network can express. The paper introduces Circuit Graph Neural Networks (C-GNNs) as a model for computing functions over real numbers or over labeled graphs, and establishes a close correspondence between GNNs and arithmetic circuits, shedding light on the computational abilities of GNNs. The results highlight the role of scaling and complexity in neural networks, and the modular approach yields insights that can guide the practical design of network architectures.
Introduction
Comparison of the computational power of Graph Neural Networks and arithmetic circuits over the real numbers.
Study of the expressiveness and computational complexity of GNNs.
Background and Related Work
Theoretical attention to neural networks and their computational properties.
Complexity of training neural networks and its connection to computation over the real numbers.
Expressive power of feed-forward neural networks and Boolean threshold circuits.
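The comparison with Boolean threshold circuits can be made concrete with a small sketch. The function below is a hypothetical illustration, assuming the standard definition: a threshold gate fires exactly when the weighted sum of its Boolean inputs reaches a threshold, which is the discrete analogue of a single neuron.

```python
def threshold_gate(inputs, weights, theta):
    """Boolean threshold gate: fires iff the weighted input sum meets theta."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)

# Majority of three bits, realized by a single threshold gate:
print(threshold_gate([1, 0, 1], weights=[1, 1, 1], theta=2))  # → 1
print(threshold_gate([1, 0, 0], weights=[1, 1, 1], theta=2))  # → 0
```

Circuits built from such gates are a classical yardstick for the expressive power of feed-forward networks.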
Graph Neural Networks using Circuits
Introduction of Circuit Graph Neural Networks (C-GNNs).
Model of computation and simulation of C-GNNs by arithmetic circuits.
Simulation of arithmetic circuits by C-GNNs, and the structural restrictions this imposes.
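To see why such a correspondence is plausible, note that a message-passing layer with sum aggregation and a linear update uses only addition and multiplication, i.e., the gate types of an arithmetic circuit over the reals. The sketch below is an illustrative simplification, not the paper's C-GNN definition; `w_self` and `w_neigh` are hypothetical scalar weights.

```python
def gnn_layer(features, adj, w_self, w_neigh):
    """One sum-aggregation GNN layer built only from + and * gates."""
    n = len(features)
    out = []
    for v in range(n):
        # sum-gate over the messages of v's neighbors
        agg = sum(features[u] for u in range(n) if adj[v][u])
        # product-gates for the linear combination; with no nonlinearity,
        # the whole layer is an arithmetic circuit over the reals
        out.append(w_self * features[v] + w_neigh * agg)
    return out

# Triangle graph with unit features: every node aggregates two neighbors.
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(gnn_layer([1.0, 1.0, 1.0], adj, w_self=0.5, w_neigh=1.0))  # → [2.5, 2.5, 2.5]
```

Nonlinear activations fall outside pure arithmetic circuits, which is one reason simulations in the other direction come with structural restrictions.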
Conclusion
Correspondence between GNNs and arithmetic circuits.
Further research directions on network expressiveness and training complexity.
Stats
Training fully connected neural networks is ∃R-complete.
The expressive power of GNNs is related to Boolean threshold circuits.
GNNs are closely related to the Weisfeiler-Leman algorithm.
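The Weisfeiler-Leman connection refers to the fact that standard message-passing GNNs distinguish at most the graphs that 1-dimensional WL color refinement distinguishes. A minimal sketch of that refinement, using integer relabeling in place of a hash function:

```python
def wl_refine(adj, rounds=3):
    """1-WL: iteratively refine node colors by their neighbors' color multisets."""
    n = len(adj)
    colors = [0] * n  # uniform initial coloring
    for _ in range(rounds):
        # a node's signature: its own color plus the multiset of neighbor colors
        signatures = [
            (colors[v], tuple(sorted(colors[u] for u in range(n) if adj[v][u])))
            for v in range(n)
        ]
        # compress signatures back into small integer colors
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures)))}
        colors = [palette[sig] for sig in signatures]
    return colors

# Path graph 0-1-2: the two endpoints get one color, the middle node another.
path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
print(wl_refine(path))  # → [0, 1, 0]
```

Graphs that 1-WL cannot tell apart receive identical colorings, and a message-passing GNN computes identical embeddings on them.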
Quotes
"Training fully connected neural networks is ∃R-complete." - Bertschinger et al., 2022
"The logical expressiveness of Graph Neural Networks." - Barceló et al., 2020