Generalization Bounds for Message Passing Neural Networks on Sparse Noisy Graphs Sampled from a Mixture of Graphons
Message passing neural networks (MPNNs) can generalize effectively to unseen sparse, noisy graphs sampled from a mixture of graphons, as long as the graphs are sufficiently large.
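To make the setting concrete, below is a minimal sketch of the data-generating process the thesis refers to: a graph is sampled from a mixture of graphons with a sparsity factor and independent edge noise, and a generic mean-aggregation message-passing layer is applied to it. The specific graphons, the sparsity factor rho_n, the noise model, and the layer form are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's exact construction):
# sample a sparse, noisy graph from a two-component graphon mixture, then
# apply one generic mean-aggregation message-passing step.

rng = np.random.default_rng(0)

def graphon_a(u, v):
    # A smooth, distance-decaying graphon (assumed example).
    return 0.8 * np.exp(-3.0 * np.abs(u - v))

def graphon_b(u, v):
    # A rank-one graphon (assumed example).
    return u * v

def sample_graph(n, rho_n=0.5, mix=(0.5, 0.5), noise=0.05):
    """Pick one graphon from the mixture, draw latent positions
    U_i ~ Uniform[0, 1], connect i~j with probability rho_n * W(U_i, U_j),
    then flip each edge independently with probability `noise`."""
    W = graphon_a if rng.random() < mix[0] else graphon_b
    U = rng.uniform(size=n)
    P = rho_n * W(U[:, None], U[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                   # undirected, no self-loops
    flips = np.triu(rng.uniform(size=(n, n)) < noise, 1)
    A = np.abs(A - (flips + flips.T))             # edge noise: flip entries
    return A, U

def mpnn_layer(A, X, Wmat):
    """One message-passing step: mean-aggregate neighbor features, then a
    linear map and ReLU (a generic MPNN layer, not a specific architecture)."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    H = (A @ X) / deg
    return np.maximum(H @ Wmat, 0.0)

if __name__ == "__main__":
    n, d = 200, 8
    A, U = sample_graph(n)
    X = rng.normal(size=(n, d))                    # node features (assumed)
    Wmat = rng.normal(size=(d, d)) / np.sqrt(d)    # layer weights (assumed)
    out = mpnn_layer(A, X, Wmat)
    print("nodes:", n, "mean degree:", A.sum() / n, "output shape:", out.shape)
```

In this sketch, the claim corresponds to the MPNN's output being stable across independent draws from `sample_graph` as `n` grows, so that training on one sample generalizes to unseen samples from the same mixture.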