Efficient Distributed Sampling Algorithms for Scalable Graph Neural Network Training
This work proposes matrix-based methods that reduce communication in the sampling step of distributed Graph Neural Network (GNN) training. The authors cast node-wise and layer-wise sampling as sparse matrix operations, enabling efficient distributed sampling on GPUs, as sketched below.
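As a rough illustration of the idea (a minimal sketch, not the authors' implementation), the code below expresses node-wise neighbor sampling with SciPy sparse matrices: a selection matrix Q extracts the minibatch's rows of the adjacency matrix A via one sparse-sparse product (SpGEMM), and each row of the product is then down-sampled to at most k neighbors. The function name `sample_nodewise` and all parameter names are hypothetical.

```python
import numpy as np
import scipy.sparse as sp

def sample_nodewise(A, batch, k, rng):
    """Node-wise sampling as sparse matrix ops (illustrative sketch):
    a selection matrix Q extracts the minibatch rows of A in one
    SpGEMM, then each row of the product is down-sampled to at most
    k nonzeros (sampled neighbors)."""
    n = A.shape[0]
    # Q[i, batch[i]] = 1 selects row batch[i] of A for seed node i.
    Q = sp.csr_matrix(
        (np.ones(len(batch)), (np.arange(len(batch)), batch)),
        shape=(len(batch), n),
    )
    P = (Q @ A).tocsr()  # frontier adjacency via a sparse-sparse product
    rows, cols = [], []
    for i in range(P.shape[0]):
        nbrs = P.indices[P.indptr[i]:P.indptr[i + 1]]
        if len(nbrs) == 0:
            continue  # isolated seed: nothing to sample
        keep = rng.choice(nbrs, size=min(k, len(nbrs)), replace=False)
        rows.extend([i] * len(keep))
        cols.extend(keep)
    return sp.csr_matrix(
        (np.ones(len(rows)), (rows, cols)), shape=P.shape
    )

# Example: sample up to 2 neighbors for each of 3 seed nodes.
rng = np.random.default_rng(0)
A = sp.random(100, 100, density=0.05, format="csr", random_state=0)
sampled = sample_nodewise(A, batch=np.array([3, 14, 59]), k=2, rng=rng)
```

Phrasing the sampling step as an SpGEMM, rather than as per-node neighbor lookups, is what makes it amenable to distributed sparse-matrix kernels and GPU execution, which is the property the paper exploits to cut communication.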