Theoretical Bounds on the Generalization Performance of Neural Belief Propagation Decoders
This paper presents new theoretical results that bound the generalization gap of neural belief propagation (NBP) decoders, defined as the difference between the empirical and expected bit-error rates. The bounds show how the generalization gap depends on the decoder complexity, the code parameters, the number of decoding iterations, and the size of the training dataset.
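As a sketch of the quantity being bounded (the notation below is illustrative, not taken from the paper): for a decoder with parameters $\theta$ trained on $m$ samples, the generalization gap can be written as

$$
\mathrm{gap}(\theta) \;=\; \Big|\, \underbrace{\frac{1}{m}\sum_{i=1}^{m} \mathrm{BER}(\theta; x_i)}_{\text{empirical bit-error rate}} \;-\; \underbrace{\mathbb{E}_{x \sim \mathcal{D}}\big[\mathrm{BER}(\theta; x)\big]}_{\text{expected bit-error rate}} \,\Big|,
$$

where $x_1,\dots,x_m$ are the training examples drawn from the channel/codeword distribution $\mathcal{D}$. Bounds of the kind described above typically control this gap in terms of the decoder's complexity, the code parameters, the number of BP iterations, and $m$.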