Simultaneous Linear Connectivity of Neural Networks Modulo Permutation
Neural networks exhibit permutation symmetry: reordering the neurons within any hidden layer, together with the corresponding rows and columns of the adjacent weight matrices, leaves the function the network computes unchanged. This symmetry contributes to the non-convexity of the networks' loss landscapes. Recent work has argued that permutation symmetries are the only sources of non-convexity, meaning there are essentially no loss barriers (increases in loss along the straight line in weight space) between trained networks once they are permuted appropriately. This work refines these arguments into three distinct claims of increasing strength and provides empirical evidence for the strongest claim.
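As a minimal illustration of the permutation symmetry itself (not taken from the paper; all names and dimensions are assumptions for the sketch), the following NumPy snippet permutes the hidden units of a small two-layer MLP, reordering the rows of the first layer's weights and the matching columns of the second layer's, and checks that the output is exactly unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 4, 8, 3  # hypothetical dimensions

# Weights of a two-layer MLP: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))
b2 = rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Permute the hidden neurons: reorder the rows of (W1, b1) and,
# to keep the function intact, the matching columns of W2.
perm = rng.permutation(d_hidden)
W1_p, b1_p = W1[perm], b1[perm]
W2_p = W2[:, perm]

# The permuted network computes exactly the same function.
x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1_p, b1_p, W2_p, b2))
```

Note that naively interpolating between a network and a permuted copy of itself generally does encounter a loss barrier, even though the endpoints compute the same function; this is the sense in which permutation symmetry makes the landscape non-convex.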