Minimizing Chebyshev Prototype Risk Reduces Overfitting in Deep Neural Networks
Minimizing the Chebyshev Prototype Risk, which bounds how far an example's feature similarity to its class prototype can deviate from its mean, reduces overfitting in deep neural networks.
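To make the idea concrete, here is a minimal illustrative sketch (not the paper's exact formulation) of a Chebyshev-style prototype risk: each class prototype is taken as the mean feature vector, and the risk is the per-class variance of each example's cosine similarity to its prototype. By Chebyshev's inequality, P(|S - E[S]| >= t) <= Var(S)/t^2, so driving this variance down tightens the bound on similarity deviation. The function names and the choice of cosine similarity are assumptions for illustration only.

```python
import numpy as np

def cosine_sim(feats, proto):
    # Cosine similarity between each row of `feats` and the vector `proto`;
    # the small epsilon guards against division by zero.
    return (feats @ proto) / (
        np.linalg.norm(feats, axis=1) * np.linalg.norm(proto) + 1e-12
    )

def chebyshev_prototype_risk(features, labels):
    """Illustrative sketch (assumed formulation, not the paper's exact loss):
    sum over classes of the variance of example-to-prototype cosine
    similarity. Chebyshev's inequality, P(|S - E[S]| >= t) <= Var(S)/t^2,
    means a smaller risk bounds similarity deviations more tightly."""
    risk = 0.0
    for c in np.unique(labels):
        class_feats = features[labels == c]
        proto = class_feats.mean(axis=0)       # class prototype = mean feature
        sims = cosine_sim(class_feats, proto)  # similarity of each example
        risk += sims.var()                     # deviation term for this class
    return risk
```

Used as an auxiliary penalty alongside the usual classification loss, a term like this encourages each class's features to cluster tightly around its prototype: tightly clustered features yield a near-zero risk, while diffuse features yield a larger one.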