The study introduces Kernel Normalization (KernelNorm) as an alternative to BatchNorm in convolutional neural networks. KNConvNets, built from KernelNorm and kernel-normalized convolutional (KNConv) layers, outperform their batch-normalized counterparts. Because KernelNorm's normalization units overlap, mirroring the convolution's receptive fields, it exploits spatial correlation during normalization, yielding faster convergence and higher accuracy. Experimental results demonstrate the effectiveness of KNResNets in image classification and semantic segmentation across various datasets.
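To make the overlapping-units idea concrete, below is a minimal sketch of a kernel-normalized convolution: each kernel-sized patch is standardized by its own mean and variance before the convolution weights are applied, so normalization units overlap exactly like the convolution's sliding windows. The class name `KNConv2d`, the hyperparameters, and the per-patch statistics (computed across channels and the patch's spatial extent) are illustrative assumptions here, not the paper's exact implementation, which may differ in details such as affine parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KNConv2d(nn.Module):
    """Sketch of a kernel-normalized convolution: standardize each
    overlapping, kernel-sized patch by its own statistics, then apply
    the convolution weights to the normalized patches."""

    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1,
                 padding=1, eps=1e-5):
        super().__init__()
        self.k, self.s, self.p, self.eps = kernel_size, stride, padding, eps
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch * kernel_size ** 2) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        n, _, h, w = x.shape
        # Extract overlapping kernel-sized patches: (N, C*k*k, L)
        patches = F.unfold(x, self.k, stride=self.s, padding=self.p)
        # Normalize each patch by its own mean/variance, computed over
        # the channels and spatial positions inside that patch
        mean = patches.mean(dim=1, keepdim=True)
        var = patches.var(dim=1, unbiased=False, keepdim=True)
        patches = (patches - mean) / torch.sqrt(var + self.eps)
        # Convolution expressed as a matrix multiply over the patches
        out = self.weight @ patches + self.bias[:, None]
        h_out = (h + 2 * self.p - self.k) // self.s + 1
        w_out = (w + 2 * self.p - self.k) // self.s + 1
        return out.view(n, -1, h_out, w_out)

# Usage: drop-in replacement for a 3x3 conv plus a normalization layer
x = torch.randn(2, 16, 32, 32)
y = KNConv2d(16, 32)(x)
print(y.shape)  # torch.Size([2, 32, 32, 32])
```

Note that, unlike BatchNorm, the statistics here are independent of the batch, which is consistent with the reported gains in settings where batch statistics are unreliable.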
The research compares KNResNets against BatchNorm, GroupNorm, LayerNorm, and LocalContextNorm counterparts. KNResNets consistently outperform these competitors in accuracy, convergence rate, and generalizability. The study also examines the computational efficiency and memory usage of KNResNets relative to batch-normalized models.
Furthermore, the study highlights the potential applications of Kernel Normalization beyond ResNets, showcasing its effectiveness in architectures such as ConvNeXt. Future work may focus on optimizing the implementation of KNResNets for even greater efficiency and scalability.