The study introduces Kernel Normalization (KernelNorm) as an alternative to BatchNorm in convolutional neural networks. Unlike BatchNorm, which computes statistics across the batch dimension, KernelNorm normalizes each kernel-sized, potentially overlapping window of the input with its own mean and variance, much as a pooling layer slides over its input. By incorporating KernelNorm and kernel-normalized convolutional (KNConv) layers, KNConvNets achieve superior performance compared to their BatchNorm counterparts. The overlapping normalization units of KernelNorm exploit the spatial correlation among neighboring elements during normalization, leading to faster convergence and higher accuracy. Experimental results demonstrate the effectiveness of KNResNets in image classification and semantic segmentation tasks across various datasets.
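To make the mechanism concrete, here is a minimal PyTorch sketch of a kernel-normalized convolution. The class name `KNConv2d`, the unfold-based implementation, and the default hyperparameters are illustrative assumptions, not the authors' released code; the sketch only shows the core idea of standardizing each overlapping input window with its own statistics before applying the convolution weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KNConv2d(nn.Module):
    """Illustrative kernel-normalized convolution (hypothetical sketch):
    every kernel-sized, possibly overlapping input window is normalized
    with its own mean/variance, then the conv weights are applied."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1,
                 padding=1, eps=1e-5):
        super().__init__()
        self.kernel_size, self.stride, self.padding = kernel_size, stride, padding
        self.eps = eps
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch * kernel_size * kernel_size) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        n, c, h, w = x.shape
        # (N, C*k*k, L): each column holds one overlapping input window
        cols = F.unfold(x, self.kernel_size, stride=self.stride,
                        padding=self.padding)
        # Normalize each window with its own mean and variance
        mu = cols.mean(dim=1, keepdim=True)
        var = cols.var(dim=1, unbiased=False, keepdim=True)
        cols = (cols - mu) / torch.sqrt(var + self.eps)
        # Apply convolution weights to the normalized windows
        out = self.weight @ cols + self.bias[:, None]   # (N, out_ch, L)
        oh = (h + 2 * self.padding - self.kernel_size) // self.stride + 1
        ow = (w + 2 * self.padding - self.kernel_size) // self.stride + 1
        return out.view(n, -1, oh, ow)

# Usage: a kernel-normalized conv layer behaves like a drop-in Conv2d
x = torch.randn(2, 16, 32, 32)
y = KNConv2d(16, 32)(x)        # -> torch.Size([2, 32, 32, 32])
```

Because the overlapping windows share pixels, nearby output positions are normalized with correlated statistics, which is the spatial-correlation property the study credits for the faster convergence of KNConvNets.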
The research compares the performance of KNResNets with BatchNorm, GroupNorm, LayerNorm, and LocalContextNorm counterparts. KNResNets consistently outperform these competitors in accuracy, convergence rate, and generalizability. The study also examines the computational efficiency and memory usage of KNResNets relative to batch-normalized models.
Furthermore, the study highlights potential applications of KernelNorm beyond ResNets, showcasing its effectiveness in architectures such as ConvNeXt. Future work may focus on optimizing the implementation of KNResNets for greater efficiency and scalability.
Key insights extracted from the paper by Reza Nasirig... at arxiv.org, 03-06-2024.
Source: https://arxiv.org/pdf/2205.10089.pdf