The authors propose A&B BNN, a new binary neural network architecture that eliminates all multiplication operations at inference time while achieving results competitive with state-of-the-art models on various datasets. The approach combines a mask layer with a quantized RPReLU structure to yield a hardware-friendly network architecture.
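The summary above does not spell out how a BNN avoids multiplication, so here is a minimal sketch of the standard trick such networks rely on: when weights and activations are constrained to {-1, +1} and bit-packed, a dot product reduces to XNOR plus a popcount. This is a generic illustration, not the paper's exact mask layer or quantized RPReLU; the function name and encoding are assumptions for the example.

```python
def binary_dot(a_bits: int, w_bits: int, n: int) -> int:
    """Dot product of two length-n vectors over {-1, +1}, each packed
    into an integer (bit 1 encodes +1, bit 0 encodes -1), computed
    with only bit operations -- no multiplications.

    Hypothetical helper for illustration; not from the A&B BNN paper.
    """
    mask = (1 << n) - 1
    xnor = ~(a_bits ^ w_bits) & mask      # bit is 1 wherever the signs agree
    matches = bin(xnor).count("1")        # popcount: number of agreements
    return 2 * matches - n                # agreements minus disagreements
```

For example, packing a = [+1, -1, +1] as 0b101 and w = [+1, +1, -1] as 0b110 gives `binary_dot(0b101, 0b110, 3) == -1`, matching the ordinary dot product 1 - 1 - 1.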