Core Concepts
This paper introduces MA-DARTS, a DARTS-based neural architecture search algorithm that uses multi-dimensional attention to design accurate, energy-efficient Spiking Neural Networks (SNNs), jointly optimizing the network structure and reducing the number of neuron spikes.
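To make the coupling between architecture search and spike efficiency concrete, the sketch below is a minimal, hypothetical illustration rather than the authors' implementation: a DARTS-style mixed operation with learnable architecture weights, a leaky integrate-and-fire neuron unrolled over two timesteps, and a single-level training loss that adds a spike-count penalty (DARTS itself uses a bi-level scheme, omitted here for brevity). The candidate operations, the surrogate gradient, and the penalty weight lambda_spike are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): DARTS-style mixed op + LIF dynamics
# with a spike-count penalty in the loss. All hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with a straight-through surrogate gradient."""

    def __init__(self, tau=2.0, v_th=1.0):
        super().__init__()
        self.tau, self.v_th = tau, v_th

    def forward(self, x, v):
        v = v + (x - v) / self.tau            # leaky membrane update
        spike = (v >= self.v_th).float()      # hard threshold in the forward pass
        spike = spike + (v - v.detach())      # straight-through estimator for backprop
        v = v * (1.0 - spike.detach())        # reset neurons that fired
        return spike, v


class MixedOp(nn.Module):
    """Softmax-weighted sum of candidate operations, as in DARTS."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.Identity(),                                            # skip connection
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture weights

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))


class TinySearchNet(nn.Module):
    """Stem -> searched mixed op -> LIF dynamics over T timesteps -> classifier."""

    def __init__(self, in_ch=3, channels=16, num_classes=10, timesteps=2):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, channels, 3, padding=1, bias=False)
        self.mixed = MixedOp(channels)
        self.lif = LIFNeuron()
        self.head = nn.Linear(channels, num_classes)
        self.timesteps = timesteps

    def forward(self, x):
        feat = self.stem(x)                   # direct encoding: same input every timestep
        v = torch.zeros_like(feat)
        spike_count, rate = 0.0, 0.0
        for _ in range(self.timesteps):
            s, v = self.lif(self.mixed(feat), v)
            spike_count = spike_count + s.sum()   # total spikes, to be penalized
            rate = rate + s
        rate = rate / self.timesteps
        logits = self.head(rate.mean(dim=(2, 3)))  # global average pool, then classify
        return logits, spike_count


model = TinySearchNet()
images, labels = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
logits, spikes = model(images)
lambda_spike = 1e-6                           # assumed accuracy/spike trade-off weight
loss = F.cross_entropy(logits, labels) + lambda_spike * spikes
loss.backward()                               # gradients reach both weights and alphas
```

The spike penalty is what drives the search toward the low spike counts reported under Stats; a larger lambda_spike would trade accuracy for fewer spikes.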
Stats
MA-DARTS achieves 94.40% accuracy on CIFAR10 and 76.52% accuracy on CIFAR100 with 64 initial channels and 2 timesteps.
On CIFAR10, the model's spike count stabilizes at approximately 110K spikes on the validation set and 100K spikes on the training set.
The ECA-based attention function improves accuracy by 0.51% on CIFAR10 and 0.82% on CIFAR100.
The CBAM-based attention function improves accuracy by 0.61% on CIFAR10 and 1.52% on CIFAR100.
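For context on the two attention functions above: ECA (Efficient Channel Attention) rescales feature channels using a 1D convolution over a pooled channel descriptor, while CBAM stacks channel and spatial attention. The snippet below is a generic ECA-style block as commonly implemented, not the paper's exact module; the kernel size and its placement inside the SNN cell are assumptions.

```python
# Generic ECA-style channel attention (after Wang et al., ECA-Net); kernel size
# and placement are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class ECAAttention(nn.Module):
    def __init__(self, kernel_size=3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):                                    # x: [B, C, H, W]
        y = self.pool(x)                                     # [B, C, 1, 1] channel descriptor
        y = self.conv(y.squeeze(-1).transpose(-1, -2))       # 1D conv across channels: [B, 1, C]
        y = self.sigmoid(y).transpose(-1, -2).unsqueeze(-1)  # back to [B, C, 1, 1]
        return x * y                                         # rescale each channel


features = torch.randn(2, 16, 32, 32)
attended = ECAAttention()(features)          # same shape, channel-reweighted
```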