This paper proposes a novel spiking self-attention mechanism, Dual Spike Self-Attention (DSSA), and introduces the SpikingResformer architecture to improve performance and energy efficiency in SNNs.
The proposed SpikingResformer combines ResNet and the Vision Transformer to improve the performance and energy efficiency of SNNs.
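The summaries above describe a spike-based self-attention mechanism. As a rough illustration of the general idea (not the paper's exact DSSA formulation), the sketch below binarizes queries, keys, and values into spikes so the attention map reduces to accumulations of binary events, with no softmax; all weights, thresholds, and scales here are hypothetical.

```python
import numpy as np

def heaviside(x, thresh=1.0):
    # Spike generation: fire (1) when the input crosses the threshold.
    return (x >= thresh).astype(np.float32)

def spiking_self_attention(x, wq, wk, wv, scale=0.25):
    # Hypothetical sketch: queries/keys/values are binarized into spike
    # matrices, so the attention scores are integer spike correlations
    # computable with additions only (no softmax, no exponentials).
    q = heaviside(x @ wq)
    k = heaviside(x @ wk)
    v = heaviside(x @ wv)
    attn = (q @ k.T) * scale          # spike-count correlations, scaled
    return heaviside(attn @ v * scale)

rng = np.random.default_rng(0)
x = rng.random((4, 8))                # 4 tokens, 8 features
wq, wk, wv = (rng.random((8, 8)) for _ in range(3))
out = spiking_self_attention(x, wq, wk, wv)
print(out.shape)  # (4, 8) binary spike output
```

Because every intermediate tensor is binary, the matrix products count coincident spikes, which is the property that makes such attention cheap on neuromorphic hardware.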
Spiking neural networks offer an energy-efficient alternative to deep learning, and Spyx enables optimal hardware utilization when optimizing SNNs.
Spiking neural networks (SNNs) and spiking neural P systems (SNPS) are compared, with a focus on learning algorithms and real-life applications.
This paper introduces the Parallel Resonate and Fire (PRF) neuron, a novel approach that improves long-sequence learning in spiking neural networks by enabling parallel training and better capturing long-range dependencies.
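The key property that enables parallel training of resonate-and-fire neurons is that the membrane recurrence is linear, so the whole membrane trace equals a convolution of the input with a complex decay kernel. The sketch below illustrates this equivalence; the decay, frequency, and threshold values are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def prf_membrane_parallel(inputs, b=-0.1, omega=2.0, dt=0.1):
    # Resonate-and-fire dynamics: u[t] = a * u[t-1] + I[t], with complex
    # decay a = exp((b + i*omega) * dt). Because the recurrence is linear,
    # u[t] = sum_k a**(t-k) * I[k], i.e. a convolution with the kernel
    # [1, a, a^2, ...] -- so every timestep can be computed in parallel
    # instead of stepping through the sequence one input at a time.
    T = len(inputs)
    a = np.exp((b + 1j * omega) * dt)
    kernel = a ** np.arange(T)
    return np.convolve(inputs, kernel)[:T]   # truncate to sequence length

def prf_spikes(u, thresh=0.5):
    # Spikes are emitted when the real part of the membrane crosses threshold.
    return (u.real >= thresh).astype(np.float32)

I = np.sin(np.linspace(0, 6, 60))            # toy input current
u = prf_membrane_parallel(I)
s = prf_spikes(u)
```

The convolution form computes the same trace as the sequential update `u = a * u + x`, which is what lets such neurons train in parallel on long sequences.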
This paper introduces P-SpikeSSM, a novel spiking neural network architecture that leverages probabilistic spiking state space models to efficiently address long-range dependencies in sequence learning tasks, outperforming traditional architectures in terms of accuracy and computational efficiency.
This paper proposes a scalable probabilistic spiking learning framework for spiking neural networks (SNNs), a promising alternative to conventional neural networks with higher computational efficiency and biological plausibility, leveraging state-space model (SSM) principles to effectively handle sequence learning tasks with long-range dependencies.
This paper proposes P-SpikeSSM, a probabilistic spiking learning framework that overcomes the limitations of conventional LIF-neuron-based SNN models and effectively handles long-range dependencies. P-SpikeSSM treats spike generation probabilistically on top of a state-space model and enables parallel computation, improving computational efficiency. It also introduces a spike mixer block and a ClampFuse layer to improve information transfer between neurons and effectively capture complex dependencies.
This paper proposes a novel spiking neural network (SNN) architecture based on probabilistic spiking state-space models (P-SpikeSSM) for sequence learning tasks with long-range dependencies, surpassing existing SNN and conventional neural network models in both accuracy and computational efficiency.
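The P-SpikeSSM summaries describe two ingredients: a linear state-space recurrence and probabilistic (rather than hard-threshold) spike emission. A minimal sketch of that combination, with hypothetical matrices and a Bernoulli emission that stands in for the paper's probabilistic spiking mechanism:

```python
import numpy as np

def p_spike_ssm(x, A, B, C, rng):
    # Hypothetical sketch: a linear state-space recurrence h[t] = A h[t-1]
    # + B x[t] drives a Bernoulli spike emission, so whether a neuron fires
    # is stochastic (sampled from a probability) rather than a hard
    # threshold on the membrane state.
    h = np.zeros(A.shape[0])
    spikes = []
    for x_t in x:
        h = A @ h + B * x_t                # linear SSM state update
        p = 1.0 / (1.0 + np.exp(-C @ h))   # sigmoid spike probability
        spikes.append(rng.random() < p)    # sample a binary spike
    return np.array(spikes, dtype=np.float32)

rng = np.random.default_rng(1)
n = 4
A = np.diag(np.full(n, 0.9))               # stable diagonal dynamics
B = rng.standard_normal(n)
C = rng.standard_normal(n)
x = np.sin(np.linspace(0, 8, 100))         # toy input sequence
s = p_spike_ssm(x, A, B, C, rng)
```

The loop is shown sequentially for clarity; because the state update is linear, it admits the same parallel (convolution/scan) evaluation that motivates SSM-based sequence models.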