The content describes a novel approach for processing neuromorphic sensor data, which encodes environmental changes as asynchronous event streams. The key challenges in modeling such event streams are their asynchronous, irregularly timed arrival and the length of the resulting sequences.
The authors propose using linear state-space models (SSMs) as the core of their approach, called Event-SSM. SSMs can model long-range dependencies effectively and be parallelized efficiently along the sequence dimension.
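The parallelization property mentioned above comes from the fact that a linear recurrence can be phrased as a scan with an associative operator. The toy sketch below (our own illustration, not the authors' code) shows the sequential recurrence and an equivalent formulation over composable affine maps; because the combine operator is associative, a framework could evaluate it with a parallel scan in logarithmic depth.

```python
import numpy as np

def sequential_ssm(A, B, u):
    """Run the linear recurrence x_k = A x_{k-1} + B u_k step by step."""
    x = np.zeros(A.shape[0])
    xs = []
    for u_k in u:
        x = A @ x + B @ u_k
        xs.append(x)
    return np.stack(xs)

def combine(f, g):
    """Associative operator: compose two affine maps x -> A x + b."""
    A1, b1 = f
    A2, b2 = g
    return (A2 @ A1, A2 @ b1 + b2)

def scan_ssm(A, B, u):
    """Same recurrence via a cumulative scan over (A_k, b_k) pairs.
    Written sequentially here for clarity; associativity of `combine`
    is what would let a parallel scan split the work across the sequence."""
    elems = [(A, B @ u_k) for u_k in u]
    acc = elems[0]
    xs = [acc[1]]
    for e in elems[1:]:
        acc = combine(acc, e)
        xs.append(acc[1])
    return np.stack(xs)
```

Both functions produce the same state trajectory; the scan form is the one that scales to long event streams on parallel hardware.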
To handle the asynchronous nature of the event data, the authors introduce a novel discretization method for SSMs that integrates each event independently, without relying on regular time steps. This allows the model to process the event-stream directly in an event-by-event fashion.
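The core idea can be sketched as follows: each event carries a timestamp, and the state is advanced by the elapsed gap since the previous event rather than by a fixed step size. The snippet below is a minimal illustration under our own assumptions (a diagonal continuous-time SSM with a zero-order-hold style discretization; the function names are ours, not the paper's API).

```python
import numpy as np

def event_ssm_step(x, a, b, u_k, dt_k):
    """Integrate one event of a diagonal SSM x'(t) = a*x(t) + b*u(t):
    decay the state over the gap dt_k, then inject the event's input."""
    a_bar = np.exp(a * dt_k)        # elementwise exponential (diagonal A)
    b_bar = (a_bar - 1.0) / a * b   # ZOH-style input scaling
    return a_bar * x + b_bar * u_k

def run_events(a, b, events):
    """events: list of (timestamp, input); timestamps need not be regular."""
    x = np.zeros_like(a)
    t_prev = events[0][0]
    for t_k, u_k in events:
        x = event_ssm_step(x, a, b, u_k, t_k - t_prev)
        t_prev = t_k
    return x
```

Because each step depends only on the per-event time gap, the stream is consumed event by event, with no binning into frames or fixed time grids.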
The authors evaluate their Event-SSM model on three neuromorphic datasets - Spiking Heidelberg Digits, Spiking Speech Commands, and DVS128 Gestures. They demonstrate state-of-the-art performance on these benchmarks, outperforming prior methods that rely on converting the event-streams into frames. Notably, their model achieves these results without using any convolutional layers, learning spatio-temporal representations solely from the recurrent SSM structure.
The authors also conduct an ablation study to show the importance of their proposed asynchronous discretization method compared to alternative approaches. Overall, this work presents a scalable and effective solution for processing neuromorphic sensor data, paving the way for wider adoption of event-based sensing in real-world applications.