A lightweight, causal spatiotemporal convolutional network that performs efficient online inference on event-based data for eye-tracking applications.
This survey reviews the AIS 2024 Event-Based Eye Tracking (EET) Challenge, which focused on developing efficient algorithms that process eye-movement data from event cameras to accurately predict the pupil center. The challenge aimed to advance eye-tracking technologies that are energy-efficient and suited to real-time applications in AR/VR and wearable healthcare devices.
MambaPupil, a novel bidirectional and selective recurrent model that effectively leverages temporal context for accurate and stable event-based eye tracking, outperforming state-of-the-art methods.
A hardware-software co-designed event-based eye-tracking system that leverages submanifold sparse convolutional neural networks to achieve sub-millisecond latency, low power consumption, and high precision.
FACET, a fast and accurate end-to-end neural network that directly outputs pupil ellipse parameters from event data, optimized for real-time extended reality (XR) applications.