Proceedings of Materials for Sustainable Development Conference (MAT-SUS) (NFM22)
DOI: https://doi.org/10.29363/nanoge.nfm.2022.208
Publication date: 11th July 2022
Event-based sensors are bio-inspired vision sensors that encode visual information as sparse, asynchronous events. Each event encodes a change in the log-luminosity at a given pixel location. As a consequence, event sensors capture information at extremely high temporal resolution and with high dynamic range, while keeping data rates and power consumption low.
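For concreteness, the sketch below shows a common software representation of such an event stream and the contrast-threshold condition under which an event is emitted. The (t, x, y, p) field layout and the threshold value are illustrative assumptions, not the interface of any particular sensor.

```python
import numpy as np

# A minimal sketch of an event record: timestamp t (microseconds),
# pixel coordinates (x, y), and polarity p (+1 / -1) giving the sign
# of the log-intensity change. Field names are assumptions for illustration.
event_dtype = np.dtype([("t", np.int64),
                        ("x", np.int16),
                        ("y", np.int16),
                        ("p", np.int8)])

def event_polarity(log_I_ref, log_I_now, threshold=0.2):
    """An event fires when the log-intensity change at a pixel exceeds
    a contrast threshold C: |log I(t) - log I(t_ref)| >= C.
    The threshold value 0.2 is an illustrative assumption."""
    delta = log_I_now - log_I_ref
    if abs(delta) >= threshold:
        return 1 if delta > 0 else -1  # polarity of the change
    return 0  # no event emitted
```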
Converting a sparse event stream into a dense representation allows classical AI methods, such as convolutional neural networks, to be applied to event data. This is a practical way to leverage the well-established AI ecosystem, and it outperforms classical frame-based camera methods in latency-critical and high-dynamic-range scenarios. However, the low-power and low-data properties of the camera are lost.
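A minimal sketch of one such conversion, assuming the (t, x, y, p) event tuples introduced above: events are accumulated into a two-channel histogram (one channel per polarity) that a standard CNN can consume. This is only one of several common dense representations (others include voxel grids and time surfaces).

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate a sparse event stream into a dense 2-channel frame:
    channel 0 counts positive-polarity events per pixel, channel 1
    counts negative-polarity events per pixel."""
    frame = np.zeros((2, height, width), dtype=np.float32)
    for t, x, y, p in events:
        channel = 0 if p > 0 else 1
        frame[channel, y, x] += 1.0  # per-pixel, per-polarity event count
    return frame

# Usage: three synthetic events on a 4x4 sensor.
events = [(10, 0, 0, +1), (25, 0, 0, -1), (40, 3, 2, +1)]
dense = events_to_frame(events, height=4, width=4)  # shape (2, 4, 4)
```

Note that the accumulation step discards the fine timing of individual events, which is exactly why the sensor's low-latency and low-data advantages are diluted by this approach.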
Neuromorphic AI will be the cornerstone of the next generation of event-based vision pipelines. Indeed, the asynchronous, ultra-low-power computation paradigms of neuromorphic architectures are a natural fit for processing event-based data. However, several research directions, on both the algorithmic and the hardware-implementation sides, remain open and need to be explored.
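To illustrate the event-driven paradigm, the toy leaky integrate-and-fire (LIF) neuron below updates its state only when an event arrives, decaying passively in between. It is a textbook model written as a sketch; the time constant, synaptic weight, and threshold are assumed values, and it does not correspond to any particular neuromorphic chip's neuron model.

```python
import math

def lif_spikes(event_times, tau=0.01, weight=0.5, threshold=1.0):
    """Toy leaky integrate-and-fire neuron driven by event timestamps
    (in seconds). The membrane potential decays exponentially between
    events and is incremented by a fixed weight on each incoming event;
    a spike is emitted on threshold crossing. Computation occurs only
    when an event arrives, which is the source of the paradigm's
    power efficiency."""
    v, last_t, spikes = 0.0, None, []
    for t in event_times:
        if last_t is not None:
            v *= math.exp(-(t - last_t) / tau)  # passive decay since last event
        v += weight  # integrate the incoming event
        if v >= threshold:
            spikes.append(t)
            v = 0.0  # reset after spiking
        last_t = t
    return spikes

# Usage: a burst of closely spaced events drives the neuron to spike.
print(lif_spikes([0.001, 0.002, 0.003, 0.050]))
```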