DOI: https://doi.org/10.29363/nanoge.matnec.2022.015
Publication date: 23rd February 2022
Neuromorphic Perception (and Computation) will induce a paradigm shift in robotics. Based on the biologically inspired, emerging concept of event-driven sensing and processing, it leads to better robots, able to acquire, transmit and process information only when needed, optimising the use of resources and enabling real-time, low-cost operation.
In this talk, I will describe our approach to implementing neuromorphic tactile sensing. The sparseness of tactile input over space and time calls for event-driven (ED) encoding, where the sensors are not continuously sampled but rather wake up upon stimulation. I will describe how we emulate the firing properties of biological tactile afferents. I will then describe different flavours of approaches to extracting information from neuromorphic vision sensors that can be useful in robotics, spanning from low-latency, real-time computer vision tasks to biologically inspired architectures. The neuromorphic approach will greatly enhance computer vision for robots that have to interact with objects and people in real time, adapting to sudden changes, failures and uncertainties.
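The event-driven encoding idea above can be illustrated with a minimal sketch: a leaky integrate-and-fire (LIF) model converts a sampled pressure signal into spike events, so that output is produced only while a stimulus is present. The model, its parameters (`tau`, `threshold`, `gain`) and the test stimulus are illustrative assumptions, not the actual afferent model used in this work.

```python
import numpy as np

def lif_encode(signal, dt=1e-3, tau=0.02, threshold=1.0, gain=100.0):
    """Convert a regularly sampled tactile signal into spike events
    with a leaky integrate-and-fire (LIF) neuron (illustrative only).

    Events are emitted only when the membrane potential crosses the
    threshold, so a silent sensor produces no output at all.
    """
    v = 0.0
    spike_times = []
    for i, x in enumerate(signal):
        # Forward-Euler leaky integration of the input drive.
        v += dt * (-v / tau + gain * x)
        if v >= threshold:
            spike_times.append(i * dt)  # record the event timestamp
            v = 0.0                     # reset after firing
    return spike_times

# A brief pressure pulse surrounded by silence: spikes occur only
# during the stimulus, illustrating event-driven sparseness.
t = np.arange(0.0, 0.5, 1e-3)
pressure = np.where((t > 0.1) & (t < 0.2), 1.0, 0.0)
events = lif_encode(pressure)
```

In contrast to fixed-rate sampling, the number of events here scales with stimulus activity rather than with recording time, which is the resource saving the abstract refers to.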
This work has been partially supported by the European Union’s Horizon 2020 research and innovation programme (MSCA-ETN NeuTouch, Grant agreement No. 813713).