Proceedings of Materials for Sustainable Development Conference (MAT-SUS) (NFM22)
DOI: https://doi.org/10.29363/nanoge.nfm.2022.259
Publication date: 11th July 2022
The recent discovery of surrogate gradient learning (SGL) has been a game changer for the more biologically inspired spiking neural networks (SNNs). In short, by circumventing the non-differentiability of the spiking nonlinearity, it reconciles SNNs with backpropagation, the algorithm that caused the deep learning revolution. SNNs and conventional artificial neural networks (ANNs) can now be trained with the same algorithm and the same auto-differentiation-enabled tools (e.g. PyTorch or TensorFlow). This bridges the gap between SNNs and ANNs and makes comparisons between them fairer.
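To illustrate the idea, the following is a minimal sketch in PyTorch (not the specific formulation used in the works discussed here): the forward pass applies the hard, non-differentiable spiking threshold, while the backward pass substitutes a smooth surrogate derivative, in the spirit of the fast-sigmoid surrogate of Zenke & Ganguli (2018). The class name, threshold, and steepness constant are illustrative choices.

import torch

class SpikeFunction(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential, threshold=1.0):
        ctx.save_for_backward(membrane_potential)
        ctx.threshold = threshold
        # Non-differentiable step: emit a spike (1) when the potential crosses the threshold.
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Surrogate derivative: replaces the zero-almost-everywhere derivative
        # of the step function with a smooth, localized function of the potential.
        scale = 10.0  # steepness of the surrogate; an illustrative value
        surrogate = 1.0 / (scale * (membrane_potential - ctx.threshold).abs() + 1.0) ** 2
        return grad_output * surrogate, None

spike = SpikeFunction.apply

# Usage: membrane potentials flow through `spike` like any differentiable op,
# so standard backpropagation and PyTorch optimizers can train the SNN end to end.
v = torch.randn(8, requires_grad=True)
s = spike(v)
s.sum().backward()
print(v.grad)

Because the surrogate only intervenes in the backward pass, the network still emits binary spikes at inference time, which is what makes deployment on event-based hardware energy-efficient.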
In this talk, I will review recent works in which we show that SNNs trained with SGL can solve a broad range of problems, just like ANNs, but potentially with orders of magnitude less energy once implemented on event-based hardware. These problems include image and sound classification, depth and optic flow estimation from event-based cameras, encrypted Internet traffic classification, and epileptic seizure detection from electroencephalograms, among others.