DOI: https://doi.org/10.29363/nanoge.neumatdecas.2023.019
Publication date: 9th January 2023
The success of Artificial Neural Networks relies on end-to-end training with BackProp, but Biological Neural Networks use local learning rules - and understanding these biological rules should also provide insight for designing neuromorphic materials and devices. I will review Hebbian two-factor rules as well as their generalization to three-factor rules for action learning [1]. I will then present our recent work on multi-factor learning rules that enabled us to learn representations in networks of up to six layers [2].
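For orientation, a schematic form of these rules (an illustrative sketch, not the specific formulations discussed in the talk or in [1]): a two-factor Hebbian rule changes a synapse based on pre- and postsynaptic activity alone, whereas a three-factor rule additionally gates the change with a global modulatory signal such as reward or surprise:

\Delta w_{ij} \propto \mathrm{pre}_j \cdot \mathrm{post}_i \qquad \text{(two-factor Hebbian)}

\Delta w_{ij} \propto M(t)\, e_{ij}(t) \qquad \text{(three-factor rule)}

Here $e_{ij}(t)$ denotes an eligibility trace built from the pre- and postsynaptic factors, and $M(t)$ is the third, neuromodulatory factor; both quantities are local or broadcast signals, so no backpropagated error is required.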
The slogan of my biological modeling work is 'No BackProp, Please!' - and we can discuss after the talk whether this should also be the slogan for Neuromorphic Materials and Devices.
[2] B. Illing, J. Ventura, G. Bellec, and W. Gerstner, "Local plasticity rules can learn deep representations using self-supervised contrastive predictions," 35th Conference on Neural Information Processing Systems (NeurIPS 2021).