DOI: https://doi.org/10.29363/nanoge.matnec.2022.017
Publication date: 23rd February 2022
Resistive random access memory (RRAM) technologies, often referred to as memristors, hold great promise for implementing novel in-memory computing systems for massively parallel, low-power and low-latency computation. Compared to conventional systems, these solutions offer promising advantages in terms of energy efficiency and computing power when processing AI workloads. This talk will first present the role of RRAM in enabling resource-constrained neuromorphic hardware. I will present an event-driven object localization system that couples state-of-the-art piezoelectric micromachined ultrasound transducer (pMUT) sensors to a neuromorphic, resistive-memory-based computational map. Second, I will present an approach in which the random nature of resistive memories, instead of being mitigated, is fully exploited to implement a form of probabilistic learning in Bayesian neural networks. The inherent variability of resistive memories can naturally implement the sampling step in the Metropolis-Hastings Markov Chain Monte Carlo algorithm. This approach constitutes a new path to bring learning to the edge.
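As a rough illustration of the idea (a minimal sketch, not the implementation presented in the talk), the snippet below runs Metropolis-Hastings sampling over a single synaptic weight of a toy Bayesian model; the Gaussian proposal noise drawn from a software pseudo-random generator here stands in for the intrinsic variability of RRAM devices, which is an assumption of this sketch rather than a detail taken from the abstract.

```python
import numpy as np

# Minimal Metropolis-Hastings sketch: sample the posterior over one weight w
# of a toy linear model. In the RRAM-based scheme, the Gaussian proposal noise
# below would instead be supplied by device read/programming variability.

rng = np.random.default_rng(0)

# Toy data: y = w_true * x + observation noise (sigma = 0.2)
x = rng.uniform(-1, 1, size=50)
y = 2.0 * x + rng.normal(0, 0.2, size=50)

def log_posterior(w):
    # Gaussian likelihood with sigma = 0.2, standard-normal prior on w
    log_lik = -0.5 * np.sum((y - w * x) ** 2) / 0.2**2
    log_prior = -0.5 * w**2
    return log_lik + log_prior

w = 0.0          # initial weight value
samples = []
for _ in range(5000):
    # Proposal step: random perturbation of the current weight
    w_prop = w + rng.normal(0, 0.1)
    # Metropolis acceptance rule: accept with probability
    # min(1, p(w_prop | data) / p(w | data))
    if np.log(rng.uniform()) < log_posterior(w_prop) - log_posterior(w):
        w = w_prop
    samples.append(w)

# Posterior mean after discarding burn-in; should be close to 2.0
print("posterior mean of w:", np.mean(samples[1000:]))
```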