Tuesday, April 19, 2022

Lecture 7H (2022-04-19): From Spiking Neural Networks to Continual Learning and Beyond

In this lecture, we continue our discussion of neuromorphic engineering, with a focus on spiking neural network (SNN) architectures. We review the basic dynamics of an "action potential" and mention a few ODE models of those dynamics (Hodgkin–Huxley, leaky integrate-and-fire, etc.). Modern SNN platforms, such as SpiNNaker and the more recent IBM TrueNorth and Intel Loihi chips, implement hardware and software emulations of these dynamics in an effort to simulate large networks of spiking neurons (as opposed to the mathematical abstractions of neurons used in more traditional ANNs). We also discuss "neuromemristive" platforms that use hysteretic "memristors" as very simple artificial synapses, and we mention an example from Boyn et al. (2017, Nature Communications) of a crossbar architecture of such memristors that accomplishes unsupervised learning for pattern recognition/classification. We then move on to how backpropagation (gradient descent) can now be used for supervised learning on spiking neural networks, typically by substituting a smooth "surrogate" derivative for the non-differentiable spike, which simplifies training significantly. That brings us to state-of-the-art and beyond-state-of-the-art nature-inspired directions, such as using neural network "sleep" to improve continual learning and introducing "neuromodulation" to add further richness to artificial neural networks.
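For readers who want to experiment, here is a minimal sketch of the leaky integrate-and-fire dynamics mentioned above, integrated with forward Euler in plain NumPy. All parameter values are illustrative choices of mine, not numbers from the lecture.

import numpy as np

# Leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# All parameter values below are illustrative, not from the lecture.
dt = 0.1e-3          # time step (s)
T = 0.5              # total simulation time (s)
tau_m = 20e-3        # membrane time constant (s)
v_rest = -65e-3      # resting potential (V)
v_thresh = -50e-3    # spike threshold (V)
v_reset = -70e-3     # reset potential after a spike (V)
R_m = 10e6           # membrane resistance (ohms)
I_in = 2e-9          # constant input current (A)

steps = int(T / dt)
v = v_rest
spike_times = []
for k in range(steps):
    # dv/dt = (-(v - v_rest) + R_m * I_in) / tau_m
    v += dt * (-(v - v_rest) + R_m * I_in) / tau_m
    if v >= v_thresh:           # threshold crossing -> emit a spike
        spike_times.append(k * dt)
        v = v_reset             # hard reset (no refractory period modeled)

print(f"{len(spike_times)} spikes in {T} s "
      f"(mean rate {len(spike_times) / T:.1f} Hz)")

With these constants the steady-state potential (-45 mV) sits above threshold, so the neuron fires tonically at roughly 30 Hz; drop I_in below about 1.5 nA and it never spikes, which is the basic threshold behavior the ODE models capture.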
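The unsupervised learning in memristive crossbars like the Boyn et al. device is driven by spike-timing-dependent plasticity (STDP): a synapse is strengthened when the presynaptic spike precedes the postsynaptic one and weakened otherwise. Here is a toy pair-based STDP rule on a small "crossbar" weight matrix; the exponential-window rule and all constants are my own illustrative choices, not the ferroelectric-domain model from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy pair-based STDP on a small "crossbar" weight matrix.
# Rows = presynaptic inputs, columns = postsynaptic neurons.
n_pre, n_post = 16, 4
w = rng.uniform(0.0, 1.0, size=(n_pre, n_post))
A_plus, A_minus = 0.02, 0.021   # potentiation / depression amplitudes
tau = 20e-3                     # STDP time constant (s)

def stdp_update(w, t_pre, t_post):
    """Update weights given one pre-spike time per row and
    one post-spike time per column (pairwise rule)."""
    dt = t_post[None, :] - t_pre[:, None]    # t_post - t_pre per synapse
    dw = np.where(dt > 0,
                  A_plus * np.exp(-dt / tau),   # pre before post: potentiate
                  -A_minus * np.exp(dt / tau))  # post before pre: depress
    return np.clip(w + dw, 0.0, 1.0)            # conductances stay bounded

# One toy "presentation": random spike times within a 50 ms window.
t_pre = rng.uniform(0, 50e-3, size=n_pre)
t_post = rng.uniform(0, 50e-3, size=n_post)
w = stdp_update(w, t_pre, t_post)
print(w.round(3))

In the hardware version, each weight is the conductance of one memristor at a crossbar junction, and the update is realized physically by the overlap of pre- and post-synaptic voltage pulses rather than by software.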
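On the backpropagation point: the usual trick is to keep the hard threshold on the forward pass but substitute a smooth "surrogate" derivative on the backward pass, since the true derivative of a spike is a Dirac delta. A minimal PyTorch sketch using the fast-sigmoid surrogate (the slope constant is an illustrative value of mine):

import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike on the forward pass; fast-sigmoid
    surrogate derivative on the backward pass."""
    slope = 10.0  # surrogate sharpness (illustrative value)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()      # spike if potential exceeds threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d(spike)/dv ~ 1 / (1 + slope*|v|)^2, a smooth stand-in for the
        # Dirac delta of the true Heaviside derivative
        return grad_output / (1.0 + SurrogateSpike.slope * v.abs()) ** 2

spike = SurrogateSpike.apply

# Tiny check: gradients now flow through the spike nonlinearity.
v = torch.randn(5, requires_grad=True)
spike(v).sum().backward()
print(v.grad)

Because the forward pass is unchanged, the trained network still produces genuine binary spikes; only the credit assignment is smoothed, which is what lets standard gradient-descent tooling train SNNs end to end.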

Whiteboard notes for this lecture can be found at: https://www.dropbox.com/s/31ti6sni3zpkw64/IEE598-Lecture7H-2022-04-19-From_Spiking_Neural_Networks_to_Continual_Learning_and_Beyond.pdf?dl=0


