In this lecture, we introduce spiking neural networks and neuromorphic computing, starting with a refresher on the biological neuron and an introduction to Carver Mead, one of the founders of modern neuromorphic computing. We discuss the Leaky Integrate-and-Fire (LIF) model of a spiking neuron and spike-timing-dependent plasticity (STDP) for (unsupervised) learning in these neurons (temporary/working memory). We focus on rate coding and show examples of rate-coded signals as inputs to and outputs from LIF neurons. We introduce SNN implementations from SpiNNaker to IBM TrueNorth to Intel Loihi, along with a memristor crossbar array example published in 2017 that demonstrates unsupervised STDP learning. We then pivot to show that Hebbian updating in traditional ANNs can also perform this task (albeit possibly not as efficiently as an SNN implementation). We close with some comments on the possible future of SNNs. Interactive widgets referenced in this lecture can be found at:
- Spiking Neural Network Explorer: https://tpavlic.github.io/asu-bioinspired-ai-and-optimization/spiking_neural_networks/snn_explorer.html
- Memristor Crossbar Array Unsupervised STDP Learning (with Latent Inhibition): https://tpavlic.github.io/asu-bioinspired-ai-and-optimization/memristors/memristor_stdp_array.html
- ANN Unsupervised STDP Learning (with Latent Inhibition): https://tpavlic.github.io/asu-bioinspired-ai-and-optimization/hebbian_learning/hebbian_competitive_clustering.html
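To make the LIF model and rate coding discussed above concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python. The parameter names and values (`tau`, `v_thresh`, etc.) are illustrative assumptions, not taken from the lecture; the point is simply that a constant suprathreshold input produces a regular spike train whose rate grows with input amplitude, which is the essence of rate coding.

```python
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0, r=1.0):
    """Euler simulation of a leaky integrate-and-fire neuron.

    Membrane dynamics: dv/dt = (-(v - v_rest) + r * I(t)) / tau.
    When v crosses v_thresh, a spike is recorded and v resets.
    All parameter values here are illustrative, not from the lecture.
    Returns the membrane-potential trace and the list of spike times.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        v += dt * (-(v - v_rest) + r * i_in) / tau
        if v >= v_thresh:
            spikes.append(t)   # record spike time (in steps)
            v = v_reset        # reset membrane potential
        trace.append(v)
    return np.array(trace), spikes

# Rate coding: a stronger constant input yields a higher spike rate.
_, spikes_weak = lif_simulate(np.full(200, 1.5))
_, spikes_strong = lif_simulate(np.full(200, 3.0))
```

Comparing `len(spikes_weak)` and `len(spikes_strong)` shows the output firing rate increasing with input strength, mirroring the rate-coded examples in the Spiking Neural Network Explorer widget.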
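The pair-based STDP rule mentioned above can also be sketched in a few lines. The amplitudes and time constants below (`a_plus`, `tau_plus`, etc.) are illustrative assumptions; the qualitative shape is the standard one: a presynaptic spike shortly before a postsynaptic spike potentiates the synapse, while the reverse ordering depresses it.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (illustrative parameter values).

    dt > 0 (pre before post): potentiation, decaying with |dt|.
    dt < 0 (post before pre): depression, decaying with |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

# Pre fires 5 ms before post: synapse strengthens.
dw_pot = stdp_dw(5.0)
# Post fires 5 ms before pre: synapse weakens.
dw_dep = stdp_dw(-5.0)
```

Applying this update across a crossbar of synapses, with inhibition between output neurons, is the basic mechanism behind the memristor and Hebbian clustering widgets linked above.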