In this lecture, we continue our review of artificial neural networks (ANNs) and some of the nature-inspired ideas that motivated them. We move from feed-forward multi-layer perceptrons to more general topologies that skip layers or add recurrent connections (as in Recurrent Neural Networks, RNNs). This provides a brief introduction to Time-Delay Neural Networks (TDNNs) and an opportunity to connect these architectures to Finite Impulse Response (FIR) and Infinite Impulse Response (IIR) filtering from linear systems theory and signal processing. We then discuss how a generalized version of backpropagation, known as backpropagation through time (BPTT), can be used to train recurrent networks in much the same way that standard backpropagation trains multi-layer perceptrons. Next, we discuss a specialized RNN architecture known as Long Short-Term Memory (LSTM) and its motivations, and we close with a brief introduction to reservoir computing ("reservoir machines", "Echo-State Networks", ESNs), in which only a small subset of the total network is trained and yet which can perform a wide variety of the tasks that other "fully trained" RNNs can do.
Whiteboard notes for this lecture can be found at: https://www.dropbox.com/s/pa5ju9pxja38zxv/IEE598-Lecture7D-Notes.pdf?dl=0
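To make the reservoir-computing idea above a little more concrete, below is a minimal echo-state network sketch in Python with NumPy. It is only an illustration under assumed settings: the reservoir size, spectral-radius scaling of 0.9, washout length, ridge parameter, and the toy one-step-ahead sine-prediction task are choices made for this sketch, not values from the lecture. The point it demonstrates is that the input and recurrent weights stay fixed and random, so training reduces to fitting a single linear readout.

import numpy as np

rng = np.random.default_rng(0)

# Fixed, random "reservoir": these weights are never trained.
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))     # input weights (fixed)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))       # recurrent weights (fixed)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # scale spectral radius below 1 (echo-state property)

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))  # standard tanh state update
        states.append(x.copy())
    return np.array(states)

# Toy task (assumed for illustration): predict a sine wave one step ahead.
t = np.arange(0, 60, 0.1)
signal = np.sin(t)
u, y = signal[:-1], signal[1:]    # input sequence and one-step-ahead target

X = run_reservoir(u)
washout = 100                     # discard initial transient states
X, y = X[washout:], y[washout:]

# Train ONLY the linear readout, here by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("readout training MSE:", np.mean((pred - y) ** 2))

Everything except W_out above is generated once and left untouched, which is why an ESN can be "trained" with a single least-squares solve rather than gradient descent through time.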