Tuesday, April 5, 2022

Lecture 7D (2022-04-05): CNNs, Insect Brains, More Complex Neural Networks (TDNNs and RNNs)

In this lecture, we complete our discussion of how backpropagation enables gradient descent to train deep neural networks (feed-forward, multi-layer perceptrons in general). We then pivot to more regularized feed-forward architectures, like convolutional neural networks (CNNs), which combine convolutional layers with pooling layers and thereby simplify training while producing a low-dimensional feature set (relative to the dimensionality of the input). After a brief discussion of the feed-forward architecture of the insect/pancrustacean brain, we shift to time-delay neural networks (TDNNs) as an entry point into recurrent neural networks (RNNs) and reservoir machines, which we will pick up on next time.
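To make the convolution-plus-pooling idea concrete, here is a minimal sketch (not code from the lecture) of a 1-D convolutional layer followed by non-overlapping max pooling, showing how the pair extracts a local feature while shrinking the input dimension; the kernel and signal values are illustrative assumptions:

```python
# Minimal sketch: a 1-D "valid" convolution followed by max pooling,
# illustrating how a conv + pool pair shrinks the input dimension
# while extracting a local feature.

def conv1d(x, kernel):
    """Valid 1-D convolution (really cross-correlation, as in most CNNs)."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def max_pool1d(x, width):
    """Non-overlapping max pooling with the given window width."""
    return [max(x[i:i + width]) for i in range(0, len(x) - width + 1, width)]

signal = [0, 1, 3, 1, 0, -1, -3, -1, 0, 1]   # toy input, 10 samples
edge_kernel = [-1, 0, 1]                      # crude edge-detector weights
features = conv1d(signal, edge_kernel)        # 10 samples -> 8 features
pooled = max_pool1d(features, 2)              # 8 features -> 4 features
```

The same weights (the kernel) are reused at every input position, which is why a convolutional layer has far fewer trainable parameters than a fully connected layer of the same width; pooling then halves the remaining dimension.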


