Thursday, March 24, 2022

Lecture 7A (2022-03-24): Introduction to Neural Networks

In this lecture, we introduce artificial neural networks (ANNs) and the neurobiological foundations that inspired them. We start with a description of the multipolar neuron, with many synapses and one axon, and focus on chemical synapses between axons and dendrites. We cover resting potential, synaptic potentials, and (traveling/propagating) action potentials. We then transition to the simple artificial neuron model (the basis of modern ANNs), which computes a function of a weighted sum of inputs plus a bias term. The artificial neuron is portrayed as an alternative representation of generalized linear modeling (GLM) from statistics, with the activation function playing a similar role to the link function in GLM. We then discuss several common activation functions -- Heaviside (threshold), linear, rectified linear (ReLU), logistic (sigmoid), and hyperbolic tangent (tanh). Next time we will see how a single artificial neuron can be used for binary classification in linearly separable feature spaces, with a geometric interpretation of the weight vector as a line separating two classes of feature vectors.
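The artificial neuron and the activation functions mentioned above can be sketched in a few lines of pure Python. This is a minimal illustration, not the lecture's notation; the specific weights, bias, and inputs below are made up for the example:

```python
import math

def neuron(weights, bias, inputs, activation):
    """A single artificial neuron: activation(w . x + b)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Common activation functions covered in the lecture.
heaviside = lambda z: 1.0 if z >= 0 else 0.0       # threshold
linear    = lambda z: z
relu      = lambda z: max(0.0, z)                  # rectified linear
logistic  = lambda z: 1.0 / (1.0 + math.exp(-z))   # sigmoid
tanh      = math.tanh                              # hyperbolic tangent

# Illustrative example: two inputs, arbitrary weights and bias.
out = neuron([0.5, -0.25], 0.1, [1.0, 2.0], logistic)
```

Swapping the `activation` argument changes the neuron's behavior without touching the weighted sum, mirroring how the link function is chosen in a GLM.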

Whiteboard notes for this lecture can be found at: https://www.dropbox.com/s/vlrmqwatkerl7hf/IEE598-Lecture7A-2022-03-24-Introductoin_to_Neural_Networks.pdf?dl=0


