Tuesday, April 22, 2025

Lecture 7E (2025-04-22): Learning without a Teacher – Unsupervised and Self-Supervised Learning

This lecture covers unsupervised and self-supervised learning, focusing on how both brains and machines discover structure without external labels or rewards (akin to non-associative learning). We begin with examples of unsupervised learning, including clustering, principal component analysis, and autoencoders, and then explore how biological systems such as the insect olfactory pathway organize complex sensory input into compressed, low-dimensional codes. We take a detailed look at the structure of the honeybee brain, examining how floral odors are transformed through the antennal lobe’s glomerular code into organized neural representations. We then transition to self-supervised learning (akin to latent learning) by introducing predictive coding and sensorimotor prediction, highlighting how brains use internal models to anticipate and correct for incoming sensory input. Finally, we close by discussing how modern AI systems like GPT and BERT leverage self-supervised objectives to build rich internal representations from raw data.

Whiteboard notes for this lecture can be found at:
https://www.dropbox.com/scl/fi/qwezfleqplmxtiobfpoew/IEE598-Lecture7E-2025-04-22-Learning_without_a_Teacher-Unsupervised_and_Self-Supervised_Learning-Notes.pdf?rlkey=4k5o8j8no3s9x7xc5di676qz3&dl=0
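
To make the two ideas from the summary a bit more concrete, here is a minimal numpy sketch (not part of the lecture materials): it compresses synthetic high-dimensional "sensory" data into a low-dimensional code with PCA, and then fits a linear next-step predictor as a crude stand-in for a self-supervised, predictive-coding-style objective. The data, dimensions, and parameter values are hypothetical choices made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

# ---- Unsupervised learning: discover a low-dimensional code without labels ----
# Synthetic "sensory" data: 500 time steps of a 50-dimensional signal that secretly
# follows a 3-dimensional latent trajectory plus noise (loosely analogous to a
# high-dimensional odorant space being compressed into a small glomerular code).
n_steps, n_dims, n_latent = 500, 50, 3
latent = np.zeros((n_steps, n_latent))
for t in range(1, n_steps):
    # Slowly drifting latent state, so consecutive observations are related.
    latent[t] = 0.95 * latent[t - 1] + rng.normal(scale=0.3, size=n_latent)
mixing = rng.normal(size=(n_latent, n_dims))
X = latent @ mixing + 0.1 * rng.normal(size=(n_steps, n_dims))

# PCA via SVD of the mean-centered data: projecting onto the top principal
# components yields a compressed code that retains most of the variance.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (S ** 2) / np.sum(S ** 2)
code = Xc @ Vt[:n_latent].T  # low-dimensional representation of each observation
print("variance explained by top 3 components:", round(float(explained[:n_latent].sum()), 3))

# ---- Self-supervised learning: the data supply their own prediction targets ----
# Treat each observation's successor as the target (a crude stand-in for
# predictive-coding / next-token objectives). A linear predictor is fit by
# least squares; no external labels are used anywhere.
X_now, X_next = X[:-1], X[1:]
W, *_ = np.linalg.lstsq(X_now, X_next, rcond=None)
prediction_error = np.mean((X_now @ W - X_next) ** 2)
print("mean squared next-step prediction error:", round(float(prediction_error), 4))

The design choice here is deliberately simple: PCA and least squares stand in for the autoencoders and predictive models discussed in lecture, since both can be written in a few lines of numpy without any training loop, and neither step ever sees a label or reward signal.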




