Thursday, April 2, 2026

Lecture 5E/6A (2026-04-02): Parallel Tempering and Swarm Intelligence through Social Cohesion (Particle Swarm Optimization)

In this lecture, we finish our unit on physics-inspired ML and optimization by covering Parallel Tempering (PT), which runs multiple parallel Metropolis–Hastings MCMC samplers, each at a different fixed temperature (rather than using an annealing schedule, as in Simulated Annealing (SA)), and periodically swaps states between them. We then pivot toward motivating why certain problems, like optimizing the high-dimensional weights of neural networks, may not be well served by the optimization metaheuristics discussed so far in the course. We use this as an opportunity to introduce Swarm Intelligence and the Particle Swarm Optimization (PSO) algorithm, which is particularly good at finding and exploring local optima in spaces with many similarly performing local optima. We explore how PSO was inspired by Craig Reynolds's Boids model (from computer graphics) and how it overlaps with the Vicsek model (from statistical physics). We also show how what PSO really depends on is social information but, under the influence of that social information, it tends to very quickly purge the diversity in its solution candidates. Online interactive demonstration modules associated with this lecture can be found at:
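To make the PT idea concrete, here is a minimal sketch (not the lecture's implementation; the function name, parameter names, and values are our own illustrative choices): several Metropolis–Hastings chains run on a fixed geometric ladder of temperatures, and adjacent chains periodically attempt to swap states using the standard replica-exchange acceptance rule.

```python
import math
import random

def parallel_tempering(energy, n_chains=4, n_steps=2000, step_size=0.5,
                       t_min=0.1, t_max=5.0, seed=0):
    """Sketch of Parallel Tempering for minimizing a 1-D energy function.

    Each chain performs Metropolis-Hastings moves at its own fixed
    temperature; every few steps, adjacent-temperature chains attempt
    to swap states so good solutions migrate toward cold chains.
    """
    rng = random.Random(seed)
    # Geometric temperature ladder from t_min (cold) to t_max (hot).
    temps = [t_min * (t_max / t_min) ** (i / (n_chains - 1))
             for i in range(n_chains)]
    xs = [rng.uniform(-10.0, 10.0) for _ in range(n_chains)]
    best_x, best_e = xs[0], energy(xs[0])
    for step in range(n_steps):
        # One Metropolis-Hastings move per chain at its own temperature.
        for i, t in enumerate(temps):
            prop = xs[i] + rng.gauss(0.0, step_size)
            d_e = energy(prop) - energy(xs[i])
            if d_e <= 0 or rng.random() < math.exp(-d_e / t):
                xs[i] = prop
            e = energy(xs[i])
            if e < best_e:
                best_x, best_e = xs[i], e
        # Periodically attempt replica swaps between adjacent temperatures,
        # accepting with probability min(1, exp((1/T_i - 1/T_j)(E_i - E_j))).
        if step % 10 == 0:
            for i in range(n_chains - 1):
                e_i, e_j = energy(xs[i]), energy(xs[i + 1])
                delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (e_i - e_j)
                if delta >= 0 or rng.random() < math.exp(delta):
                    xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return best_x, best_e
```

On a double-well energy such as `(x**2 - 4)**2`, the hot chains wander freely between the two basins at x = ±2 while the cold chain refines whichever basin it currently occupies, which is exactly the division of labor that an annealing schedule in SA can only approximate in sequence.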

Whiteboard notes for this lecture can be found at: https://www.dropbox.com/scl/fi/7jwuytadieywwilqazjq5/IEE598-Lecture5E_6A-2026-04-02-Parallel_Tempering_and_Particle_Swarm_Optimization-Notes.pdf?rlkey=p1pr7cs241okovkgjnevvhdp5&dl=0
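A minimal PSO sketch in the same spirit (again illustrative; the coefficient values below are common textbook defaults, not the lecture's): each particle's velocity blends inertia, a cognitive pull toward its own best-seen point, and a social pull toward the swarm's best-seen point. The social term is exactly the information channel that drives the rapid loss of candidate diversity discussed above.

```python
import random

def pso(f, dim=2, n_particles=20, n_iters=200,
        w=0.7, c1=1.5, c2=1.5, bound=5.0, seed=1):
    """Sketch of global-best Particle Swarm Optimization (minimization)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]           # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive (personal best) + social (global best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Watching `gbest` over the iterations makes the diversity-purging effect easy to see: once one particle reports a good point, the social term pulls every other particle toward it, so the swarm collapses onto a single basin far faster than it explores new ones.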


