Monday, March 2, 2020

Lecture 5C: From MCMC Sampling to Optimization by Simulated Annealing (2020-03-02)

In this lecture, we start with an outline of the Metropolis–Hastings (MH) Markov Chain Monte Carlo (MCMC) sampling algorithm and describe how instantiating it with the Boltzmann–Gibbs distribution (which turns it into the classical Metropolis algorithm) is equivalent to sampling from an abstract energy landscape in the "least biased" way given a kinetic energy budget.

We then discuss how the acceptance ratio allows this (and other) MCMC algorithms to be used in applications where the desired relative frequency of outcomes is known but the normalization constant usually needed for a probability density is cumbersome to calculate; because only a ratio of target densities appears in the acceptance test, the normalization constant cancels.

From there, we move from MH to Simulated Annealing (SA), which is a special case of the Metropolis algorithm equipped with an annealing schedule. SA sets a temperature and samples in the "least biased" (maximum entropy) way for that temperature before decreasing the temperature, in the hope of eventually finding the minimum of an optimization objective.
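As a rough sketch of the ideas above, the following Python code implements a Metropolis sampler for a Boltzmann–Gibbs target (note that the acceptance test uses only an energy difference, so no normalization constant is ever computed) and wraps it in a simple simulated-annealing loop. The toy objective, geometric cooling schedule, and all parameter values are illustrative assumptions, not anything prescribed in the lecture:

```python
import math
import random

def metropolis_step(x, energy, temperature, step_size=0.5, rng=random):
    """One Metropolis step: propose a symmetric move and accept with
    probability min(1, exp(-(E(x') - E(x)) / T)).  Only the *ratio* of
    Boltzmann weights appears, so the partition function (normalization
    constant) is never needed."""
    x_new = x + rng.uniform(-step_size, step_size)
    delta_e = energy(x_new) - energy(x)
    if delta_e <= 0 or rng.random() < math.exp(-delta_e / temperature):
        return x_new  # accept the proposal
    return x          # reject: stay at the current state

def simulated_annealing(energy, x0, t_initial=10.0, t_final=1e-3,
                        cooling=0.95, steps_per_temp=100, seed=0):
    """Simulated annealing: run the Metropolis sampler at a fixed
    temperature, then cool geometrically (an illustrative annealing
    schedule), hoping to settle into the global minimum of the
    energy (objective) landscape."""
    rng = random.Random(seed)
    x = best = x0
    t = t_initial
    while t > t_final:
        for _ in range(steps_per_temp):
            x = metropolis_step(x, energy, t, rng=rng)
            if energy(x) < energy(best):
                best = x
        t *= cooling  # decrease temperature per the schedule
    return best

# Hypothetical toy objective: a double well whose global minimum lies
# near x = -2, with a shallower local minimum near x = +2.
energy = lambda x: (x**2 - 4)**2 + x

# Start in the shallow well; high early temperatures let the sampler
# climb over the barrier before cooling freezes it into one basin.
best = simulated_annealing(energy, x0=2.0)
print(best, energy(best))
```

At high temperature the sampler accepts many uphill moves (maximum-entropy exploration for that energy budget); as the temperature falls, it behaves more and more like greedy descent.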

Whiteboard notes from this lecture can be found at: https://www.dropbox.com/s/hprzvgrshg755ge/IEE598-Lecture5C-Notes.pdf?dl=0
