In this lecture, we continue our discussion of computational/numerical methods inspired by statistical mechanics (i.e., physics). We start with a reminder of our ultimate goal: to develop simulated annealing as an optimization metaheuristic. That brings us back to entropy, where we ended last time, and lets us introduce maximum entropy (MaxEnt) methods. Maximum-entropy distributions arise in a wide variety of application areas (consider uniform, exponential, and normal distributions), and "MaxEnt methods" were popularized as a tool by computer scientists working in Natural Language Processing (NLP) and are now popular in archaeology and ecology (among other fields). Equipped with an understanding of entropy and maximum-entropy distributions, we move on to the Boltzmann distribution across microstates and how it relates to the exponential distribution (over energy). This lets us set up a likelihood ratio that will be at the core of the "Metropolis algorithm" for MCMC sampling to solve numerical integration problems (i.e., to solve high-dimensional "equations of state" in physical chemistry). We will pick up with MCMC and the Metropolis algorithm next time, then move through Metropolis–Hastings and eventually reach Simulated Annealing (SA).
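As a preview of where this is headed, here is a minimal sketch of how that Boltzmann likelihood ratio drives the Metropolis acceptance rule. The energy function, proposal, and temperature below are illustrative placeholders (not taken from the lecture), and this is a sketch rather than a full treatment of MCMC:

```python
import math
import random

def metropolis_step(x, energy, proposal, T=1.0):
    """One step of the Metropolis algorithm (illustrative sketch).

    A proposed move x -> x' is accepted with probability
    min(1, exp(-(E(x') - E(x)) / T)), which is exactly the
    Boltzmann likelihood ratio p(x') / p(x).
    """
    x_new = proposal(x)
    delta_E = energy(x_new) - energy(x)
    # Downhill moves (delta_E <= 0) are always accepted; uphill moves
    # are accepted with the Boltzmann probability exp(-delta_E / T).
    if delta_E <= 0 or random.random() < math.exp(-delta_E / T):
        return x_new
    return x

# Example: sample (approximately) from the Boltzmann distribution
# for a toy energy landscape E(x) = x^2 at temperature T = 1.
random.seed(0)
samples = []
x = 0.0
for _ in range(10_000):
    x = metropolis_step(x,
                        energy=lambda v: v * v,
                        proposal=lambda v: v + random.uniform(-1.0, 1.0))
    samples.append(x)
```

Note that only the *ratio* of probabilities appears, so the (often intractable) normalizing constant of the Boltzmann distribution is never needed; this is what makes the approach attractive for high-dimensional integration.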
Whiteboard notes for this lecture can be found at: https://www.dropbox.com/s/mrpmqd6msvvycml/IEE598-Lecture5B-2022-02-24-From_MaxEnt_Methods_to_MCMC_Sampling.pdf?dl=0