In this lecture, we complete our coverage of the Genetic Algorithm (GA) by discussing how to improve selection operators and, more generally, how to tune hyperparameters to improve GA performance on a given problem. We start with a discussion of the effect of Stochastic Universal Sampling (SUS) relative to roulette-wheel selection: by sharply reducing the variance in the number of times each individual is chosen as a parent, SUS eliminates the fixation-causing effects of drift while still leaving a barrier on selection precision in place. We then discuss how exponential ranking in rank-based selection gives finer control over selective pressure, although tournament selection is ultimately the computationally stronger choice when rank-based selection is desired. We present a framework that places the five major hyperparameters (M, R, E, Pm, and Pc, as well as selection pressure) on one graph to help guide hyperparameter choices in different contexts. We draw connections between the two families of selection operators (fitness proportionate and rank based) and Generalized Linear Modeling (GLM; continuous and ordinal response variables, respectively), and between the number of parents and the number of samples, i.e., statistical power, in a GLM. Finally, we close with a brief introduction to Evolution Strategies (ES), which will be the topic that opens the next unit.
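As a concrete sketch of the SUS idea discussed above (my own illustration, not taken from the lecture notes): instead of spinning the roulette wheel once per parent, SUS makes a single spin and then reads off parents at evenly spaced pointers, so each individual's selection count can differ from its expected value by less than one.

```python
import random

def sus_select(fitnesses, n_parents):
    """Stochastic Universal Sampling.

    One random spin places n_parents evenly spaced pointers around the
    fitness-proportionate "wheel". Compared with n_parents independent
    roulette-wheel spins, this collapses the variance in the number of
    offspring per individual, removing most selection-stage drift.
    """
    total = sum(fitnesses)
    step = total / n_parents
    start = random.uniform(0, step)          # the single random spin
    pointers = [start + i * step for i in range(n_parents)]

    selected = []
    cumulative, i = 0.0, 0                   # cumulative fitness up to index i
    for p in pointers:                       # pointers are in ascending order
        while cumulative + fitnesses[i] < p:
            cumulative += fitnesses[i]
            i += 1
        selected.append(i)
    return selected
```

With fitnesses [1, 2, 3, 4] and four parents, the expected selection counts are 0.4, 0.8, 1.2, and 1.6, and SUS guarantees each realized count is the floor or ceiling of its expectation, which a sequence of independent roulette spins does not.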
Whiteboard notes for this lecture can be found at:
https://www.dropbox.com/scl/fi/8z1we6jycmealo2ww2ik0/IEE598-Lecture1H-2026-02-05-GA_Hyperparameter_Tuning-Notes.pdf?rlkey=gosm6672x8v9c66zdf6ho09mb&dl=0
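To illustrate why tournament selection is the computationally cheaper route to rank-based selective pressure (again, my own sketch rather than material from the notes): it needs no sorting and no explicit selection probabilities, and pressure is tuned with a single parameter, the tournament size k.

```python
import random

def tournament_select(fitnesses, n_parents, k=2):
    """Tournament selection.

    For each parent, draw k individuals uniformly at random and keep the
    fittest. Only relative fitness order matters (like rank-based
    selection), but no global sort or probability table is required;
    larger k means stronger selective pressure.
    """
    n = len(fitnesses)
    parents = []
    for _ in range(n_parents):
        contestants = random.sample(range(n), k)          # k distinct entrants
        parents.append(max(contestants, key=lambda i: fitnesses[i]))
    return parents
```

At the extremes, k = 1 is uniform random selection (no pressure), while k equal to the population size deterministically returns the single best individual every time.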