Employing Supportive Coevolution for the Automated Design and Configuration of Evolutionary Algorithm Operators and Parameters
Type of Degree: Master's Thesis
Computer Science and Software Engineering
Evolutionary Algorithms (EAs) can be highly complex structures requiring many operators and parameters to function. Since EA performance can depend heavily on these parameters and operators, automatic methods for EA parameter and operator selection and design could provide significant performance improvements while reducing the manual configuration required to implement and configure an EA. EA design and configuration is further complicated by the fact that optimal parameter and operator settings can change throughout the run of an EA. Thus, a method that can also adapt parameters and operators on the fly during the evolutionary run could provide further performance improvements. The research reported in this thesis leverages a technique called Supportive Coevolution (SuCo) to automate the configuration and control of mutation step size, crossover operators, and local optimization operators. Several experiments focused on floating-point optimization are presented that show promising results using SuCo to configure and control both mutation step size and crossover operators. SuCo is then used to evolve local optimization operators for the local learning step of a Memetic Algorithm (MA), in an approach called SuCo-MA. SuCo-MA is then extended by employing a diffusion model to encourage the evolution of deme-specific local optimization strategies in SuCo-Dif-MA. This method is then applied to the Traveling Thief Problem (TTP) as a final empirical study of its effectiveness on modern, complex problems. Experimental design and empirical results are presented for all algorithm configurations. Comparisons are made for performance analysis, and several points of interest are discussed. Conclusions are presented based on the observations made during experimentation. Empirical results from benchmark testing showed that SuCo can improve performance compared to both static parameters and operators and self-adapted parameters.
SuCo-MA showed promising results on a variety of floating-point benchmark functions. It was also shown that SuCo-Dif-MA can generate competitive results compared to state-of-the-art techniques on the TTP.
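The core idea behind Supportive Coevolution can be illustrated with a minimal toy sketch: a support population of mutation step sizes coevolves alongside a primary population of candidate solutions, with each step size earning credit for the fitness improvements it produces. This is only an illustrative sketch under assumed details, not the thesis's actual implementation; the sphere objective, the credit-assignment scheme, and all population sizes here are assumptions.

```python
import random

# Toy sketch of Supportive Coevolution (SuCo) for mutation step size.
# A support population of step sizes coevolves with a primary population
# of candidate solutions; each step size is credited with the improvement
# of the offspring it helped create. All details here are illustrative
# assumptions, not taken from the thesis.

random.seed(0)

def sphere(x):
    """Assumed benchmark objective: minimize the sum of squares."""
    return sum(v * v for v in x)

DIM, PRIMARY_SIZE, SUPPORT_SIZE, GENS = 5, 20, 5, 50

# Primary population: floating-point solution vectors.
primary = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(PRIMARY_SIZE)]
# Support population: candidate mutation step sizes (sigmas).
support = [random.uniform(0.01, 1.0) for _ in range(SUPPORT_SIZE)]

initial_best = min(sphere(x) for x in primary)

for gen in range(GENS):
    credit = [0.0] * SUPPORT_SIZE
    new_primary = []
    for parent in primary:
        idx = random.randrange(SUPPORT_SIZE)      # pick a supporting step size
        sigma = support[idx]
        child = [v + random.gauss(0, sigma) for v in parent]
        if sphere(child) < sphere(parent):        # greedy survivor selection
            new_primary.append(child)
            credit[idx] += sphere(parent) - sphere(child)  # reward improvement
        else:
            new_primary.append(parent)
    primary = new_primary
    # Evolve the support population: the least useful step size is replaced
    # by a perturbed copy of the most useful one.
    worst = min(range(SUPPORT_SIZE), key=lambda i: credit[i])
    best = max(range(SUPPORT_SIZE), key=lambda i: credit[i])
    support[worst] = max(1e-6, support[best] * random.uniform(0.5, 1.5))

best_fit = min(sphere(x) for x in primary)
```

Because offspring replace parents only when they improve, the best fitness in the primary population is monotonically non-increasing, while the support population drifts toward step sizes that are currently productive, mirroring SuCo's ability to adapt parameters during the run.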