Novel Approaches to Creating Robust Globally Convergent Algorithms for Numerical Optimization
Type of Degree: Thesis
Two optimization algorithms are presented, each seeking to combine the desirable characteristics of gradient descent and evolutionary computation into a single robust method. The first, termed Quasi-Gradient Directed Migration (QGDM), approximates the gradient using the directed migration of a bounded set of randomly distributed points, and progresses by iteratively updating the population's center location and radius. The second method, similar in spirit, takes this concept a step further with a "variable scale gradient approximation" (VSGA), which allows it to recognize surface behavior at both large and small scales. By adjusting the population radius between iterations, the algorithm is able to escape local minima by shifting its focus onto global trends rather than local behavior. Both algorithms are compared experimentally with existing methods and are shown to be competitive, if not superior, in each of the tested cases.
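The core idea described above — estimating a descent direction from a bounded population of random points around a center, then moving the center and shrinking the radius — can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the thesis's actual QGDM or VSGA implementation: the weighting scheme, step size, and radius-contraction factor here are illustrative choices only.

```python
import numpy as np

def quasi_gradient_step(f, center, radius, n_points=20, rng=None):
    """One illustrative iteration of a quasi-gradient population step.

    NOTE: this is a hypothetical sketch of the general idea, not the
    algorithm from the thesis. Samples a bounded set of random points
    around `center`, estimates a descent direction from their function
    values, then moves the center and contracts the radius.
    """
    rng = np.random.default_rng(rng)
    # Bounded set of randomly distributed points around the center.
    offsets = rng.uniform(-radius, radius, size=(n_points, center.size))
    points = center + offsets
    values = np.array([f(p) for p in points])
    # Quasi-gradient: weight each offset by its change in function value,
    # so directions of increase push away and directions of decrease pull in.
    weights = values - f(center)
    quasi_grad = (weights[:, None] * offsets).sum(axis=0)
    # Step along the negative quasi-gradient, normalized to the radius,
    # then shrink the radius (contraction factor is an illustrative choice).
    step = quasi_grad / (np.linalg.norm(quasi_grad) + 1e-12)
    return center - radius * step, 0.9 * radius

# Usage: minimize a simple quadratic from a distant starting point.
f = lambda x: float(np.sum(x ** 2))
c, r = np.array([3.0, -2.0]), 1.0
for _ in range(100):
    c, r = quasi_gradient_step(f, c, r, rng=0)
```

The radius contraction hints at the scale-adjustment idea behind VSGA: a large radius averages over local noise and tracks global trends, while a small radius resolves fine local behavior near a minimum.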