

Section: New Results

Mixture model CMA-ES

Participants : François Fages, Nicolas Vasselin.

In [19], we report on our attempt to improve the CMA-ES global optimization algorithm based on two ideas: first, the use of Sobol quasi-random low-discrepancy numbers instead of pseudo-random numbers; second, the design of a mixture model extension of CMA-ES (MM-CMA-ES) which, instead of performing restarts with an important loss of information at each restart, evolves a dynamic set of multivariate normal distributions in parallel, using an EM clustering algorithm at each step to decide on population splittings and mergings. On the standard COCO benchmark for evaluating global stochastic optimization methods, the use of Sobol numbers shows a fairly uniform improvement of 30%, as already shown by Teytaud last year (O. Teytaud. Quasi-random numbers improve the CMA-ES on the BBOB testbed. Artificial Evolution (EA2015), 2015, Lyon, France. Springer Verlag, pp. 13). On the other hand, MM-CMA-ES shows no speed-up w.r.t. CMA-ES with the IPOP restart strategy, even on objective functions with many local minima such as the Rastrigin function. The reasons are the overhead in the number of objective function evaluations introduced by the MM strategy, and the very subtle effect of the adaptive step-size strategy of CMA-ES, which allows a single (large) normal distribution covering several local minima to escape from them.
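
The first idea can be illustrated with a minimal sketch, not taken from [19]: the pseudo-random sampling step of CMA-ES draws candidates from a multivariate normal distribution, and quasi-random sampling replaces the underlying uniform variates with Sobol points mapped through the inverse normal CDF. The function and parameter names below are illustrative assumptions.

```python
# Sketch: Sobol quasi-random sampling of a CMA-ES population.
# Assumes scipy >= 1.7 for scipy.stats.qmc.Sobol.
import numpy as np
from scipy.stats import norm
from scipy.stats.qmc import Sobol

def sample_population_sobol(mean, cov, sigma, popsize, sobol):
    """Sample `popsize` candidates from N(mean, sigma^2 * cov) using Sobol points."""
    u = sobol.random(popsize)        # low-discrepancy points in [0,1)^d
    z = norm.ppf(u)                  # map to standard normal quantiles
    A = np.linalg.cholesky(cov)      # covariance factor, C = A A^T
    return mean + sigma * (z @ A.T)  # candidates x_i = m + sigma * A z_i

d = 10
sobol = Sobol(d=d, scramble=True)    # scrambling avoids the degenerate all-zero first point
pop = sample_population_sobol(np.zeros(d), np.eye(d), 0.5, 16, sobol)
```

The Sobol engine is kept alive across generations so that successive populations continue the same low-discrepancy sequence; a power-of-two population size keeps the balance properties of the sequence.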
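For the second idea, the following hypothetical sketch shows one way the EM-based split/merge bookkeeping of a mixture-model strategy could be realized; the criterion (BIC over EM fits with one more or one fewer component) and all names are illustrative assumptions, not the mechanism of [19].

```python
# Sketch: decide whether to split or merge mixture components by refitting
# the pooled population with EM and comparing BIC scores.
import numpy as np
from sklearn.mixture import GaussianMixture

def adapt_num_components(samples, k):
    """Return the component count among {k-1, k, k+1} with the best (lowest) BIC."""
    candidates = [m for m in (k - 1, k, k + 1) if m >= 1]
    fits = [GaussianMixture(n_components=m, covariance_type='full',
                            random_state=0).fit(samples)
            for m in candidates]
    bics = [f.bic(samples) for f in fits]
    best = int(np.argmin(bics))
    return candidates[best], fits[best]

rng = np.random.default_rng(0)
# Two well-separated clusters: EM should prefer splitting a single component.
samples = np.vstack([rng.normal(-3, 1, (64, 2)), rng.normal(3, 1, (64, 2))])
k, gm = adapt_num_components(samples, 1)
print(k, gm.means_.round(1))
```

Each refit of the mixture consumes additional objective function evaluations when the new components are repopulated, which is consistent with the evaluation overhead observed above.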