Covariance Matrix Adaptation Evolution Strategy (CMAES)

class pypop7.optimizers.es.cmaes.CMAES(problem, options)

Covariance Matrix Adaptation Evolution Strategy (CMAES).

Note

CMAES is widely recognized as one of the state-of-the-art (SOTA) optimizers for black-box optimization, according to the latest Nature review of Evolutionary Computation.

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensions (int),

    • 'upper_boundary' - upper boundary of search range (array_like),

    • 'lower_boundary' - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),

    • 'max_runtime' - maximal runtime allowed (float, default: np.Inf),

    • 'seed_rng' - seed for random number generation, which must be explicitly set (int);

    and with the following particular settings (keys):
    • 'sigma' - initial global step-size, aka mutation strength (float),

    • 'mean' - initial (starting) point, aka mean of Gaussian search distribution (array_like),

      • if not given, it will be sampled uniformly at random from the search range bounded by problem['lower_boundary'] and problem['upper_boundary'],

    • 'n_individuals' - number of offspring, aka offspring population size (int, default: 4 + int(3*np.log(problem['ndim_problem']))),

    • 'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/2)); see the sketch below for how these defaults are computed.
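To make these defaults concrete, the following sketch reproduces the default population sizes and the uniform sampling of the starting mean for a hypothetical 10-dimensional problem with search range [-5, 5] in each dimension (the dimension, range, and seed are chosen only for illustration; the actual sampling is performed internally by the optimizer when 'mean' is omitted):

>>> import numpy as np
>>> ndim_problem = 10  # hypothetical dimension, chosen only for illustration
>>> n_individuals = 4 + int(3*np.log(ndim_problem))  # default offspring population size
>>> n_parents = int(n_individuals/2)  # default parental population size
>>> print(n_individuals, n_parents)
10 5
>>> # if 'mean' is not given, a starting point is drawn uniformly at random
>>> # from the range bounded by 'lower_boundary' and 'upper_boundary'
>>> rng = np.random.default_rng(2022)  # seed chosen only for illustration
>>> mean = rng.uniform(-5*np.ones((ndim_problem,)), 5*np.ones((ndim_problem,)))
>>> print(mean.shape)
(10,)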

Examples

Use the optimizer to minimize the well-known Rosenbrock test function:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.cmaes import CMAES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'is_restart': False,
...            'mean': 3*numpy.ones((2,)),
...            'sigma': 0.1}  # the global step-size may need to be tuned for better performance
>>> cmaes = CMAES(problem, options)  # initialize the optimizer class
>>> results = cmaes.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"CMAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
CMAES: 5000, 9.11305771685916e-09

To check the correctness of this implementation, refer to the code-based repeatability report for more details.
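Beyond the evaluation count and best-so-far fitness printed above, the returned results dictionary also carries the best-found solution itself; the lines below assume the key 'best_so_far_x' (an assumption based on the common PyPop7 results layout, not shown in this example's output):

>>> best_x = results['best_so_far_x']  # best-so-far solution (assumed result key)
>>> print(best_x.shape)  # expected: (2,) for this 2-dimensional Rosenbrock problem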

mean

initial (starting) point, aka mean of Gaussian search distribution.

Type: array_like

n_individuals

number of offspring, aka offspring population size / sample size.

Type: int

n_parents

number of parents, aka parental population size / number of positively selected search points.

Type: int

sigma

final global step-size, aka mutation strength.

Type: float
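To illustrate how n_parents relates to n_individuals as the "number of positively selected search points", the minimal sketch below performs plain truncation selection: the best n_parents of the n_individuals sampled offspring are kept as parents for recombination (a simplified illustration with a hypothetical helper, not PyPop7's internal code):

>>> import numpy as np
>>> def truncation_selection(fitness, n_parents):  # hypothetical helper, for illustration only
...     # return indices of the n_parents best (lowest-fitness) offspring
...     return np.argsort(fitness)[:n_parents]
>>> fitness = np.array([3.2, 0.7, 5.1, 1.4, 2.6, 0.9])  # fitness of n_individuals = 6 offspring
>>> print(truncation_selection(fitness, n_parents=3))  # the 3 best offspring become parents
[1 5 3]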

References

Hansen, N., 2023. The CMA evolution strategy: A tutorial. arXiv preprint arXiv:1604.00772. https://arxiv.org/abs/1604.00772

Hansen, N., Müller, S.D. and Koumoutsakos, P., 2003. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation, 11(1), pp.1-18. https://direct.mit.edu/evco/article-abstract/11/1/1/1139/Reducing-the-Time-Complexity-of-the-Derandomized

Hansen, N. and Ostermeier, A., 2001. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2), pp.159-195. https://direct.mit.edu/evco/article-abstract/9/2/159/892/Completely-Derandomized-Self-Adaptation-in

Hansen, N. and Ostermeier, A., 1996, May. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of IEEE International Conference on Evolutionary Computation (pp. 312-317). IEEE. https://ieeexplore.ieee.org/abstract/document/542381

See the lightweight implementation of CMA-ES from cyberagent.ai: https://github.com/CyberAgentAILab/cmaes