Limited-Memory Covariance Matrix Adaptation Evolution Strategy (LMCMAES)
- class pypop7.optimizers.es.lmcmaes.LMCMAES(problem, options)
Limited-Memory Covariance Matrix Adaptation Evolution Strategy (LMCMAES).
Note
For potentially better performance, please use its latest version, called LMCMA. LMCMAES is included here mainly for benchmarking purposes.
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions (int),
'upper_boundary' - upper boundary of search range (array_like),
'lower_boundary' - lower boundary of search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),
'max_runtime' - maximal runtime allowed (float, default: np.Inf),
'seed_rng' - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
'sigma' - initial global step-size, aka mutation strength (float),
'mean' - initial (starting) point, aka mean of Gaussian search distribution (array_like),
if not given, it will draw a random sample from the uniform distribution whose search range is bounded by problem['lower_boundary'] and problem['upper_boundary'],
'm' - number of direction vectors used by the limited-memory covariance model (int, default: 4 + int(3*np.log(problem['ndim_problem']))), see the reconstruction sketch after this list,
'n_steps' - target number of generations between vectors (int, default: options['m']),
'c_c' - learning rate for evolution path update (float, default: 1.0/options['m']),
'c_1' - learning rate for covariance matrix adaptation (float, default: 1.0/(10.0*np.log(problem['ndim_problem'] + 1.0))),
'c_s' - learning rate for population success rule (float, default: 0.3),
'd_s' - delay rate for population success rule (float, default: 1.0),
'z_star' - target success rate for population success rule (float, default: 0.25), see the step-size sketch after the example below,
'n_individuals' - number of offspring, aka offspring population size (int, default: 4 + int(3*np.log(problem['ndim_problem']))),
'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/2)).
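The limited-memory idea behind 'm' and 'c_1' is that the Cholesky factor A of the covariance matrix is never stored explicitly: products A@z are rebuilt on the fly from m stored pairs of evolution paths p_j and their preimages v_j = A^{-1}p_j. Below is a minimal sketch of this reconstruction, following Algorithm 3 of Loshchilov (2014); the helper and its variable names are illustrative, not part of the pypop7 API.

>>> import numpy as np
>>>
>>> def limited_memory_az(z, p_vectors, v_vectors, c_1):
...     # Rebuild A @ z from m stored (p_j, v_j) pairs without forming the
...     # n-by-n Cholesky factor A; each stored pair contributes a rank-one
...     # update A <- a*A + b_j * outer(p_j, v_j) with a = sqrt(1 - c_1).
...     a = np.sqrt(1.0 - c_1)
...     x = np.copy(z)
...     for p_j, v_j in zip(p_vectors, v_vectors):  # oldest to newest pair
...         v_sq = np.dot(v_j, v_j)
...         b_j = (a/v_sq)*(np.sqrt(1.0 + c_1/(1.0 - c_1)*v_sq) - 1.0)
...         x = a*x + b_j*np.dot(v_j, x)*p_j
...     return x

This reduces the per-sample cost from O(n^2) for an explicit Cholesky factor to O(mn), which is what makes the algorithm suitable for large-scale optimization.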
Examples
Use the optimizer to minimize the well-known test function Rosenbrock:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.lmcmaes import LMCMAES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,)),
...            'sigma': 0.1}  # the global step-size may need to be tuned for better performance
>>> lmcmaes = LMCMAES(problem, options)  # initialize the optimizer class
>>> results = lmcmaes.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"LMCMAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
LMCMAES: 5000, 4.590739937885748e-16
To check the correctness (repeatability) of this implementation, refer to this code-based repeatability report for more details.
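The options 'c_s', 'd_s', and 'z_star' control the population success rule (PSR) used for step-size adaptation: the fitness values of the previous and current populations are ranked together, and sigma grows or shrinks depending on whether the current population outperforms the target success rate z_star. The sketch below illustrates one PSR update following Loshchilov (2014); the helper is hypothetical and does not reproduce the actual pypop7 internals.

>>> import numpy as np
>>>
>>> def psr_update(sigma, s, y_curr, y_prev, c_s=0.3, d_s=1.0, z_star=0.25):
...     # Rank the pooled fitness values of the current and previous
...     # populations (rank 0 = best, since we minimize).
...     n = len(y_curr)
...     ranks = np.argsort(np.argsort(np.concatenate((y_curr, y_prev))))
...     # Positive z_psr means the current population ranks better than the
...     # previous one by more than the target success rate z_star.
...     z_psr = np.sum(ranks[n:] - ranks[:n])/n**2 - z_star
...     s = (1.0 - c_s)*s + c_s*z_psr  # smoothed success indicator
...     sigma *= np.exp(s/d_s)         # grow/shrink the global step-size
...     return sigma, s

With the defaults above, sigma increases whenever the new population wins more than the target fraction z_star of the pairwise rank comparisons against its predecessor, and decreases otherwise.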
- c_c - learning rate for evolution path update (float).
- c_s - learning rate for population success rule (float).
- c_1 - learning rate for covariance matrix adaptation (float).
- d_s - delay rate for population success rule (float).
- m - number of direction vectors (int).
- mean - initial (starting) point, aka mean of Gaussian search distribution (array_like).
- n_individuals - number of offspring, aka offspring population size (int).
- n_parents - number of parents, aka parental population size (int).
- n_steps - target number of generations between vectors (int).
- sigma - initial global step-size, aka mutation strength (float), see the sampling sketch below.
- z_star - target success rate for population success rule (float).
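Putting mean, sigma, and the limited-memory covariance model together, each offspring is sampled as x = mean + sigma * A@z with z drawn from a standard Gaussian. The fragment below is a hedged sketch of this sampling step, reusing the illustrative limited_memory_az helper from the sketch above; before any direction vectors have been stored, the product reduces to z itself, i.e., an isotropic Gaussian.

>>> import numpy as np
>>> rng = np.random.default_rng(2022)
>>> ndim_problem, sigma = 2, 0.1
>>> mean = 3.0*np.ones(ndim_problem)
>>> c_1 = 1.0/(10.0*np.log(ndim_problem + 1.0))
>>> p_vectors, v_vectors = [], []  # direction-vector memory, still empty
>>> z = rng.standard_normal(ndim_problem)  # z ~ N(0, I)
>>> x = mean + sigma*limited_memory_az(z, p_vectors, v_vectors, c_1)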
References
Loshchilov, I., 2014, July. A computationally efficient limited memory CMA-ES for large scale optimization. In Proceedings of the Annual Conference on Genetic and Evolutionary Computation (pp. 397-404). ACM. https://dl.acm.org/doi/abs/10.1145/2576768.2598294
See the official C++ version from Loshchilov: https://sites.google.com/site/lmcmaeses/