Self-Adaptation Evolution Strategy (SAES)
- class pypop7.optimizers.es.saes.SAES(problem, options)
Self-Adaptation Evolution Strategy (SAES).
Note
SAES adapts only the global step-size on the fly with a relatively small population, which often results in slow (and even premature) convergence for large-scale black-box optimization (LBO), especially on ill-conditioned fitness landscapes. Therefore, it is recommended to first try more advanced ES variants (e.g., LMCMA, LMMAES) for LBO. SAES is included here mainly for benchmarking and theoretical purposes.
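To make the self-adaptation mechanism concrete, below is a minimal sketch of one generation of a (mu/mu_I, lambda) ES with log-normal step-size self-adaptation; the function name saes_generation and its signature are illustrative only and not part of the pypop7 API:

import numpy as np

def saes_generation(fitness, mean, sigma, n_parents, n_individuals, lr_sigma, rng):
    """One generation of a (mu/mu_I, lambda) sigma-self-adaptation ES (sketch only)."""
    ndim = mean.size
    # each offspring first mutates its own copy of the global step-size (log-normal rule)
    sigmas = sigma*np.exp(lr_sigma*rng.standard_normal(n_individuals))
    # then samples a search point with its individual step-size (isotropic Gaussian mutation)
    x = mean + sigmas[:, np.newaxis]*rng.standard_normal((n_individuals, ndim))
    y = np.array([fitness(xi) for xi in x])
    order = np.argsort(y)[:n_parents]  # select the n_parents best offspring (minimization)
    # intermediate recombination of both the selected search points and their step-sizes
    return np.mean(x[order], axis=0), np.mean(sigmas[order])

rng = np.random.default_rng(2022)
mean, sigma = 3.0*np.ones((2,)), 3.0
for _ in range(500):  # e.g., minimize the 2-dimensional sphere function
    mean, sigma = saes_generation(lambda x: np.sum(x**2), mean, sigma, 3, 6, 0.5, rng)
print(mean, sigma)  # mean approaches the optimum while sigma shrinks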
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions (int),
'upper_boundary' - upper boundary of search range (array_like),
'lower_boundary' - lower boundary of search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum of function evaluations (int, default: np.inf),
'max_runtime' - maximal runtime allowed (float, default: np.inf),
'seed_rng' - seed for random number generation, which needs to be set explicitly (int);
- and with the following particular settings (keys):
'sigma' - initial global step-size, aka mutation strength (float),
'mean' - initial (starting) point, aka mean of Gaussian search distribution (array_like),
if not given, it will be sampled uniformly at random between problem['lower_boundary'] and problem['upper_boundary'],
'n_individuals' - number of offspring, aka offspring population size (int, default: 4 + int(3*np.log(problem['ndim_problem']))),
'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/2)),
'lr_sigma' - learning rate of global step-size adaptation (float, default: 1.0/np.sqrt(2*problem['ndim_problem'])).
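For instance, with the 2-dimensional problem used in the example below, these defaults evaluate to:

>>> import numpy as np
>>> 4 + int(3*np.log(2))  # default n_individuals
6
>>> int(6/2)  # default n_parents
3
>>> print(1.0/np.sqrt(2*2))  # default lr_sigma
0.5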
Examples
Use the black-box optimizer SAES to minimize the well-known Rosenbrock test function:
>>> import numpy  # engine for numerical computing
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.saes import SAES
>>> problem = {'fitness_function': rosenbrock,  # to define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5.0*numpy.ones((2,)),
...            'upper_boundary': 5.0*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # to set optimizer options
...            'seed_rng': 2022,
...            'mean': 3.0*numpy.ones((2,)),
...            'sigma': 3.0}  # global step-size may need to be tuned
>>> saes = SAES(problem, options)  # to initialize the optimizer class
>>> results = saes.optimize()  # to run the optimization/evolution process
>>> # to return the number of function evaluations and the best-so-far fitness
>>> print(f"SAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
SAES: 5000, 0.012622712890954227
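Besides the two entries used above, the returned results dict also carries the best-so-far solution itself (see the attributes documented below):

>>> print(results['best_so_far_x'])  # best solution found, close to rosenbrock's optimum at [1.0, 1.0]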
To check the correctness of its implementation, refer to this code-based repeatability report for more details.
- best_so_far_x
final best-so-far solution found during entire optimization.
- Type:
array_like
- best_so_far_y
final best-so-far fitness found during entire optimization.
- Type:
float
- lr_sigma
learning rate of global step-size adaptation.
- Type:
float
- mean
initial (starting) point, aka mean of Gaussian search distribution.
- Type:
array_like
- n_individuals
number of offspring, aka offspring population size.
- Type:
int
- n_parents
number of parents, aka parental population size.
- Type:
int
- sigma
final global step-size, aka mutation strength (changed during optimization).
- Type:
float
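These attributes can be read directly from the optimizer instance after optimize() returns, e.g. (continuing the example above; final values depend on the run):

>>> print(saes.n_individuals, saes.n_parents)  # 6 and 3 under the defaults for ndim_problem=2
>>> print(saes.lr_sigma)  # 0.5 under the default 1.0/np.sqrt(2*ndim_problem)
>>> print(saes.sigma, saes.best_so_far_y)  # final (adapted) step-size and best fitness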
References
Beyer, H.-G., 2020, July. Design principles for matrix adaptation evolution strategies. In Proceedings of the Genetic and Evolutionary Computation Conference Companion (pp. 682-700). ACM.
http://www.scholarpedia.org/article/Evolution_strategies
See the official Matlab/Octave version from Prof. Beyer: https://homepages.fhv.at/hgb/downloads/mu_mu_I_lambda-ES.oct