Separable Natural Evolution Strategies (SNES)
- class pypop7.optimizers.nes.snes.SNES(problem, options)
Separable Natural Evolution Strategies (SNES).
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions (int),
'upper_boundary' - upper boundary of the search range (array_like),
'lower_boundary' - lower boundary of the search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),
'max_runtime' - maximum allowed runtime (float, default: np.Inf),
'seed_rng' - seed for random number generation, which needs to be set explicitly (int);
- and with the following particular settings (keys):
'n_individuals' - number of offspring/descendants, aka offspring population size (int); a commonly used default heuristic is sketched after this list,
'n_parents' - number of parents/ancestors, aka parental population size (int),
'mean' - initial (starting) point (array_like),
if not given, it will be sampled uniformly at random from the search range bounded by problem['lower_boundary'] and problem['upper_boundary'].
'sigma' - initial global step-size, aka mutation strength (float).
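If 'n_individuals' is not given, NES-family methods commonly default to the population-size heuristic 4 + floor(3*ln(d)) for a d-dimensional problem (Wierstra et al., 2014). Below is a minimal sketch of that heuristic; whether PyPop7 uses exactly this default is an assumption to verify against its source code.

import numpy as np

def default_n_individuals(ndim_problem):
    # common NES offspring-population-size heuristic (Wierstra et al., 2014);
    # assumed, not guaranteed, to match PyPop7's internal default
    return 4 + int(3*np.log(ndim_problem))

default_n_individuals(2)     # -> 6 for the 2-dimensional Rosenbrock example below
default_n_individuals(1000)  # -> 24, growing only logarithmically with dimension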
Examples
Use the optimizer to minimize the well-known test function Rosenbrock:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.nes.snes import SNES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,)),
...            'sigma': 0.1}  # the global step-size may need to be tuned for better performance
>>> snes = SNES(problem, options)  # initialize the optimizer class
>>> results = snes.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"SNES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
SNES: 5000, 0.49730042657448875
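As the example's comment notes, the initial global step-size 'sigma' often needs tuning. One simple approach, sketched below under the same problem setup, is to try a small grid of step-sizes and keep the best run (the grid values are illustrative, not defaults recommended by PyPop7):

import numpy
from pypop7.benchmarks.base_functions import rosenbrock
from pypop7.optimizers.nes.snes import SNES

problem = {'fitness_function': rosenbrock, 'ndim_problem': 2,
           'lower_boundary': -5*numpy.ones((2,)),
           'upper_boundary': 5*numpy.ones((2,))}
best = None
for sigma in (0.01, 0.1, 1.0):  # illustrative grid of initial step-sizes
    options = {'max_function_evaluations': 5000, 'seed_rng': 2022,
               'mean': 3*numpy.ones((2,)), 'sigma': sigma}
    results = SNES(problem, options).optimize()
    if best is None or results['best_so_far_y'] < best['best_so_far_y']:
        best = results
print(f"best fitness over the grid: {best['best_so_far_y']}")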
- lr_cv
learning rate of (diagonal) covariance matrix adaptation.
- Type:
float
- mean
initial (starting) point, aka mean of Gaussian search/sampling/mutation distribution.
- Type:
array_like
- n_individuals
number of offspring/descendants, aka offspring population size.
- Type:
int
- n_parents
number of parents/ancestors, aka parental population size.
- Type:
int
- sigma
global step-size, aka mutation strength (i.e., overall std of Gaussian search distribution).
- Type:
float
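To make the roles of mean, sigma, n_individuals, and the learning rates concrete, the following is a minimal didactic re-implementation of the separable NES update (rank-based fitness shaping plus a per-dimension step-size update) following Schaul et al. (2011). It is a sketch under standard NES defaults, not PyPop7's internal code:

import numpy as np

def snes_minimal(f, mean, sigma, n_iterations=300, seed=2022):
    # didactic separable NES loop (a sketch, not PyPop7's implementation)
    rng = np.random.default_rng(seed)
    d = mean.size
    lam = 4 + int(3*np.log(d))                      # common offspring population size
    eta_mean = 1.0                                  # learning rate for the mean
    eta_sigma = (3.0 + np.log(d))/(5.0*np.sqrt(d))  # learning rate for per-dimension sigma
    ranks = np.arange(1, lam + 1)                   # rank-based utilities (fitness shaping)
    u = np.maximum(0.0, np.log(lam/2.0 + 1.0) - np.log(ranks))
    u = u/np.sum(u) - 1.0/lam                       # utilities sum to zero; best rank weighted most
    for _ in range(n_iterations):
        s = rng.standard_normal((lam, d))           # noise in natural coordinates
        z = mean + sigma*s                          # candidate solutions
        s = s[np.argsort([f(zi) for zi in z])]      # reorder noise by fitness, best first
        mean = mean + eta_mean*sigma*(u @ s)        # natural-gradient step for the mean
        sigma = sigma*np.exp(0.5*eta_sigma*(u @ (s**2 - 1.0)))  # multiplicative sigma update
    return mean, sigma

Note how sigma becomes a per-dimension vector after the first update: restricting the Gaussian to a diagonal covariance is what lets SNES scale linearly with dimensionality, at the cost of ignoring correlations between variables.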
References
Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J. and Schmidhuber, J., 2014. Natural evolution strategies. Journal of Machine Learning Research, 15(1), pp.949-980. https://jmlr.org/papers/v15/wierstra14a.html
Schaul, T., 2011. Studies in continuous black-box optimization. Doctoral Dissertation, Technische Universität München. https://people.idsia.ch/~schaul/publications/thesis.pdf
Schaul, T., Glasmachers, T. and Schmidhuber, J., 2011, July. High dimensions and heavy tails for natural evolution strategies. In Proceedings of the Annual Conference on Genetic and Evolutionary Computation (pp. 845-852). ACM. https://dl.acm.org/doi/abs/10.1145/2001576.2001692
See the official Python source code from PyBrain: https://github.com/pybrain/pybrain/blob/master/pybrain/optimization/distributionbased/snes.py