# Exponential Natural Evolution Strategies (XNES)

class pypop7.optimizers.nes.xnes.XNES(problem, options)

Exponential Natural Evolution Strategies (XNES).

Parameters:
• problem (dict) –

problem arguments with the following common settings (keys):
• 'fitness_function' - objective function to be minimized (func),

• 'ndim_problem' - dimensionality of the problem (int),

• 'upper_boundary' - upper boundary of the search range (array_like),

• 'lower_boundary' - lower boundary of the search range (array_like).

• options (dict) –

optimizer options with the following common settings (keys):
• 'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),

• 'max_runtime' - maximum runtime allowed (float, default: np.Inf),

• 'seed_rng' - seed for random number generation, which needs to be explicitly set (int);

and with the following particular settings (keys):
• 'n_individuals' - number of offspring/descendants, aka offspring population size (int),

• 'n_parents' - number of parents/ancestors, aka parental population size (int),

• 'mean' - initial (starting) point (array_like),

  • if not given, it will be randomly sampled from a uniform distribution over the search range bounded by problem['lower_boundary'] and problem['upper_boundary'].

• 'sigma' - initial global step-size, aka mutation strength (float).

Examples

Use the optimizer to minimize the well-known Rosenbrock test function:

```
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.nes.xnes import XNES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,)),
...            'sigma': 0.1}  # the global step-size may need to be tuned for better performance
>>> xnes = XNES(problem, options)  # initialize the optimizer class
>>> results = xnes.optimize()  # run the optimization process
>>> # print the number of function evaluations and the best-so-far fitness
>>> print(f"XNES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
XNES: 5000, 1.3565728021697798e-18
```
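
Beyond the two fields printed above, the results dictionary returned by optimize() also contains the best-so-far solution itself; in current PyPop7 versions it is stored under the key 'best_so_far_x' (an assumption worth checking against your installed version):

```
>>> best_x = results['best_so_far_x']  # best-so-far solution, here an array of shape (2,)
```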
lr_cv

learning rate of covariance matrix adaptation.

Type: float

lr_sigma

learning rate of global step-size adaptation.

Type: float

mean

initial (starting) point, aka mean of Gaussian search/sampling/mutation distribution.

Type: array_like

n_individuals

number of offspring/descendants, aka offspring population size.

Type: int

n_parents

number of parents/ancestors, aka parental population size.

Type: int

sigma

global step-size, aka mutation strength (i.e., overall std of Gaussian search distribution).

Type: float
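
The roles of mean, sigma, and the two learning rates above can be made concrete with a minimal, self-contained sketch of one xNES generation, following the update rules of Glasmachers et al. (2010) (see References). This is an illustration in plain NumPy/SciPy under commonly used defaults (log-rank utilities and the paper's learning-rate schedule, which correspond to lr_sigma and lr_cv), not PyPop7's internal implementation, and all names in it are hypothetical:

```
import numpy as np
from scipy.linalg import expm


def xnes_generation(f, mean, sigma, b, rng, n_individuals):
    """One xNES generation (hypothetical sketch): sample offspring,
    rank them, and update (mean, sigma, b) via natural gradients."""
    d = mean.size
    # learning rates: the mean uses 1; the shared default below plays
    # the role of the lr_sigma/lr_cv attributes documented above
    eta_mean = 1.0
    eta_sigma = eta_b = (9.0 + 3.0*np.log(d))/(5.0*d*np.sqrt(d))
    # sample offspring in natural coordinates: x = mean + sigma*B*z
    z = rng.standard_normal((n_individuals, d))
    x = mean + sigma*(z @ b.T)
    # rank-based utilities (fitness shaping) summing to zero; for
    # minimization, the lowest fitness receives the largest utility
    ranks = np.argsort([f(xi) for xi in x])
    u = np.maximum(0.0, np.log(n_individuals/2.0 + 1.0)
                   - np.log(np.arange(1, n_individuals + 1)))
    u = u/np.sum(u) - 1.0/n_individuals
    utilities = np.empty(n_individuals)
    utilities[ranks] = u
    # natural gradients w.r.t. mean, global step-size, and shape
    g_delta = utilities @ z
    g_m = sum(ui*(np.outer(zi, zi) - np.eye(d))
              for ui, zi in zip(utilities, z))
    g_sigma = np.trace(g_m)/d
    g_b = g_m - g_sigma*np.eye(d)
    # multiplicative updates via the (matrix) exponential map, which
    # keeps the induced covariance (sigma**2)*b@b.T positive definite
    mean = mean + eta_mean*sigma*(b @ g_delta)
    sigma = sigma*np.exp(0.5*eta_sigma*g_sigma)
    b = b @ expm(0.5*eta_b*g_b)
    return mean, sigma, b
```

Iterating this generation loop on a convex quadratic, for example with mean=3*np.ones(2), sigma=0.1, b=np.eye(2), and f=lambda x: float(np.sum(x**2)), drives sigma toward zero as the search distribution contracts around the optimum; PyPop7 exposes the corresponding learning rates as the lr_sigma and lr_cv attributes.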

References

Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J. and Schmidhuber, J., 2014. Natural evolution strategies. Journal of Machine Learning Research, 15(1), pp.949-980. https://jmlr.org/papers/v15/wierstra14a.html

Schaul, T., 2011. Studies in continuous black-box optimization. Doctoral Dissertation, Technische Universität München. https://people.idsia.ch/~schaul/publications/thesis.pdf

Glasmachers, T., Schaul, T., Yi, S., Wierstra, D. and Schmidhuber, J., 2010, July. Exponential natural evolution strategies. In Proceedings of the Annual Conference on Genetic and Evolutionary Computation (pp. 393-400). https://dl.acm.org/doi/abs/10.1145/1830483.1830557

See the official Python source code from PyBrain: https://github.com/pybrain/pybrain/blob/master/pybrain/optimization/distributionbased/xnes.py