Exact Natural Evolution Strategy (ENES)

class pypop7.optimizers.nes.enes.ENES(problem, options)

Exact Natural Evolution Strategy (ENES).

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensions of the problem (int),

    • 'upper_boundary' - upper boundary of the search range (array_like),

    • 'lower_boundary' - lower boundary of the search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum number of function evaluations (int, default: np.inf),

    • 'max_runtime' - maximum runtime allowed (float, default: np.inf),

    • 'seed_rng' - seed for random number generation, which needs to be set explicitly (int);

    and with the following particular settings (keys):
    • 'n_individuals' - number of offspring/descendants, aka offspring population size (int),

    • 'n_parents' - number of parents/ancestors, aka parental population size (int),

    • 'mean' - initial (starting) point (array_like),

      • If not given, it will be drawn as a random sample from the uniform distribution bounded by problem['lower_boundary'] and problem['upper_boundary'].

    • 'lr_mean' - learning rate of distribution mean update (float, default: 1.0),

    • 'lr_sigma' - learning rate of global step-size adaptation (float, default: 1.0); a conceptual sketch of where both learning rates enter the update follows this list.
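
The two learning rates scale the natural-gradient updates of the distribution mean and of the global step-size. Below is a minimal, hypothetical sketch of one generic NES iteration for an isotropic Gaussian search distribution, written only to illustrate where lr_mean and lr_sigma act; it is not ENES's actual update, which inverts the exact Fisher information matrix (see References):

import numpy as np

def nes_iteration(f, mean, sigma, n_individuals, lr_mean, lr_sigma, rng):
    # Sample offspring from the isotropic Gaussian search distribution.
    z = rng.standard_normal((n_individuals, mean.size))
    x = mean + sigma*z
    y = np.array([f(xi) for xi in x])
    # Rank-based utility weights (an illustrative choice, not ENES's utilities):
    # the best sample gets weight +1/n, the worst -1/n.
    u = np.zeros(n_individuals)
    u[np.argsort(y)] = np.linspace(1.0, -1.0, n_individuals)/n_individuals
    # Natural-gradient updates of mean and global step-size,
    # scaled by lr_mean and lr_sigma respectively.
    mean = mean + lr_mean*sigma*np.dot(u, z)
    grad_log_sigma = np.dot(u, np.sum(z**2 - 1.0, axis=1))/mean.size
    sigma = sigma*np.exp(0.5*lr_sigma*grad_log_sigma)
    return mean, sigma

Real NES variants differ in their utility functions and in how the Fisher matrix is handled; this sketch works in the natural coordinates z, where the Fisher matrix simplifies to the identity.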

Examples

Use the optimizer ENES to minimize the well-known Rosenbrock test function:

>>> import numpy  # engine for numerical computing
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.nes.enes import ENES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,))}
>>> enes = ENES(problem, options)  # initialize the optimizer class
>>> results = enes.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"ENES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
ENES: 5000, 0.00035668252927080496
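
The particular settings documented above go into the same options dict. The values below are illustrative placeholders rather than tuned recommendations:

>>> options = {'max_function_evaluations': 5000,
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,)),
...            'n_individuals': 10,  # offspring population size (illustrative value)
...            'n_parents': 5,       # parental population size (illustrative value)
...            'lr_mean': 1.0,       # default learning rate of mean update
...            'lr_sigma': 1.0}      # default learning rate of step-size adaptation
>>> enes = ENES(problem, options)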
lr_mean

learning rate of distribution mean update (should be > 0.0).

Type:

float

lr_sigma

learning rate of global step-size adaptation (should be > 0.0).

Type:

float

mean

initial (starting) point, aka mean of the Gaussian search/sampling/mutation distribution. If not given, it is drawn by default as a random sample from the uniform distribution bounded by problem['lower_boundary'] and problem['upper_boundary'] (a sketch of this default follows the attributes below).

Type:

array_like

n_individuals

number of offspring/descendants, aka offspring population size (should be > 0).

Type:

int

n_parents

number of parents/ancestors, aka parental population size (should be > 0).

Type:

int
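
When mean is omitted, the documented default amounts to a single uniform draw inside the search box. A minimal sketch of that behavior, assuming numpy's Generator API (not the library's exact code; the seed and bounds are illustrative):

import numpy as np

rng = np.random.default_rng(2022)  # illustrative seed
lower = -5*np.ones((2,))           # problem['lower_boundary']
upper = 5*np.ones((2,))            # problem['upper_boundary']
mean = rng.uniform(lower, upper)   # default starting point when 'mean' is not given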

References

Wierstra, D., Schaul, T., Glasmachers, T., Sun, Y., Peters, J. and Schmidhuber, J., 2014. Natural evolution strategies. Journal of Machine Learning Research, 15(1), pp.949-980.

Schaul, T., 2011. Studies in continuous black-box optimization. Doctoral Dissertation, Technische Universität München.

Yi, S., Wierstra, D., Schaul, T. and Schmidhuber, J., 2009, June. Stochastic search using the natural gradient. In International Conference on Machine Learning (pp. 1161-1168). ACM.

Please refer to the official Python source code from PyBrain (no longer actively maintained): https://github.com/pybrain/pybrain/blob/master/pybrain/optimization/distributionbased/nes.py