Schwefel’s Self-Adaptation Evolution Strategy (SSAES)

class pypop7.optimizers.es.ssaes.SSAES(problem, options)

Schwefel’s Self-Adaptation Evolution Strategy (SSAES).

Note

SSAES adapts all individual step-sizes on-the-fly. It was proposed by Schwefel, a recipient of the IEEE Evolutionary Computation Pioneer Award 2002 and the IEEE Frank Rosenblatt Award 2011. Since it often needs a relatively large population (e.g., larger than the number of dimensions) for reliable self-adaptation, SSAES easily suffers from slow convergence on large-scale black-box optimization (LBO). Therefore, it is recommended to first attempt more advanced ES variants (e.g., LMCMA, LMMAES) for LBO. Here we include SSAES mainly for benchmarking and theoretical purposes. Currently the restart process is not implemented, owing to its typically slow convergence.
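
For intuition only, the following sketch illustrates the classical log-normal self-adaptation mutation rule on which SSAES is based (see Beyer and Schwefel, 2002, in the references below). It samples a single offspring; it is not pypop7's exact implementation, and all variable names here are chosen purely for exposition:

>>> import numpy as np  # a minimal sketch, not pypop7's internal code
>>> rng = np.random.default_rng(2022)
>>> ndim, mean, sigma = 2, 3.0*np.ones(2), 3.0  # dimensionality, starting point, global step-size
>>> axis_sigmas = np.ones(ndim)  # individual (per-axis) step-sizes
>>> lr_sigma, lr_axis_sigmas = 1.0/np.sqrt(ndim), 1.0/np.power(ndim, 1.0/4.0)  # documented defaults
>>> # each offspring first mutates its step-sizes log-normally, then samples around the mean:
>>> s = sigma*np.exp(lr_sigma*rng.standard_normal())  # mutated global step-size
>>> ss = axis_sigmas*np.exp(lr_axis_sigmas*rng.standard_normal(ndim))  # mutated axis step-sizes
>>> x = mean + s*ss*rng.standard_normal(ndim)  # offspring; its step-sizes are inherited with it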

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensions (int),

    • 'upper_boundary' - upper boundary of search range (array_like),

    • 'lower_boundary' - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum number of function evaluations (int, default: np.inf),

    • 'max_runtime' - maximal runtime allowed (float, default: np.inf),

    • 'seed_rng' - seed for random number generation, which needs to be explicitly set (int);

    and with the following particular settings (keys):
    • 'sigma' - initial global step-size, aka mutation strength (float),

    • 'mean' - initial (starting) point, aka mean of the Gaussian search distribution (array_like),

      • if not given, it will be sampled uniformly at random within the search range bounded by problem['lower_boundary'] and problem['upper_boundary'].

    • 'n_individuals' - number of offspring, aka offspring population size (int, default: 5*problem['ndim_problem']),

    • 'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/4)),

    • 'lr_sigma' - learning rate of global step-size self-adaptation (float, default: 1.0/np.sqrt(problem['ndim_problem'])),

    • 'lr_axis_sigmas' - learning rate of individual step-size self-adaptation (float, default: 1.0/np.power(problem['ndim_problem'], 1.0/4.0)).
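
For a concrete sense of these defaults, the short sketch below evaluates the documented formulas for a hypothetical 100-dimensional problem (only the formulas above are assumed):

>>> import numpy as np
>>> ndim_problem = 100  # hypothetical problem dimensionality
>>> n_individuals = 5*ndim_problem  # default offspring population size: 500
>>> n_parents = int(n_individuals/4)  # default parental population size: 125
>>> lr_sigma = 1.0/np.sqrt(ndim_problem)  # default global learning rate: 0.1
>>> lr_axis_sigmas = 1.0/np.power(ndim_problem, 1.0/4.0)  # default axis learning rate: ~0.316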

Examples

Use the black-box optimizer SSAES to minimize the well-known test function Rosenbrock:

>>> import numpy  # engine for numerical computing
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.ssaes import SSAES
>>> problem = {'fitness_function': rosenbrock,  # to define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5.0*numpy.ones((2,)),
...            'upper_boundary': 5.0*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # to set optimizer options
...            'seed_rng': 2022,
...            'mean': 3.0*numpy.ones((2,)),
...            'sigma': 3.0}  # global step-size may need to be tuned
>>> ssaes = SSAES(problem, options)  # to initialize the black-box optimizer class
>>> results = ssaes.optimize()  # to run the optimization/evolution process
>>> # to return the number of function evaluations and the best-so-far fitness
>>> print(f"SSAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
SSAES: 5000, 0.00023558230456829403

To check the correctness of its coding, refer to this code-based repeatability report for more details.

best_so_far_x

final best-so-far solution found during entire optimization.

Type:

array_like

best_so_far_y

final best-so-far fitness found during entire optimization.

Type:

array_like

lr_axis_sigmas

learning rate of individual step-sizes self-adaptation.

Type:

float

lr_sigma

learning rate of global step-size self-adaptation.

Type:

float

mean

initial (starting) point, aka mean of Gaussian search distribution.

Type:

array_like

n_individuals

number of offspring, aka offspring population size.

Type:

int

n_parents

number of parents, aka parental population size.

Type:

int

sigma

initial global step-size, aka mutation strength.

Type:

float

_axis_sigmas

final individual step-sizes (updated during optimization).

Type:

array_like
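
Besides the results dictionary returned by optimize(), the attributes documented above can be read from the optimizer instance itself after optimization; a minimal sketch continuing the Rosenbrock example above:

>>> x, y = ssaes.best_so_far_x, ssaes.best_so_far_y  # final best-so-far solution and fitness
>>> axis_sigmas = ssaes._axis_sigmas  # final individual step-sizes after self-adaptation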

References

Hansen, N., Arnold, D.V. and Auger, A., 2015. Evolution strategies. In Springer Handbook of Computational Intelligence (pp. 871-898). Springer, Berlin, Heidelberg.

Beyer, H.G. and Schwefel, H.P., 2002. Evolution strategies - A comprehensive introduction. Natural Computing, 1(1), pp.3-52.

Schwefel, H.P., 1988. Collective intelligence in evolving systems. In Ecodynamics (pp. 95-100). Springer, Berlin, Heidelberg.

Schwefel, H.P., 1984. Evolution strategies: A family of non-linear optimization techniques based on imitating some principles of organic evolution. Annals of Operations Research, 1(2), pp.165-167.