Separable Covariance Matrix Adaptation Evolution Strategy (SEPCMAES)

class pypop7.optimizers.es.sepcmaes.SEPCMAES(problem, options)

Separable Covariance Matrix Adaptation Evolution Strategy (SEPCMAES).

Note

SEPCMAES learns only the diagonal elements of the full covariance matrix explicitly, leading to linear time complexity per sampling for large-scale black-box optimization (LSBBO). It is highly recommended to first attempt more advanced ES variants (e.g., LMCMA, LMMAES) for LSBBO, since the performance of SEPCMAES deteriorates significantly on nonseparable, ill-conditioned fitness landscapes.
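
For intuition, the following minimal sketch (not part of the pypop7 API; all names in it are illustrative) shows why sampling with a diagonal covariance matrix needs only element-wise operations, i.e., linear time and space per offspring:

>>> import numpy as np
>>> ndim, sigma = 1000, 0.3  # illustrative problem dimension and global step-size
>>> mean = np.zeros(ndim)  # mean of the Gaussian search distribution
>>> d = np.ones(ndim)  # diagonal of the covariance matrix (coordinate-wise variances)
>>> # one offspring: only O(ndim) element-wise operations, no matrix factorization needed
>>> offspring = mean + sigma*np.sqrt(d)*np.random.standard_normal(ndim)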

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensions (int),

    • 'upper_boundary' - upper boundary of search range (array_like),

    • 'lower_boundary' - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),

    • 'max_runtime' - maximal runtime allowed (float, default: np.Inf),

    • 'seed_rng' - seed for random number generation, which needs to be set explicitly (int);

    and with the following particular settings (keys), whose defaults are illustrated in the sketch after this list:
    • 'sigma' - initial global step-size, aka mutation strength (float),

    • 'mean' - initial (starting) point, aka mean of Gaussian search distribution (array_like),

      • if not given, it will draw a random sample from the uniform distribution whose search range is bounded by problem['lower_boundary'] and problem['upper_boundary'].

    • 'n_individuals' - number of offspring, aka offspring population size (int, default: 4 + int(3*np.log(options['ndim_problem']))),

    • 'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/2)),

    • 'c_c' - learning rate of evolution path update (float, default: 4.0/(options['ndim_problem'] + 4.0)).
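
For reference, this minimal sketch (not part of the pypop7 API) computes the documented defaults of the last three settings for a hypothetical 1000-dimensional problem:

>>> import numpy as np
>>> ndim_problem = 1000  # illustrative problem dimension
>>> n_individuals = 4 + int(3*np.log(ndim_problem))  # -> 24
>>> n_parents = int(n_individuals/2)  # -> 12
>>> c_c = 4.0/(ndim_problem + 4.0)  # -> ~0.004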

Examples

Use the optimizer to minimize the well-known test function Rosenbrock:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.sepcmaes import SEPCMAES
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'mean': 3*numpy.ones((2,)),
...            'sigma': 0.1}  # the global step-size may need to be tuned for better performance
>>> sepcmaes = SEPCMAES(problem, options)  # initialize the optimizer class
>>> results = sepcmaes.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"SEPCMAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
SEPCMAES: 5000, 0.0028541286223351006

To check the correctness of its implementation, refer to this code-based repeatability report for more details.
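
As a further illustrative sketch (assuming the separable sphere function is exported by the same benchmarks module as rosenbrock; output not shown since it is not verified here), SEPCMAES can also be applied in higher dimensions, where its diagonal covariance model is a natural fit:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import sphere  # assumed to be available in this module
>>> from pypop7.optimizers.es.sepcmaes import SEPCMAES
>>> problem = {'fitness_function': sphere,  # separable test function
...            'ndim_problem': 100,
...            'lower_boundary': -5*numpy.ones((100,)),
...            'upper_boundary': 5*numpy.ones((100,))}
>>> options = {'max_function_evaluations': 100000,
...            'seed_rng': 2022,
...            'sigma': 3.0}  # 'mean' is omitted, so it is sampled uniformly within the boundaries
>>> results = SEPCMAES(problem, options).optimize()  # results is a dict with keys such as 'best_so_far_y'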

c_c

learning rate of evolution path update.

Type:

float

mean

initial (starting) point, aka mean of Gaussian search distribution.

Type:

array_like

n_individuals

number of offspring, aka offspring population size.

Type:

int

n_parents

number of parents, aka parental population size.

Type:

int

sigma

final global step-size, aka mutation strength.

Type:

float

References

Ros, R. and Hansen, N., 2008, September. A simple modification in CMA-ES achieving linear time and space complexity. In International Conference on Parallel Problem Solving from Nature (pp. 296-305). Springer, Berlin, Heidelberg. https://link.springer.com/chapter/10.1007/978-3-540-87700-4_30