Fast Evolutionary Programming (FEP)
- class pypop7.optimizers.ep.fep.FEP(problem, options)[source]
Fast Evolutionary Programming with self-adaptive mutation of individual step-sizes (FEP).
Note
FEP was proposed mainly by Yao et al. in 1999 (Yao is the recipient of the IEEE Evolutionary Computation Pioneer Award 2013 and the IEEE Frank Rosenblatt Award 2020). It replaces the classical Gaussian sampling distribution with the heavy-tailed Cauchy distribution for better exploration on multi-modal black-box optimization problems.
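As a quick illustration of this design choice, the following minimal numpy sketch (not part of the FEP API) compares tail behavior: the Cauchy distribution produces far more large perturbations than the Gaussian, which is what drives the improved exploration:

>>> import numpy as np
>>> rng = np.random.default_rng(2022)  # fixed seed for repeatability
>>> gaussian = rng.standard_normal(100000)  # light-tailed samples
>>> cauchy = rng.standard_cauchy(100000)  # heavy-tailed samples
>>> # fraction of samples beyond 3 units: roughly 0.003 (Gaussian) vs 0.2 (Cauchy)
>>> print(np.mean(np.abs(gaussian) > 3.0) < np.mean(np.abs(cauchy) > 3.0))
True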
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions (int),
'upper_boundary' - upper boundary of search range (array_like),
'lower_boundary' - lower boundary of search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum number of function evaluations (int, default: np.inf),
'max_runtime' - maximal runtime to be allowed (float, default: np.inf),
'seed_rng' - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
'sigma' - initial global step-size, aka mutation strength (float),
'n_individuals' - number of offspring, aka offspring population size (int, default: 100),
'q' - number of opponents for pairwise comparisons (int, default: 10),
'tau' - dimension-wise learning rate of individual step-sizes self-adaptation (float, default: 1.0/np.sqrt(2.0*np.sqrt(problem['ndim_problem']))),
'tau_apostrophe' - individual-wise learning rate of individual step-sizes self-adaptation (float, default: 1.0/np.sqrt(2.0*problem['ndim_problem'])); see the mutation sketch after this list.
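The two learning rates enter the standard FEP mutation rule of Yao et al., 1999: each individual step-size is perturbed log-normally, then the solution is moved by per-dimension Cauchy noise. A minimal sketch of this update follows (fep_mutation is a hypothetical helper for illustration, not part of the library's API):

>>> import numpy as np
>>> def fep_mutation(x, eta, tau, tau_apostrophe, rng):  # hypothetical helper
...     # log-normal self-adaptation of individual step-sizes:
...     # tau_apostrophe scales one shared draw per individual,
...     # while tau scales an independent draw per dimension
...     eta_new = eta*np.exp(tau_apostrophe*rng.standard_normal() +
...                          tau*rng.standard_normal(x.size))
...     # heavy-tailed Cauchy mutation of the solution itself
...     return x + eta_new*rng.standard_cauchy(x.size), eta_new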
Examples
Use the optimizer FEP to minimize the well-known test function Rosenbrock:
>>> import numpy  # engine for numerical computing
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.ep.fep import FEP
>>> problem = {'fitness_function': rosenbrock,  # to define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5.0*numpy.ones((2,)),
...            'upper_boundary': 5.0*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # to set optimizer options
...            'seed_rng': 2022,
...            'sigma': 3.0}  # global step-size may need to be tuned
>>> fep = FEP(problem, options)  # to initialize the optimizer class
>>> results = fep.optimize()  # to run its optimization/evolution process
>>> # to return the number of function evaluations and the best-so-far fitness
>>> print(f"FEP: {results['n_function_evaluations']}, {results['best_so_far_y']}")
FEP: 5000, 0.005781004466936902
For checking its correctness, refer to the code-based repeatability report for more details.
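The option q controls the standard EP pairwise-comparison (tournament) selection. The following sketch shows the usual scheme under common assumptions (pairwise_selection is a hypothetical helper operating on a numpy array of fitness values, not the library's API): each candidate is compared against q randomly chosen opponents, and the candidates with the most wins survive:

>>> import numpy as np
>>> def pairwise_selection(fitness, q, n_individuals, rng):  # hypothetical helper
...     # count, for each candidate, how many of q random opponents it beats
...     wins = np.empty(len(fitness))
...     for i, f in enumerate(fitness):
...         opponents = rng.integers(0, len(fitness), size=q)
...         wins[i] = np.sum(f <= fitness[opponents])  # minimization
...     # the n_individuals candidates with the most wins survive
...     return np.argsort(-wins)[:n_individuals]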
- best_so_far_x
final best-so-far solution found during the entire optimization process.
- Type:
array_like
- best_so_far_y
final best-so-far fitness found during the entire optimization process.
- Type:
float
- n_individuals
number of offspring, aka offspring population size.
- Type:
int
- q
number of opponents for pairwise comparisons.
- Type:
int
- sigma
initial global step-size, aka mutation strength.
- Type:
float
- tau
dimension-wise learning rate of individual step-sizes self-adaptation.
- Type:
float
- tau_apostrophe
individual-wise learning rate of individual step-sizes self-adaptation.
- Type:
float
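Assuming these attributes remain readable on the optimizer instance after optimize() returns (continuing the Rosenbrock example above):

>>> print(fep.n_individuals, fep.q, fep.sigma)  # default, default, as set above
100 10 3.0
>>> print(fep.best_so_far_y == results['best_so_far_y'])
True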
References
Yao, X., Liu, Y. and Lin, G., 1999. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation, 3(2), pp.82-102.
Chellapilla, K. and Fogel, D.B., 1999. Evolution, neural networks, games, and intelligence. Proceedings of the IEEE, 87(9), pp.1471-1496.
Bäck, T. and Schwefel, H.P., 1993. An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation, 1(1), pp.1-23.