Fast Evolutionary Programming (FEP)¶
- class pypop7.optimizers.ep.fep.FEP(problem, options)¶
Fast Evolutionary Programming with self-adaptive mutation (FEP).
Note
FEP was proposed mainly by Yao in 1999, the recipient of the IEEE Evolutionary Computation Pioneer Award 2013 and the IEEE Frank Rosenblatt Award 2020. It replaces the classical Gaussian sampling distribution with the heavy-tailed Cauchy distribution for better exploration on multi-modal black-box problems.
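The practical effect of this swap is easy to see: the standard Cauchy distribution has undefined variance and produces occasional very long jumps, which helps escape local optima. A minimal numpy sketch (illustrative only, not part of the pypop7 API):
>>> import numpy as np
>>> rng = np.random.default_rng(2022)
>>> g = rng.standard_normal(100000)  # classical Gaussian samples
>>> c = rng.standard_cauchy(100000)  # heavy-tailed Cauchy samples
>>> # fraction of mutation steps with magnitude above 3:
>>> # analytically ~0.0027 for Gaussian vs ~0.205 for Cauchy
>>> print(np.mean(np.abs(g) > 3.0), np.mean(np.abs(c) > 3.0))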
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions (int),
'upper_boundary' - upper boundary of the search range (array_like),
'lower_boundary' - lower boundary of the search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum number of function evaluations (int, default: np.Inf),
'max_runtime' - maximal runtime allowed (float, default: np.Inf),
'seed_rng' - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
'sigma' - initial global step-size, aka mutation strength (float),
'n_individuals' - number of offspring, aka offspring population size (int, default: 100),
'q' - number of opponents for pairwise comparisons (int, default: 10),
'tau' - dimension-wise learning rate of individual step-sizes self-adaptation (float, default: 1.0/np.sqrt(2.0*np.sqrt(problem['ndim_problem']))),
'tau_apostrophe' - global learning rate of individual step-sizes self-adaptation (float, default: 1.0/np.sqrt(2.0*problem['ndim_problem'])).
Examples
Use the optimizer to minimize the well-known Rosenbrock test function:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.ep.fep import FEP
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'sigma': 0.1}
>>> fep = FEP(problem, options)  # initialize the optimizer class
>>> results = fep.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"FEP: {results['n_function_evaluations']}, {results['best_so_far_y']}")
FEP: 5000, 0.20127155821817022
For a correctness check, refer to the code-based repeatability report for more details.
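Beyond the common settings shown above, the particular settings documented in the parameter list can be passed through the same options dict (the values below are illustrative, not tuned recommendations):
>>> options = {'max_function_evaluations': 5000,
...            'seed_rng': 2022,
...            'sigma': 0.1,          # initial global step-size
...            'n_individuals': 100,  # offspring population size
...            'q': 10}               # number of opponents for pairwise comparisons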
- n_individuals¶
number of offspring, aka offspring population size.
- Type:
int
- q¶
number of opponents for pairwise comparisons.
- Type:
int
- sigma¶
initial global step-size, aka mutation strength.
- Type:
float
- tau¶
dimension-wise learning rate of individual step-sizes self-adaptation.
- Type:
float
- tau_apostrophe¶
global learning rate of individual step-sizes self-adaptation (see the sketch below).
- Type:
float
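The attributes above map directly onto the mutation and self-adaptation rules of Yao et al. (1999). As a rough illustration of how they interact, here is a minimal numpy sketch of one FEP generation (a simplified re-implementation for exposition, not pypop7's internal code; the learning-rate formulas follow the defaults in the parameter list above):
>>> import numpy as np
>>> def fep_one_generation(x, sigmas, fitness, rng, q=10):
...     # one simplified FEP generation: Cauchy mutation, log-normal
...     # self-adaptation of step-sizes, and q-tournament selection
...     n, d = x.shape
...     tau = 1.0/np.sqrt(2.0*np.sqrt(d))    # dimension-wise learning rate
...     tau_apostrophe = 1.0/np.sqrt(2.0*d)  # global learning rate
...     x_new = x + sigmas*rng.standard_cauchy((n, d))  # heavy-tailed mutation
...     sigmas_new = sigmas*np.exp(tau_apostrophe*rng.standard_normal((n, 1))
...                                + tau*rng.standard_normal((n, d)))
...     pool_x, pool_s = np.vstack((x, x_new)), np.vstack((sigmas, sigmas_new))
...     y = np.array([fitness(xi) for xi in pool_x])
...     wins = np.empty(2*n)
...     for i in range(2*n):
...         opponents = rng.integers(0, 2*n, q)     # q random opponents
...         wins[i] = np.sum(y[i] <= y[opponents])  # pairwise wins (minimization)
...     best = np.argsort(-wins)[:n]  # survivors: the n individuals with most wins
...     return pool_x[best], pool_s[best]
>>> rng = np.random.default_rng(2022)
>>> x = rng.uniform(-5.0, 5.0, (100, 2))
>>> sigmas = 0.1*np.ones((100, 2))
>>> x, sigmas = fep_one_generation(x, sigmas, lambda z: float(np.sum(z**2)), rng)

Note that only the Cauchy perturbation distinguishes FEP from classical EP in this sketch; the log-normal step-size update and q-tournament selection are inherited unchanged.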
References
Yao, X., Liu, Y. and Lin, G., 1999. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation, 3(2), pp.82-102. https://ieeexplore.ieee.org/abstract/document/771163
Bäck, T. and Schwefel, H.P., 1993. An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation, 1(1), pp.1-23. https://direct.mit.edu/evco/article-abstract/1/1/1/1092/An-Overview-of-Evolutionary-Algorithms-for