Cooperative Coevolving Particle Swarm Optimizer (CCPSO2)¶
- class pypop7.optimizers.pso.ccpso2.CCPSO2(problem, options)¶
Cooperative Coevolving Particle Swarm Optimizer (CCPSO2).
Note
CCPSO2 employs the popular cooperative coevolution framework to extend PSO to large-scale black-box optimization (LSBBO) via random grouping/partitioning of variables. However, it may suffer from performance degradation on non-separable functions (particularly ill-conditioned ones), owing to its axis-parallel decomposition strategy (see classical coordinate descent from the mathematical programming community for a detailed mathematical explanation).
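To make the random-grouping idea concrete, below is a minimal sketch (purely illustrative, not the library's internal implementation; the function name random_grouping and its arguments are assumed) that partitions the coordinate indices of a high-dimensional problem into randomly formed groups whose common size is drawn from a pool of candidate group sizes:

import numpy as np

def random_grouping(ndim_problem, group_sizes, rng):
    # pick one group size from the candidate pool (illustrative only)
    s = rng.choice(group_sizes)
    # shuffle all coordinate indices and split them into consecutive
    # groups of (at most) s indices each
    indices = rng.permutation(ndim_problem)
    return [indices[i:i + s] for i in range(0, ndim_problem, s)], s

rng = np.random.default_rng(2022)
groups, s = random_grouping(500, [2, 5, 10, 50, 100, 250], rng)
print(len(groups), s)  # number of groups and the chosen group size

In CCPSO2 (Li and Yao, 2012), each subswarm then optimizes only the coordinates of its own group while the remaining coordinates are filled in from a shared context vector, and regrouping with a newly drawn group size is triggered when the best-so-far fitness stops improving.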
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
’fitness_function’ - objective function to be minimized (func),
’ndim_problem’ - number of dimensions (int),
’upper_boundary’ - upper boundary of search range (array_like),
’lower_boundary’ - lower boundary of search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
’max_function_evaluations’ - maximum number of function evaluations (int, default: np.Inf),
’max_runtime’ - maximal runtime allowed (float, default: np.Inf),
’seed_rng’ - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
’n_individuals’ - swarm (population) size, aka number of particles (int, default: 30),
’p’ - probability of using the Cauchy sampling distribution (float, default: 0.5),
’group_sizes’ - a pool of candidate group sizes for the random grouping of dimensions (list, default: [2, 5, 10, 50, 100, 250]); an illustrative sketch of how ’p’ enters the position update is given after this list.
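In CCPSO2 (Li and Yao, 2012), ’p’ decides, per coordinate, whether a particle's new position is drawn from a Cauchy distribution centered at its personal best or from a Gaussian distribution centered at its neighborhood (local) best, with the sampling scale set by the distance between the two. Below is a minimal sketch of this update rule only; the names sample_position, personal_best, and local_best are assumed for illustration and do not mirror the library's internals:

import numpy as np

def sample_position(personal_best, local_best, p, rng):
    # per-coordinate scale: distance between personal and local bests
    scale = np.abs(personal_best - local_best)
    # with probability p use Cauchy sampling around the personal best,
    # otherwise Gaussian sampling around the neighborhood (local) best
    use_cauchy = rng.random(personal_best.shape) <= p
    cauchy = personal_best + rng.standard_cauchy(personal_best.shape)*scale
    gauss = local_best + rng.standard_normal(personal_best.shape)*scale
    return np.where(use_cauchy, cauchy, gauss)

rng = np.random.default_rng(2022)
x_new = sample_position(rng.uniform(-5.0, 5.0, 10), rng.uniform(-5.0, 5.0, 10), 0.5, rng)

The heavier-tailed Cauchy draws favor exploration while the Gaussian draws favor exploitation, which is why ’p’ defaults to a balanced 0.5.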
Examples
Use the optimizer to minimize the well-known test function Rosenbrock:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.pso.ccpso2 import CCPSO2
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 500,
...            'lower_boundary': -5*numpy.ones((500,)),
...            'upper_boundary': 5*numpy.ones((500,))}
>>> options = {'max_function_evaluations': 1000000,  # set optimizer options
...            'seed_rng': 2022}
>>> ccpso2 = CCPSO2(problem, options)  # initialize the optimizer class
>>> results = ccpso2.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"CCPSO2: {results['n_function_evaluations']}, {results['best_so_far_y']}")
CCPSO2: 1000000, 1150.0205163111475
To check the correctness of this implementation, refer to the code-based repeatability report for more details.
- group_sizes¶
a pool of candidate group sizes for the random grouping of dimensions.
- Type:
list
- n_individuals¶
swarm (population) size, aka number of particles.
- Type:
int
- p¶
probability of using Cauchy sampling distribution.
- Type:
float
References
Li, X. and Yao, X., 2012. Cooperatively coevolving particle swarms for large scale optimization. IEEE Transactions on Evolutionary Computation, 16(2), pp.210-224. https://ieeexplore.ieee.org/document/5910380/
Potter, M.A. and De Jong, K.A., 1994, October. A cooperative coevolutionary approach to function optimization. In International Conference on Parallel Problem Solving from Nature (pp. 249-257). Springer, Berlin, Heidelberg. https://link.springer.com/chapter/10.1007/3-540-58484-6_269