Coordinate Search (CS)
- class pypop7.optimizers.ds.cs.CS(problem, options)
Coordinate Search (CS).
Note
CS is the earliest Direct (Pattern) Search method, dating back at least to Fermi (Nobel Prize in Physics, 1938) and Metropolis (IEEE Computer Society Computer Pioneer Award, 1984). Since it is now rarely used to optimize black-box problems in practice, it is highly recommended to first try other, more advanced methods for large-scale black-box optimization (LSBBO).
Its original version needs 3**n - 1 samples per iteration in the worst case, where n is the dimensionality of the problem. Such worst-case complexity severely limits its applicability to LSBBO. Instead, here we use the opportunistic strategy, which accepts the first coordinate step that improves the fitness, for simplicity. See Algorithm 3 of Torczon, 1997, SIOPT for more details.
Also known as alternating directions, alternating variable search, axial relaxation, local variation, and compass search.
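The scheme above is simple enough to sketch directly. Below is a minimal, self-contained Python illustration of compass search with the opportunistic strategy and the sigma/gamma step-size control documented below; the function name, the sigma_min tolerance, and the loop structure are illustrative choices, not the PyPop7 implementation:

import numpy as np

def compass_search(f, x0, sigma=1.0, gamma=0.5, max_evals=5000, sigma_min=1e-10):
    """Minimize f by polling +/- sigma along each coordinate axis."""
    x = np.asarray(x0, dtype=float).copy()
    y = f(x)
    n_evals = 1
    while n_evals < max_evals and sigma > sigma_min:
        improved = False
        for i in range(x.size):  # poll the coordinate directions one by one
            for sign in (1.0, -1.0):  # both orientations of axis i
                x_new = x.copy()
                x_new[i] += sign * sigma
                y_new = f(x_new)
                n_evals += 1
                if y_new < y:  # opportunistic: accept the first improving step
                    x, y = x_new, y_new
                    improved = True
                    break
            if improved:
                break
        if not improved:  # a fully unsuccessful poll of all 2*n directions
            sigma *= gamma  # shrink the global step-size

    return x, y

With opportunistic polling, an unsuccessful iteration costs at most 2*n evaluations rather than the 3**n - 1 of the original exhaustive pattern, which is what makes the method usable beyond very low dimensions.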
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
'fitness_function' - objective function to be minimized (func),
'ndim_problem' - number of dimensions of the problem (int),
'upper_boundary' - upper boundary of the search range (array_like),
'lower_boundary' - lower boundary of the search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
'max_function_evaluations' - maximum number of function evaluations (int, default: np.inf),
'max_runtime' - maximal allowed runtime (float, default: np.inf),
'seed_rng' - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
'sigma' - initial global step-size (float, default: 1.0),
'x' - initial (starting) point (array_like),
if not given, it will be drawn randomly from the uniform distribution over the search range bounded by problem['lower_boundary'] and problem['upper_boundary'] (see the short sketch after this list),
'gamma' - decreasing factor of the global step-size (float, default: 0.5).
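As noted for 'x' above, when no starting point is supplied it is drawn uniformly at random from the box-bounded search range. A minimal sketch of that default behavior (np.random.default_rng and all variable names here are illustrative, not the library's exact mechanism):

import numpy as np

rng = np.random.default_rng(2022)  # seeded, as via the 'seed_rng' option
lower = -5 * np.ones(2)  # problem['lower_boundary']
upper = 5 * np.ones(2)   # problem['upper_boundary']
x0 = rng.uniform(lower, upper)  # default 'x' when not given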
Examples
Use the optimizer to minimize the well-known test function Rosenbrock:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.ds.cs import CS
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'x': 3*numpy.ones((2,)),
...            'sigma': 1.0,
...            'verbose_frequency': 500}
>>> cs = CS(problem, options)  # initialize the optimizer class
>>> results = cs.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"CS: {results['n_function_evaluations']}, {results['best_so_far_y']}")
CS: 5000, 0.1491367032979898
- gamma
decreasing factor of global step-size.
- Type:
float
- sigma
final global step-size (changed during optimization).
- Type:
float
- x
initial (starting) point.
- Type:
array_like
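After optimize() returns, these attributes can be read off the optimizer instance. Continuing the doctest session from the Examples section, and assuming the attributes behave as documented above (the printed values follow from the default gamma and the fact that sigma only shrinks in this scheme):

>>> cs.gamma  # default value, since 'gamma' was not set in options
0.5
>>> cs.sigma <= options['sigma']  # the global step-size only decreases
True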
References
Larson, J., Menickelly, M. and Wild, S.M., 2019. Derivative-free optimization methods. Acta Numerica, 28, pp.287-404. https://tinyurl.com/4sr2t63j
Audet, C. and Hare, W., 2017. Derivative-free and blackbox optimization. Cham: Springer International Publishing. https://link.springer.com/book/10.1007/978-3-319-68913-5
Torczon, V., 1997. On the convergence of pattern search algorithms. SIAM Journal on Optimization, 7(1), pp.1-25. https://epubs.siam.org/doi/abs/10.1137/S1052623493250780 (See Algorithm 3 (Section 4.1) for details.)
Fermi, E. and Metropolis, N., 1952. Numerical solution of a minimum problem. Los Alamos Scientific Lab., Los Alamos, NM. https://www.osti.gov/servlets/purl/4377177