Annealed Random Hill Climber (ARHC)¶
- class pypop7.optimizers.rs.arhc.ARHC(problem, options)¶
Annealed Random Hill Climber (ARHC).
Note
The search performance of ARHC depends heavily on the temperature setting of the annealing process. Choosing this temperature properly is a non-trivial task, since a suitable value may vary across problems and even across optimization stages of the problem at hand. Therefore, it is highly recommended to first attempt more advanced (e.g. population-based) methods for large-scale black-box optimization.
Here we include it mainly for benchmarking purposes.
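To make the role of the temperature concrete, the sketch below shows a typical annealing-style acceptance rule for one hill-climbing step, assuming a Metropolis-like criterion with a fixed temperature; the helper name annealed_step is hypothetical, and the exact sampler and acceptance rule used inside PyPop7's ARHC may differ in detail.

import numpy as np

def annealed_step(f, x, y, sigma, temperature, rng):
    """One hypothetical annealed hill-climbing step (Metropolis-style acceptance).

    f is the fitness function to be minimized, (x, y) is the current point and
    its fitness, sigma the global step-size, temperature the annealing temperature.
    """
    x_new = x + sigma*rng.standard_normal(x.size)  # Gaussian perturbation of the current point
    y_new = f(x_new)
    # Improvements are always accepted; deteriorations are accepted with
    # probability exp(-(y_new - y)/temperature), so a larger temperature
    # accepts worse moves more often (more exploration, less greediness).
    if y_new < y or rng.random() < np.exp(-(y_new - y)/temperature):
        return x_new, y_new
    return x, y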
- Parameters:
problem (dict) –
- problem arguments with the following common settings (keys):
’fitness_function’ - objective function to be minimized (func),
’ndim_problem’ - number of dimensions (int),
’upper_boundary’ - upper boundary of search range (array_like),
’lower_boundary’ - lower boundary of search range (array_like).
options (dict) –
- optimizer options with the following common settings (keys):
’max_function_evaluations’ - maximum number of function evaluations (int, default: np.Inf),
’max_runtime’ - maximal runtime allowed (float, default: np.Inf),
’seed_rng’ - seed for random number generation, which needs to be explicitly set (int);
- and with the following particular settings (keys):
’sigma’ - initial global step-size (float),
’temperature’ - annealing temperature (float),
’x’ - initial (starting) point (array_like),
if not given, a random sample is drawn from the uniform distribution bounded by problem[‘lower_boundary’] and problem[‘upper_boundary’] when init_distribution is 1; otherwise, a sample is drawn from the standard normal distribution.
’init_distribution’ - random sampling distribution for starting-point initialization (int, default: 1). It is used only when x is not set explicitly (see the sketch after this list).
1: uniform distributed random sampling only for starting-point initialization,
0: standard normal distributed random sampling only for starting-point initialization.
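As a rough illustration of the initialization behavior described above (the actual sampling code inside PyPop7 may differ), starting-point sampling can be sketched as follows; the helper name sample_initial_point is hypothetical:

import numpy as np

def sample_initial_point(lower_boundary, upper_boundary, init_distribution=1, seed=None):
    """Hypothetical sketch of starting-point initialization when 'x' is not given."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower_boundary), np.asarray(upper_boundary)
    if init_distribution == 1:  # uniform sampling within the search boundaries
        return rng.uniform(lower, upper)
    return rng.standard_normal(lower.size)  # standard normal sampling (init_distribution == 0)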
Examples
Use the optimizer to minimize the well-known test function Rosenbrock:
>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.rs.arhc import ARHC
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'x': 3*numpy.ones((2,)),
...            'sigma': 0.1,
...            'temperature': 1.5}
>>> arhc = ARHC(problem, options)  # initialize the optimizer class
>>> results = arhc.optimize()  # run the optimization process
>>> # return the number of used function evaluations and found best-so-far fitness
>>> print(f"ARHC: {results['n_function_evaluations']}, {results['best_so_far_y']}")
ARHC: 5000, 0.0002641143073543329
To check the correctness of this implementation, refer to the code-based repeatability report for more details.
- init_distribution¶
random sampling distribution for starting-point initialization.
- Type:
int
- sigma¶
global step-size (fixed during optimization).
- Type:
float
- temperature¶
annealing temperature.
- Type:
float
- x¶
initial (starting) point.
- Type:
array_like
References
https://probml.github.io/pml-book/book2.html (See CHAPTER 6.7 Derivative-free optimization)
Russell, S. and Norvig, P., 2021. Artificial intelligence: A modern approach (Global Edition). Pearson Education. http://aima.cs.berkeley.edu/ (See CHAPTER 4: SEARCH IN COMPLEX ENVIRONMENTS)
Hoos, H.H. and Stützle, T., 2004. Stochastic local search: Foundations and applications. Elsevier. https://www.elsevier.com/books/stochastic-local-search/hoos/978-1-55860-872-6
https://github.com/pybrain/pybrain/blob/master/pybrain/optimization/hillclimber.py