Pure Random Search (PRS)

class pypop7.optimizers.rs.prs.PRS(problem, options)[source]

Pure Random Search (PRS).

Note

PRS is one of the simplest and earliest black-box optimizers, dating back to at least the 1950s. Although it has recently been applied with success to several relatively low-dimensional problems (particularly hyper-parameter optimization), it generally suffers from the notorious curse of dimensionality for large-scale black-box optimization, owing to its lack of adaptation, a highly desirable property of most sophisticated search algorithms. Therefore, it is highly recommended to first try more advanced (e.g., population-based) methods for large-scale black-box optimization.

As pointed out in the well-recognized book Probabilistic Machine Learning (written by Kevin Patrick Murphy), “A surprisingly effective strategy in problems where we know nothing about the objective is to use random search. This should always be tried as a baseline”.
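To make the lack of adaptation concrete, the core of PRS can be sketched in a few lines of NumPy (a minimal illustration only, not the PyPop7 implementation): every candidate is drawn uniformly at random within the search boundaries, independently of all previous samples, and only the best-so-far solution is kept.

import numpy as np

def pure_random_search(fitness, lower, upper, max_evaluations, seed=None):
    # Minimal sketch: draw i.i.d. uniform samples within the bounds and keep the best one.
    rng = np.random.default_rng(seed)
    best_x, best_y = None, np.inf
    for _ in range(max_evaluations):
        x = rng.uniform(lower, upper)  # each sample ignores all earlier evaluations (no adaptation)
        y = fitness(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y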

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • ’fitness_function’ - objective function to be minimized (func),

    • ’ndim_problem’ - number of dimensions of the search space (int),

    • ’upper_boundary’ - upper boundary of search range (array_like),

    • ’lower_boundary’ - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • ’max_function_evaluations’ - maximum number of function evaluations (int, default: np.Inf),

    • ’max_runtime’ - maximum runtime allowed (float, default: np.Inf),

    • ’seed_rng’ - seed for random number generation, which needs to be explicitly set (int).
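As a minimal sketch of how the problem and options dictionaries above fit together (the sphere function and its settings here are illustrative only, not part of the API), ’fitness_function’ may be any Python callable that takes a NumPy array and returns a scalar:

import numpy as np
from pypop7.optimizers.rs.prs import PRS

def sphere(x):  # user-defined objective: f(x) = sum(x_i**2), minimized at the origin
    return float(np.sum(np.square(x)))

problem = {'fitness_function': sphere,
           'ndim_problem': 10,
           'lower_boundary': -5.0*np.ones((10,)),
           'upper_boundary': 5.0*np.ones((10,))}
options = {'max_function_evaluations': 10000,
           'seed_rng': 0}
results = PRS(problem, options).optimize()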

Examples

Use the PRS optimizer to minimize the well-known Rosenbrock test function:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.rs.prs import PRS
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5.0*numpy.ones((2,)),
...            'upper_boundary': 5.0*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022}
>>> prs = PRS(problem, options)  # initialize the optimizer class
>>> results = prs.optimize()  # run the optimization process
>>> # return the number of used function evaluations and found best-so-far fitness
>>> print(f"PRS: {results['n_function_evaluations']}, {results['best_so_far_y']}")
PRS: 5000, 0.11497678820610932
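Besides the number of function evaluations and the best-so-far fitness printed above, the returned results dictionary also holds the best-so-far solution itself (the ’best_so_far_x’ key is assumed here from PyPop7’s common results interface):

>>> best_x = results['best_so_far_x']  # assumed key: best-so-far solution (a NumPy array of length 2)
>>> print(best_x)  # a point near Rosenbrock's optimum at (1.0, 1.0); the exact values are not shown here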

To verify the correctness of this implementation, refer to this code-based repeatability report for more details.

References

Bergstra, J. and Bengio, Y., 2012. Random search for hyper-parameter optimization. Journal of Machine Learning Research, 13(10), pp.281-305.

Schmidhuber, J., Hochreiter, S. and Bengio, Y., 2001. Evaluating benchmark problems by random guessing. A Field Guide to Dynamical Recurrent Networks, pp.231-235.

Karnopp, D.C., 1963. Random search techniques for optimization problems. Automatica, 1(2-3), pp.111-121.

Brooks, S.H., 1959. A comparison of maximum-seeking methods. Operations Research, 7(4), pp.430-457.

Brooks, S.H., 1958. A discussion of random methods for seeking maxima. Operations Research, 6(2), pp.244-251.