Generalized Pattern Search (GPS)

class pypop7.optimizers.ds.gps.GPS(problem, options)

Generalized Pattern Search (GPS).

  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensions (int),

    • 'upper_boundary' - upper boundary of search range (array_like),

    • 'lower_boundary' - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum number of function evaluations (int, default: np.inf),

    • 'max_runtime' - maximum allowed runtime (float, default: np.inf),

    • 'seed_rng' - seed for random number generation, which needs to be explicitly set (int);

    and with the following particular settings (keys):
    • 'sigma' - initial global step-size (float, default: 1.0),

    • 'x' - initial (starting) point (array_like),

      • if not given, a random sample is drawn from the uniform distribution bounded by problem['lower_boundary'] and problem['upper_boundary'],

    • 'gamma' - decreasing factor of step-size (float, default: 0.5).


Use the optimizer to minimize the well-known test function Rosenbrock:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.ds.gps import GPS
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022,
...            'x': 3*numpy.ones((2,)),
...            'sigma': 0.1,
...            'verbose_frequency': 500}
>>> gps = GPS(problem, options)  # initialize the optimizer class
>>> results = gps.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness
>>> print(f"GPS: {results['n_function_evaluations']}, {results['best_so_far_y']}")
GPS: 5000, 0.6182686369768672
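To illustrate the idea behind the optimizer, below is a minimal sketch of generalized pattern search in the spirit of Algorithm 7.6 of Kochenderfer and Wheeler (2019): probe a positive spanning set of directions, accept the first improving point (moving its direction to the front of the list, i.e. dynamic ordering), and shrink the step-size by gamma when no direction improves. This is not PyPop7's actual implementation; the function name, the minimal positive basis, and the sphere test function are illustrative assumptions.

```python
import numpy as np

def generalized_pattern_search(f, x, alpha, D, epsilon, gamma=0.5):
    """Sketch of GPS (after Kochenderfer & Wheeler, 2019, Alg. 7.6).

    f: objective to minimize; x: starting point; alpha: initial step-size;
    D: positive spanning set of directions; epsilon: step-size tolerance;
    gamma: step-size decrease factor.
    """
    x = np.asarray(x, dtype=float)
    y = f(x)
    while alpha > epsilon:
        improved = False
        for i, d in enumerate(D):
            x_new = x + alpha * d
            y_new = f(x_new)
            if y_new < y:  # opportunistic: accept first improvement
                x, y, improved = x_new, y_new, True
                D = [D[i]] + D[:i] + D[i + 1:]  # dynamic ordering
                break
        if not improved:
            alpha *= gamma  # no direction improved: shrink step-size
    return x, y

# minimal positive basis for a 2-d problem (n + 1 directions)
D = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([-1.0, -1.0])]
sphere = lambda x: float(np.sum(x**2))  # simple test function (assumed)
x_best, y_best = generalized_pattern_search(sphere, [3.0, 3.0], 0.1, D, 1e-8)
```

On this convex test function the sketch walks diagonally toward the origin and then halves the step-size until it falls below the tolerance; PyPop7's GPS adds bookkeeping such as evaluation budgets, boundary handling, and verbose logging on top of this core loop.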

gamma - decreasing factor of step-size.

sigma - final global step-size (changed during optimization).

x - initial (starting) point.
Kochenderfer, M.J. and Wheeler, T.A., 2019. Algorithms for optimization. MIT Press. (See Algorithm 7.6 (Page 106) for details.)

Regis, R.G., 2016. On the properties of positive spanning sets and positive bases. Optimization and Engineering, 17(1), pp.229-262.

Torczon, V., 1997. On the convergence of pattern search algorithms. SIAM Journal on Optimization, 7(1), pp.1-25.