Active Subspace Genetic Algorithm (ASGA)

class pypop7.optimizers.ga.asga.ASGA(problem, options)

Active Subspace Genetic Algorithm (ASGA).

Note

Before running this optimizer, please manually install the dependency athena-mathlab via `pip install athena-mathlab`. Currently this dependency is still under active development, which may occasionally break our implementation; we expect it to stabilize in the near future. In other words, for now this optimizer should be used for experimental rather than practical purposes.
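
As a quick sanity check, a guarded import can confirm that the dependency is available before constructing the optimizer. This is a minimal sketch, assuming the package is importable under the name `athena`:

>>> try:
...     import athena  # athena-mathlab's import name (assumed here)
...     print('athena-mathlab is installed')
... except ImportError:
...     print('please run `pip install athena-mathlab` first')
athena-mathlab is installed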

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • ’fitness_function’ - objective function to be minimized (func),

    • ’ndim_problem’ - number of dimensions (int),

    • ’upper_boundary’ - upper boundary of search range (array_like),

    • ’lower_boundary’ - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • ’max_function_evaluations’ - maximum number of function evaluations (int, default: np.inf),

    • ’max_runtime’ - maximal runtime allowed (float, default: np.inf),

    • ’seed_rng’ - seed for random number generation, which needs to be explicitly set (int);

    and with the following particular settings (keys), illustrated in the sketch after this list:
    • ’n_init_individuals’ - initial population size (int, default: 2000),

    • ’n_individuals’ - population size (int, default: 200),

    • ’n_subspace’ - dimensionality of the active subspace (int, default: 1),

    • ’crossover_prob’ - crossover probability (float, default: 0.5),

    • ’mutation_prob’ - mutation probability (float, default: 0.5),

    • ’b’ - number of back-mapped points (int, default: 2).
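
For illustration, here is a minimal sketch of an options dict that combines the common keys with every ASGA-particular key set to its listed default:

>>> options = {'max_function_evaluations': 5000,  # common settings
...            'seed_rng': 2022,
...            'n_init_individuals': 2000,  # ASGA-particular settings,
...            'n_individuals': 200,        #   shown at their defaults
...            'n_subspace': 1,
...            'crossover_prob': 0.5,
...            'mutation_prob': 0.5,
...            'b': 2}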

Examples

Use the optimizer to minimize the well-known Rosenbrock test function:

>>> import numpy
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.ga.asga import ASGA
>>> problem = {'fitness_function': rosenbrock,  # define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5*numpy.ones((2,)),
...            'upper_boundary': 5*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # set optimizer options
...            'seed_rng': 2022}
>>> asga = ASGA(problem, options)  # initialize the optimizer class
>>> results = asga.optimize()  # run the optimization process
>>> # return the number of function evaluations and best-so-far fitness:
>>> # different runs may result in different results,
>>> #   since the randomness of `athena` cannot be explicitly controlled
>>> print(f"ASGA: {results['n_function_evaluations']}, {results['best_so_far_y']}")
ASGA: 5000, 0.0012696111562026525
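
Besides the two fields printed above, the returned results dict also carries the best-so-far solution itself; the key name used below follows PyPop7's usual convention and is assumed here:

>>> x = results['best_so_far_x']  # best-so-far solution found (key name assumed)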

To check the correctness of this implementation, refer to the code-based repeatability report for more details.

b

number of back-mapped points.

Type:

int

crossover_prob

crossover probability.

Type:

float

mutation_prob

mutation probability.

Type:

float

n_individuals

population size.

Type:

int

n_init_individuals

initial population size.

Type:

int

n_subspace

dimensionality of the active subspace.

Type:

int
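
These attributes mirror the particular options above. As a minimal sketch (assuming PyPop7's usual behavior of copying option values onto the optimizer instance), they can be inspected after initialization:

>>> asga = ASGA(problem, options)  # problem/options as in the example above
>>> asga.n_individuals  # population size (default, since not set in options)
200
>>> asga.b  # number of back-mapped points (default)
2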

References

Demo, N., Tezzele, M. and Rozza, G., 2021. A supervised learning approach involving active subspaces for an efficient genetic algorithm in high-dimensional optimization problems. SIAM Journal on Scientific Computing, 43(3), pp.B831-B853. https://epubs.siam.org/doi/10.1137/20M1345219