Welcome to PyPop7’s Documentation!


PyPop7 is a Pure-PYthon library of POPulation-based OPtimization for single-objective, real-parameter, black-box problems. Its main goal is to provide a unified interface and elegant implementations for Black-Box Optimizers (BBO), particularly population-based optimizers, in order to facilitate research repeatability, benchmarking of BBO, and real-world applications.

More specifically, to alleviate the notorious curse of dimensionality in BBO, the primary focus of PyPop7 is on State-Of-The-Art (SOTA) implementations of these optimizers for Large-Scale Optimization (LSO), though many of their medium/small-scale versions and variants are also included (mainly for theoretical or benchmarking purposes).


Note

This open-source Python library for continuous black-box optimization has been under active development since 2021.

Quick Start

In practice, three simple steps are enough to utilize the optimization power of PyPop7:

  1. Use pip to automatically install pypop7:

    $ pip install pypop7
    
  2. Define your own objective (cost) function to be minimized for the optimization problem at hand; a quick sanity check of this function is sketched after these steps:

    >>> import numpy as np  # for numerical computation, which is also the computing engine of pypop7
    >>> def rosenbrock(x):  # notorious test (benchmarking) function in the optimization community
    ...     return 100*np.sum(np.power(x[1:] - np.power(x[:-1], 2), 2)) + np.sum(np.power(x[:-1] - 1, 2))
    >>> ndim_problem = 1000  # problem dimension
    >>> problem = {'fitness_function': rosenbrock,  # cost function to be minimized
    ...            'ndim_problem': ndim_problem,  # problem dimension
    ...            'lower_boundary': -5*np.ones((ndim_problem,)),  # lower search boundary
    ...            'upper_boundary': 5*np.ones((ndim_problem,))}  # upper search boundary
    
  3. Run one or more black-box optimizers from pypop7 on the given optimization problem; swapping in a different optimizer is sketched after these steps:

    >>> from pypop7.optimizers.es.lmmaes import LMMAES  # choose any optimizer you prefer in this library
    >>> options = {'fitness_threshold': 1e-10,  # terminate when the best-so-far fitness is lower than 1e-10
    ...            'max_runtime': 3600,  # terminate when the actual runtime exceeds 1 hour (i.e., 3600 seconds)
    ...            'seed_rng': 0,  # seed of random number generation (which must be set for repeatability)
    ...            'x': 4*np.ones((ndim_problem,)),  # initial mean of the search distribution
    ...            'sigma': 0.3,  # initial global step-size of the search distribution
    ...            'verbose': 500}  # frequency of printing verbose information during optimization
    >>> lmmaes = LMMAES(problem, options)  # initialize the optimizer (a unified interface for all optimizers)
    >>> results = lmmaes.optimize()  # run its (time-consuming) search process
    >>> # print the best-so-far fitness and the number of function evaluations returned by the optimizer
    >>> print(results['best_so_far_y'], results['n_function_evaluations'])
    9.8774e-11 3928055
    
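Before handing your own cost function to an optimizer, it can be worth evaluating it at a couple of known points. The following minimal sanity check (plain NumPy only, repeating the rosenbrock definition from step 2) confirms that its global minimum is 0 at the all-ones vector:

    >>> import numpy as np
    >>> def rosenbrock(x):
    ...     return 100*np.sum(np.power(x[1:] - np.power(x[:-1], 2), 2)) + np.sum(np.power(x[:-1] - 1, 2))
    >>> print(rosenbrock(np.ones(1000)))   # global minimum: fitness 0 at the all-ones vector
    0.0
    >>> print(rosenbrock(np.zeros(1000)))  # any other point yields a strictly positive fitness
    999.0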

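Because every optimizer in PyPop7 shares this unified interface (a problem dictionary plus an options dictionary), switching algorithms typically amounts to changing a single import. Below is a minimal sketch, assuming for illustration that the limited-memory CMA variant is exposed as pypop7.optimizers.es.lmcma.LMCMA (please check the API reference for the exact module path); the problem and options dictionaries defined above are reused verbatim:

    >>> from pypop7.optimizers.es.lmcma import LMCMA  # assumed module path; see the API reference
    >>> lmcma = LMCMA(problem, options)  # same unified interface: problem dict + options dict
    >>> results = lmcma.optimize()  # run its (time-consuming) search process
    >>> print(results['best_so_far_y'], results['n_function_evaluations'])
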
Contents: