Documentation of PyPop7 for Black-Box Optimization (BBO)
PyPop7 is a Pure-PYthon library of POPulation-based OPtimization for single-objective, real-parameter, black-box problems. Its design goal is to provide a unified interface and a set of elegant implementations for Black-Box Optimizers (BBO), particularly population-based optimizers (including evolutionary algorithms, swarm-based random methods, and pattern search), in order to facilitate research repeatability, benchmarking of BBO, and especially real-world applications.
Specifically, to alleviate the notorious curse of dimensionality in BBO, the primary focus of PyPop7 is to cover State-Of-The-Art (SOTA) implementations for Large-Scale Optimization (LSO) as much as possible, though many medium- or small-scale versions and variants are also included (some mainly for theoretical purposes, some for educational purposes, some for benchmarking purposes, and some for applications on medium or low dimensions).

Note
This open-source Python library for continuous BBO is under active maintenance. In the future, we plan to add new BBO and SOTA versions of existing BBO families, in order to keep this library as up-to-date as possible. Any suggestions, extensions, improvements, usage reports, and tests (even criticisms) of this open-source Python library are highly welcome!
If this open-source pure-Python library PyPop7 is used in your paper or project, citing the following paper is highly appreciated but NOT mandatory: Duan, Q., Zhou, G., Shao, C., Wang, Z., Feng, M., Huang, Y., Tan, Y., Yang, Y., Zhao, Q. and Shi, Y., 2024. PyPop7: A pure-Python library for population-based black-box optimization. arXiv preprint arXiv:2212.05652. (This arXiv paper was submitted to JMLR and accepted on Fri, 11 Oct 2024, after three rounds of review spanning Tue, 28 Mar 2023, Wed, 01 Nov 2023, and Fri, 05 Jul 2024.)
Quick Start
In many (though not all) cases, the following three steps are enough to utilize the potential of PyPop7 for BBO:
First, install PyPop7 (e.g., via pip install pypop7); please refer to this online documentation for details about other installation methods.
Define/code your own objective (a.k.a. cost or fitness) function, to be minimized, for the optimization problem at hand:
>>> import numpy as np  # for numerical computation, which is also the computing engine used by PyPop7
>>> def rosenbrock(x):  # one notorious test function in the optimization community
...     return 100.0*np.sum(np.square(x[1:] - np.square(x[:-1]))) + np.sum(np.square(x[:-1] - 1.0))
>>> ndim_problem = 1000  # problem dimension
>>> problem = {'fitness_function': rosenbrock,  # fitness function to be minimized
...            'ndim_problem': ndim_problem,  # problem dimension
...            'lower_boundary': -5.0*np.ones((ndim_problem,)),  # lower search boundary
...            'upper_boundary': 5.0*np.ones((ndim_problem,))}  # upper search boundary
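As a quick sanity check of this definition, the global minimum of the above Rosenbrock function is 0.0, attained at the all-ones vector:

>>> rosenbrock(np.ones((ndim_problem,)))  # global minimum 0.0 at the all-ones vector
0.0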
See this online documentation for details about the problem definition. Note that any maximization problem can be easily transformed into a minimization problem by simply negating the objective function, as sketched below.
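For example, a minimal sketch of this negation trick (the function name maximize_me below is hypothetical, used purely for illustration):

>>> def maximize_me(x):  # hypothetical objective to be MAXIMIZED
...     return -np.sum(np.square(x))
>>> def fitness(x):  # negated wrapper to be MINIMIZED by any PyPop7 optimizer
...     return -maximize_me(x)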
Please refer to this online documentation for a large set of benchmarking functions from different application fields, which are provided by PyPop7; one of them is sketched below.
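For instance, a minimal sketch using one of these built-in functions (assuming the module path pypop7.benchmarks.base_functions, consistent with the Base Functions entry in the contents below):

>>> from pypop7.benchmarks.base_functions import sphere  # module path assumed
>>> sphere(np.zeros((ndim_problem,)))  # the sphere function attains its global minimum 0.0 at the origin
0.0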
Run one or more black-box optimizers (BBO) from PyPop7 on the above optimization problem:
>>> from pypop7.optimizers.es.lmmaes import LMMAES  # or choose any other black-box optimizer available in PyPop7
>>> options = {'fitness_threshold': 1e-10,  # terminate when the best-so-far fitness is lower than 1e-10
...            'max_runtime': 3600,  # terminate when the actual runtime exceeds 1 hour (i.e., 3600 seconds)
...            'seed_rng': 0,  # seed for random number generation (should be set explicitly for repeatability)
...            'x': 4.0*np.ones((ndim_problem,)),  # initial mean of the search/mutation/sampling distribution
...            'sigma': 3.0,  # initial global step-size of the search distribution (to be fine-tuned for optimality)
...            'verbose': 500}
>>> lmmaes = LMMAES(problem, options)  # initialize the black-box optimizer (a unified interface for all optimizers)
>>> results = lmmaes.optimize()  # run its (time-consuming) optimization/evolution/search process
>>> # print the best-so-far fitness and the number of function evaluations used
>>> print(results['best_so_far_y'], results['n_function_evaluations'])
9.948e-11 2973386
Please refer to this online documentation for details about the optimizer settings. Thanks to the unified interface, switching to another optimizer typically requires changing only the import line, as sketched below. The following API contents are given for all currently available BBO in this open-source Python library.
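A minimal sketch of such a switch, using the separable CMA-ES listed in the contents below (the module path is assumed to follow the same pypop7.optimizers.&lt;family&gt;.&lt;name&gt; pattern as above):

>>> from pypop7.optimizers.es.sepcmaes import SEPCMAES  # module path assumed from the naming pattern above
>>> sepcmaes = SEPCMAES(problem, options)  # reuse the same problem and options dicts defined above
>>> results = sepcmaes.optimize()
>>> print(results['best_so_far_y'], results['n_function_evaluations'])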
Note
A total of four extended versions of PyPop7 (as PP7) are ongoing or planned for further development, as follows:
For Constrained Optimization (PyCoPop7 as PCP7),
For Noisy Optimization (PyNoPop7 as PNP7),
Enhancement via Parallel and Distributed Optimization (PyPop77 as PP77),
Enhancement via Meta-evolution based Optimization (PyMePop7 as PMP7).
Contents of PyPop7:
- Installation
- User Guide
- Online Tutorials
- Lens Shape Optimization
- Lennard-Jones Cluster Optimization
- Global Trajectory Optimization from PyKep
- Benchmarking for Large-Scale Black-Box Optimization (LBO)
- Controller Design/Optimization
- Benchmarking BBO on the Well-Designed COCO Platform
- Benchmarking BBO on the Famous NeverGrad Platform
- More Usage Examples
- Evolution Strategies (ES)
- ES
- Limited Memory Covariance Matrix Adaptation (LMCMA)
- Mixture Model-based Evolution Strategy (MMES)
- Diagonal Decoding Covariance Matrix Adaptation (DDCMA)
- Limited Memory Matrix Adaptation Evolution Strategy (LMMAES)
- Rank-M Evolution Strategy (RMES)
- Rank-One Evolution Strategy (R1ES)
- Limited Memory Covariance Matrix Adaptation Evolution Strategy (LMCMAES)
- Fast Matrix Adaptation Evolution Strategy (FMAES)
- Matrix Adaptation Evolution Strategy (MAES)
- Cholesky-CMA-ES 2016 (CCMAES2016)
- (1+1)-Active-CMA-ES 2015 (OPOA2015)
- (1+1)-Active-CMA-ES 2010 (OPOA2010)
- Cholesky-CMA-ES 2009 (CCMAES2009)
- (1+1)-Cholesky-CMA-ES 2009 (OPOC2009)
- Separable Covariance Matrix Adaptation Evolution Strategy (SEPCMAES)
- (1+1)-Cholesky-CMA-ES 2006 (OPOC2006)
- Covariance Matrix Adaptation Evolution Strategy (CMAES)
- Self-Adaptation Matrix Adaptation Evolution Strategy (SAMAES)
- Self-Adaptation Evolution Strategy (SAES)
- Cumulative Step-size Adaptation Evolution Strategy (CSAES)
- Derandomized Self-Adaptation Evolution Strategy (DSAES)
- Schwefel’s Self-Adaptation Evolution Strategy (SSAES)
- Rechenberg’s (1+1)-Evolution Strategy (RES)
- Natural Evolution Strategies (NES)
- Estimation of Distribution Algorithms (EDA)
- Cross-Entropy Method (CEM)
- Differential Evolution (DE)
- Particle Swarm Optimizer (PSO)
- PSO
- Cooperative Coevolving Particle Swarm Optimizer (CCPSO2)
- Incremental Particle Swarm Optimizer (IPSO)
- Comprehensive Learning Particle Swarm Optimizer (CLPSO)
- Cooperative Particle Swarm Optimizer (CPSO)
- Standard Particle Swarm Optimizer with a local topology (SPSOL)
- Standard Particle Swarm Optimizer with a global topology (SPSO)
- Cooperative Coevolution (CC)
- Simulated Annealing (SA)
- Genetic Algorithms (GA)
- Evolutionary Programming (EP)
- Direct/Pattern Search (DS)
- Random Search (RS)
- Bayesian Optimization (BO)
- Black-Box Optimization (BBO)
- Benchmarking Functions for BBO
- Checking of Coding Correctness
- Base Functions
- Shifted/Transformed Forms
- Rotated Forms
- Rotated-Shifted Forms
- Benchmarking for Large-Scale BBO (LBO)
- Black-Box Classification from Data Science
- Benchmarking on Photonics Models from NeverGrad
- Benchmarking of Controllers on Gymnasium
- Lennard-Jones Cluster Optimization from PyGMO
- Test Classes and Data
- Util Functions for BBO
- Development Guide
- Applications
- Sponsor
- Design Philosophy
- Changelog
- Software Summary