Py-BOBYQA User Manual


Py-BOBYQA Documentation
Release 1.1
Lindon Roberts
24 December 2018
CONTENTS:
1 Installing Py-BOBYQA
  1.1 Requirements
  1.2 Installation using pip
  1.3 Manual installation
  1.4 Testing
  1.5 Uninstallation
2 Overview
  2.1 When to use Py-BOBYQA
  2.2 Details of the Py-BOBYQA Algorithm
  2.3 References
3 Using Py-BOBYQA
  3.1 Nonlinear Minimization
  3.2 How to use Py-BOBYQA
  3.3 Optional Arguments
  3.4 A Simple Example
  3.5 Adding Bounds and More Output
  3.6 Example: Noisy Objective Evaluation
  3.7 Example: Global Optimization
  3.8 References
4 Advanced Usage
  4.1 General Algorithm Parameters
  4.2 Logging and Output
  4.3 Initialization of Points
  4.4 Trust Region Management
  4.5 Termination on Small Objective Value
  4.6 Termination on Slow Progress
  4.7 Stochastic Noise Information
  4.8 Interpolation Management
  4.9 Multiple Restarts
  4.10 References
5 Diagnostic Information
  5.1 Current Iterate
  5.2 Trust Region
  5.3 Model Interpolation
  5.4 Iteration Count
  5.5 Algorithm Progress
6 Version History
  6.1 Version 1.0 (6 Feb 2018)
  6.2 Version 1.0.1 (20 Feb 2018)
  6.3 Version 1.0.2 (20 Jun 2018)
  6.4 Version 1.1 (24 Dec 2018)
7 Acknowledgements
Bibliography
Py-BOBYQA Documentation, Release 1.1
Release: 1.1
Date: 24 December 2018
Author: Lindon Roberts
Py-BOBYQA is a flexible package for finding local solutions to nonlinear, nonconvex minimization problems (with optional bound constraints), without requiring any derivatives of the objective. Py-BOBYQA is a Python implementation of the BOBYQA solver by Powell (documentation here). It is particularly useful when evaluations of the objective function are expensive and/or noisy.
That is, Py-BOBYQA solves

    min_{x ∈ R^n}  f(x)
    s.t.  a ≤ x ≤ b
Full details of the Py-BOBYQA algorithm are given in our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts,
Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report,
University of Oxford, (2018).
If you are interested in solving least-squares minimization problems, you may wish to try DFO-LS, which has the
same features as Py-BOBYQA (plus some more), and exploits the least-squares problem structure, so performs better
on such problems.
New feature: global optimization heuristic (July 2018)! Py-BOBYQA now has a heuristic for global optimization
(see Using Py-BOBYQA for details). As this is a heuristic, there are no guarantees it will find a global minimum, but
it is more likely to escape local minima if there are better values nearby.
Py-BOBYQA is released under the GNU General Public License. Please contact NAG for alternative licensing.
CHAPTER
ONE
INSTALLING PY-BOBYQA
1.1 Requirements
Py-BOBYQA requires the following software to be installed:
Python 2.7 or Python 3
Additionally, the following Python packages should be installed (these will be installed automatically if using pip, see Installation using pip):
NumPy 1.11 or higher
SciPy 0.18 or higher
Pandas 0.17 or higher
1.2 Installation using pip
For easy installation, use pip as root:
$ [sudo] pip install Py-BOBYQA
If you do not have root privileges or you want to install Py-BOBYQA for your private use, you can use:
$ pip install --user Py-BOBYQA
which will install Py-BOBYQA in your home directory.
Note that if an older install of Py-BOBYQA is present on your system you can use:
$ [sudo] pip install --upgrade Py-BOBYQA
to upgrade Py-BOBYQA to the latest version.
1.3 Manual installation
The source code for Py-BOBYQA is available on Github:
$ git clone https://github.com/numericalalgorithmsgroup/pybobyqa
$ cd pybobyqa
Py-BOBYQA is written in pure Python and requires no compilation. It can be installed using:
$ [sudo] pip install .
If you do not have root privileges or you want to install Py-BOBYQA for your private use, you can use:
$ pip install --user .
instead.
To upgrade Py-BOBYQA to the latest version, navigate to the top-level directory (i.e. the one containing setup.py)
and rerun the installation using pip, as above:
$ git pull
$ [sudo] pip install .  # with admin privileges
1.4 Testing
If you installed Py-BOBYQA manually, you can test your installation by running:
$ python setup.py test
1.5 Uninstallation
If Py-BOBYQA was installed using pip you can uninstall as follows:
$ [sudo] pip uninstall Py-BOBYQA
If Py-BOBYQA was installed manually you have to remove the installed files by hand (located in your Python site-packages directory).
CHAPTER
TWO
OVERVIEW
2.1 When to use Py-BOBYQA
Py-BOBYQA is designed to solve the nonlinear minimization problem (with optional bound constraints)

    min_{x ∈ R^n}  f(x)
    s.t.  a ≤ x ≤ b

We call f(x) the objective function.
Py-BOBYQA is a derivative-free optimization algorithm, which means it does not require the user to provide the
derivatives of 𝑓(𝑥), nor does it attempt to estimate them internally (by using finite differencing, for instance).
There are two main situations when using a derivative-free algorithm (such as Py-BOBYQA) is preferable to a derivative-based algorithm (which is the vast majority of optimization solvers).
If the objective is noisy, then calculating or even estimating its derivatives may be impossible (or at least very inaccurate). By noisy, we mean that if we evaluate f(x) multiple times at the same value of x, we get different results. This may happen when a Monte Carlo simulation is used, for instance, or when f(x) involves performing a physical experiment.
If the objective is expensive to evaluate, then estimating derivatives (which requires n evaluations of f(x) for every point of interest x) may be prohibitively expensive. Derivative-free methods are designed to solve the problem with as few evaluations of the objective as possible.
However, if you have (or a solver can estimate) derivatives of f(x), then it is probably a good idea to use one of the many derivative-based solvers (such as those in the SciPy library).
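To make the cost concrete: a one-sided finite-difference gradient estimate needs n extra evaluations of f for each point of interest. The sketch below (plain NumPy, purely illustrative; Py-BOBYQA itself never does this, since it avoids derivative estimation altogether) counts the evaluations:

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Estimate the gradient of f at x by forward differences.

    Uses n + 1 evaluations of f for an n-dimensional x, which is why
    derivative estimation is expensive for costly objectives.
    """
    n = x.size
    f0 = f(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

# Example: f(x) = x1^2 + 3*x2^2, whose gradient at (1, 2) is (2, 12)
calls = 0
def f(x):
    global calls
    calls += 1
    return x[0] ** 2 + 3.0 * x[1] ** 2

g = fd_gradient(f, np.array([1.0, 2.0]))
print(calls)  # 3 evaluations for n = 2
```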
2.2 Details of the Py-BOBYQA Algorithm
Py-BOBYQA is a type of trust-region method, a common category of optimization algorithms for nonconvex problems. Given a current estimate of the solution x_k, we compute a model which approximates the objective, m_k(s) ≈ f(x_k + s) (for small steps s), and maintain a value Δ_k > 0 (called the trust region radius) which measures the size of s for which the approximation is good.
At each step, we compute a trial step s_k designed to make our approximation m_k(s) small (this task is called the trust region subproblem). We evaluate the objective at this new point, and if this provided a good decrease in the objective, we take the step (x_{k+1} = x_k + s_k), otherwise we stay put (x_{k+1} = x_k). Based on this information, we choose a new value Δ_{k+1}, and repeat the process.
In Py-BOBYQA, we construct our approximation 𝑚𝑘(𝑠)by interpolating a linear or quadratic approximation for 𝑓(𝑥)
at several points close to 𝑥𝑘. To make sure our interpolated model is accurate, we need to regularly check that the
points are well-spaced, and move them if they aren’t (i.e. improve the geometry of our interpolation points).
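The loop described above can be sketched as follows. This is a deliberately simplified illustration (a linear model built from forward differences, with a steepest-descent trial step), not Py-BOBYQA's actual algorithm, which uses interpolation models and a more careful radius update:

```python
import numpy as np

def fd_grad(f, x, h=1e-6):
    """Forward-difference gradient (a crude stand-in for a model gradient)."""
    f0, g = f(x), np.zeros(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

def trust_region_minimize(f, x0, delta=1.0, eta=0.1, max_iters=200):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iters):
        g = fd_grad(f, x)              # local model m_k(s) ~ f(x) + g.s
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8 or delta < 1e-12:
            break
        s = -(delta / gnorm) * g       # trial step: minimize model within radius
        f_new = f(x + s)
        pred = delta * gnorm           # decrease predicted by the model
        rho = (fx - f_new) / pred      # actual vs predicted decrease
        if rho >= eta:                 # good step: accept, maybe grow radius
            x, fx = x + s, f_new
            if rho >= 0.7:
                delta *= 2.0
        else:                          # poor step: stay put, shrink radius
            delta *= 0.5
    return x, fx

# Minimize a simple quadratic with minimizer (1, 2)
x, fx = trust_region_minimize(lambda z: (z[0] - 1.0) ** 2 + (z[1] - 2.0) ** 2,
                              np.array([0.0, 0.0]))
```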
Py-BOBYQA is a Python implementation of the BOBYQA solver by Powell [Powell2009]. More details of the Py-BOBYQA algorithm are given in our paper [CFMR2018].
2.3 References
CHAPTER
THREE
USING PY-BOBYQA
This section describes the main interface to Py-BOBYQA and how to use it.
3.1 Nonlinear Minimization
Py-BOBYQA is designed to solve the local optimization problem

    min_{x ∈ R^n}  f(x)
    s.t.  a ≤ x ≤ b

where the bound constraints a ≤ x ≤ b are optional. The objective function f(x) is usually nonlinear and nonquadratic. If you know your objective is linear or quadratic, you should consider a solver designed for such functions (see here for details).
Py-BOBYQA iteratively constructs an interpolation-based model for the objective, and determines a step using a
trust-region framework. For an in-depth technical description of the algorithm see the paper [CFMR2018].
3.2 How to use Py-BOBYQA
The main interface to Py-BOBYQA is via the function solve

soln = pybobyqa.solve(objfun, x0)

The input objfun is a Python function which takes an input x ∈ R^n and returns the objective value f(x) ∈ R. The input to objfun must be a one-dimensional NumPy array (i.e. with x.shape == (n,)) and the output must be a single Float.
The input x0 is the starting point for the solver, and (where possible) should be set to the best available estimate of the true solution x_min ∈ R^n. It should be specified as a one-dimensional NumPy array (i.e. with x0.shape == (n,)). As Py-BOBYQA is a local solver, providing different values for x0 may cause it to return different solutions, with possibly different objective values.
The output of pybobyqa.solve is an object containing:
soln.x - an estimate of the solution, x_min ∈ R^n, a one-dimensional NumPy array.
soln.f - the objective value at the calculated solution, f(x_min), a Float.
soln.gradient - an estimate of the gradient vector of first derivatives of the objective, g_i ≈ ∂f(x_min)/∂x_i, a NumPy array of length n.
soln.hessian - an estimate of the Hessian matrix of second derivatives of the objective, H_ij ≈ ∂²f(x_min)/∂x_i∂x_j, a NumPy array of size n × n.
soln.nf - the number of evaluations of objfun that the algorithm needed, an Integer.
soln.nx - the number of points 𝑥at which objfun was evaluated, an Integer. This may be different to
soln.nf if sample averaging is used.
soln.nruns - the number of runs performed by Py-BOBYQA (more than 1 if using multiple restarts), an
Integer.
soln.flag - an exit flag, which can take one of several values (listed below), an Integer.
soln.msg - a description of why the algorithm finished, a String.
soln.diagnostic_info - a table of diagnostic information showing the progress of the solver, a Pandas
DataFrame.
The possible values of soln.flag are defined by the following variables:
soln.EXIT_SUCCESS - Py-BOBYQA terminated successfully (the objective value or trust region radius are
sufficiently small).
soln.EXIT_MAXFUN_WARNING - maximum allowed objective evaluations reached. This is the most likely
return value when using multiple restarts.
soln.EXIT_SLOW_WARNING - maximum number of slow iterations reached.
soln.EXIT_FALSE_SUCCESS_WARNING - Py-BOBYQA reached the maximum number of restarts which
decreased the objective, but to a worse value than was found in a previous run.
soln.EXIT_INPUT_ERROR - error in the inputs.
soln.EXIT_TR_INCREASE_ERROR - error occurred when solving the trust region subproblem.
soln.EXIT_LINALG_ERROR - linear algebra error, e.g. the interpolation points produced a singular linear
system.
These variables are defined in the soln object, so can be accessed with, for example
if soln.flag == soln.EXIT_SUCCESS:
    print("Success!")
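When analytic derivatives of the objective happen to be known, they give a useful sanity check on soln.gradient and soln.hessian. For the Rosenbrock function used in the examples below, the exact derivatives are (plain NumPy; the solver itself never uses these):

```python
import numpy as np

def rosenbrock_grad(x):
    """Analytic gradient of f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def rosenbrock_hess(x):
    """Analytic Hessian of the same function."""
    return np.array([
        [1200.0 * x[0] ** 2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

# At the minimizer (1, 1) the gradient vanishes and the Hessian is
# [[802, -400], [-400, 200]], close to the estimates Py-BOBYQA reports.
xmin = np.array([1.0, 1.0])
```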
3.3 Optional Arguments
The solve function has several optional arguments which the user may provide:
pybobyqa.solve(objfun, x0, args=(), bounds=None, npt=None,
rhobeg=None, rhoend=1e-8, maxfun=None, nsamples=None,
user_params=None, objfun_has_noise=False,
seek_global_minimum=False,
scaling_within_bounds=False)
These arguments are:
args - a tuple of extra arguments passed to the objective function.
bounds - a tuple (lower, upper) with the vectors a and b of lower and upper bounds on x (default is a_i = -10^20 and b_i = 10^20). To set bounds for either lower or upper, but not both, pass a tuple (lower, None) or (None, upper).
npt - the number of interpolation points to use (default is 2n + 1 for a problem with len(x0) == n if objfun_has_noise=False, otherwise it is set to (n + 1)(n + 2)/2). Py-BOBYQA requires n + 1 ≤ npt ≤ (n + 1)(n + 2)/2. Larger values are particularly useful for noisy problems.
rhobeg - the initial value of the trust region radius (default is 0.1 if scaling_within_bounds=True, otherwise 0.1 * max(‖x0‖_∞, 1)).
rhoend - minimum allowed value of trust region radius, which determines when a successful termination occurs (default is 10^-8).
maxfun - the maximum number of objective evaluations the algorithm may request (default is min(100(n + 1), 1000)).
nsamples - a Python function nsamples(delta, rho, iter, nrestarts) which returns the number of times to evaluate objfun at a given point. This is only applicable for objectives with stochastic noise, when averaging multiple evaluations at the same point produces a more accurate value. The input parameters are the trust region radius (delta), the lower bound on the trust region radius (rho), how many iterations the algorithm has been running for (iter), and how many restarts have been performed (nrestarts). Default is no averaging (i.e. nsamples(delta, rho, iter, nrestarts) = 1).
user_params - a Python dictionary {'param1': val1, 'param2': val2, ...} of optional parameters. A full list of available options is given in the next section Advanced Usage.
objfun_has_noise - a flag to indicate whether or not objfun has stochastic noise; i.e. will calling objfun(x) multiple times at the same value of x give different results? This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in user_params.
seek_global_minimum - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters, all of which can be overridden by the values provided in user_params. If True, both upper and lower bounds must be set. Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum. However, by using this flag, it is more likely to escape local minima if there are better values nearby. The method used is a multiple restart mechanism, where we repeatedly re-initialize Py-BOBYQA from the best point found so far, but where we use a larger trust region radius each time (note: this is different to the more common multi-start approach to global optimization).
scaling_within_bounds - a flag to indicate whether the algorithm should internally shift and scale the entries of x so that the bounds become 0 ≤ x ≤ 1. This is useful if you are setting bounds and the bounds have different orders of magnitude. If scaling_within_bounds=True, the values of rhobeg and rhoend apply to the shifted variables.
In general when using optimization software, it is good practice to scale your variables so that moving each by a
given amount has approximately the same impact on the objective function. The scaling_within_bounds flag
is designed to provide an easy way to achieve this, if you have set the bounds lower and upper.
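Putting several of these together: the snippet below builds bounds and a sampling rule, and shows the corresponding call in a comment (the bound values and the averaging rule are invented for illustration; the final call assumes objfun and x0 have been defined as in the examples below):

```python
import numpy as np

# Illustrative bounds: -5 <= x_i <= 5
lower = np.array([-5.0, -5.0])
upper = np.array([5.0, 5.0])

# Illustrative averaging rule: a single sample while the trust region
# radius is large, five samples once it shrinks below 0.01
def nsamples(delta, rho, iter, nrestarts):
    return 1 if delta > 1e-2 else 5

# With objfun and x0 defined, the call would then be:
# soln = pybobyqa.solve(objfun, x0, bounds=(lower, upper), npt=7,
#                       maxfun=500, nsamples=nsamples,
#                       objfun_has_noise=True)
```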
3.4 A Simple Example
Suppose we wish to minimize the Rosenbrock test function:
    min_{(x_1, x_2) ∈ R^2}  100(x_2 - x_1^2)^2 + (1 - x_1)^2

This function has exactly one local minimum f(x_min) = 0 at x_min = (1, 1). A commonly-used starting point for testing purposes is x0 = (-1.2, 1). The following script shows how to solve this problem using Py-BOBYQA:
# Py-BOBYQA example: minimize the Rosenbrock function
from __future__ import print_function
import numpy as np
import pybobyqa

# Define the objective function
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Define the starting point
x0 = np.array([-1.2, 1.0])

# Set random seed (for reproducibility)
np.random.seed(0)

# Call Py-BOBYQA
soln = pybobyqa.solve(rosenbrock, x0)

# Display output
print(soln)
Note that Py-BOBYQA is a randomized algorithm: in its first phase, it builds an internal approximation to the objective
function by sampling it along random directions. In the code above, we set NumPy’s random seed for reproducibility
over multiple runs, but this is not required. The output of this script, showing that Py-BOBYQA finds the correct
solution, is
****** Py-BOBYQA Results ******
Solution xmin = [ 1. 1.]
Objective value f(xmin) = 2.964036794e-19
Needed 213 objective evaluations (at 213 points)
Approximate gradient = [ -2.57280154e-08 1.26855793e-08]
Approximate Hessian = [[ 802.90904563 -400.46022134]
[-400.46022134 200.23335154]]
Exit flag = 0
Success: rho has reached rhoend
******************************
This and all following problems can be found in the examples directory on the Py-BOBYQA Github page.
3.5 Adding Bounds and More Output
We can extend the above script to add constraints. To do this, we can add the lines
# Define bound constraints (lower <= x <= upper)
lower = np.array([-10.0, -10.0])
upper = np.array([0.9, 0.85])

# Call Py-BOBYQA (with bounds)
soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower, upper))
Py-BOBYQA correctly finds the solution to the constrained problem:
****** Py-BOBYQA Results ******
Solution xmin = [ 0.9 0.81]
Objective value f(xmin) = 0.01
Needed 134 objective evaluations (at 134 points)
Approximate gradient = [ -1.99999226e-01 -4.31078784e-07]
Approximate Hessian = [[ 649.6790222 -360.18361979]
[-360.18361979 200.00205196]]
Exit flag = 0
Success: rho has reached rhoend
******************************
However, we also get a warning that our starting point was outside of the bounds:
RuntimeWarning: x0 above upper bound, adjusting
Py-BOBYQA automatically fixes this, and moves x0 to a point within the bounds, in this case x0 = (-1.2, 0.85).
We can also get Py-BOBYQA to print out more detailed information about its progress using the logging module. To
do this, we need to add the following lines:
import logging
logging.basicConfig(level=logging.INFO, format='%(message)s')
# ... (call pybobyqa.solve)
And we can now see each evaluation of objfun:
Function eval 1 at point 1 has f = 39.65 at x = [-1.2   0.85]
Initialising (random directions)
Function eval 2 at point 2 has f = 14.337296 at x = [-1.08  0.85]
Function eval 3 at point 3 has f = 55.25 at x = [-1.2   0.73]
...
Function eval 133 at point 133 has f = 0.0100000000000165 at x = [ 0.9         0.81000001]
Function eval 134 at point 134 has f = 0.00999999999999997 at x = [ 0.9   0.81]
Did a total of 1 run(s)
If we wanted to save this output to a file, we could replace the above call to logging.basicConfig() with
logging.basicConfig(filename="myfile.log", level=logging.INFO,
format='%(message)s', filemode='w')
3.6 Example: Noisy Objective Evaluation
As described in Overview, derivative-free algorithms such as Py-BOBYQA are particularly useful when objfun has
noise. Let’s modify the previous example to include random noise in our objective evaluation, and compare it to a
derivative-based solver:
# Py-BOBYQA example: minimize the noisy Rosenbrock function
from __future__ import print_function
import numpy as np
import pybobyqa

# Define the objective function
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Modified objective function: add 1% Gaussian noise
def rosenbrock_noisy(x):
    return rosenbrock(x) * (1.0 + 1e-2 * np.random.normal(size=(1,))[0])

# Define the starting point
x0 = np.array([-1.2, 1.0])
# Set random seed (for reproducibility)
np.random.seed(0)

print("Demonstrate noise in function evaluation:")
for i in range(5):
    print("objfun(x0) = %s" % str(rosenbrock_noisy(x0)))
print("")

# Call Py-BOBYQA
soln = pybobyqa.solve(rosenbrock_noisy, x0)

# Display output
print(soln)

# Compare with a derivative-based solver
import scipy.optimize as opt
soln = opt.minimize(rosenbrock_noisy, x0)

print("")
print("** SciPy results **")
print("Solution xmin = %s" % str(soln.x))
print("Objective value f(xmin) = %.10g" % (soln.fun))
print("Needed %g objective evaluations" % soln.nfev)
print("Exit flag = %g" % soln.status)
print(soln.message)
The output of this is:
Demonstrate noise in function evaluation:
objfun(x0) = 24.6269006677
objfun(x0) = 24.2968380444
objfun(x0) = 24.4368545922
objfun(x0) = 24.7422961542
objfun(x0) = 24.6519490336
****** Py-BOBYQA Results ******
Solution xmin = [-1.02866429 1.07341548]
Objective value f(xmin) = 4.033118937
Needed 36 objective evaluations (at 36 points)
Approximate gradient = [-6921247.2999239 -3051622.27188687]
Approximate Hessian = [[ 1.98604897e+15 5.75929121e+14]
[ 5.75929121e+14 7.89533101e+14]]
Exit flag = 0
Success: rho has reached rhoend
******************************
** SciPy results **
Solution xmin = [-1.2 1. ]
Objective value f(xmin) = 23.80943672
Needed 104 objective evaluations
Exit flag = 2
Desired error not necessarily achieved due to precision loss.
Although Py-BOBYQA does not find the true solution (and it cannot produce a good estimate of the objective gradient
and Hessian), it still gives a reasonable decrease in the objective. However SciPy’s derivative-based solver, which has
no trouble solving the noise-free problem, is unable to make any progress.
As noted above, Py-BOBYQA has an input parameter objfun_has_noise to indicate if objfun has noise in it,
which it does in this case. Therefore we can call Py-BOBYQA with
soln = pybobyqa.solve(rosenbrock_noisy, x0, objfun_has_noise=True)
This time, we find the true solution, and better estimates of the gradient and Hessian:
****** Py-BOBYQA Results ******
Solution xmin = [ 1. 1.]
Objective value f(xmin) = 3.418770987e-18
Needed 300 objective evaluations (at 300 points)
Did a total of 4 runs
Approximate gradient = [ -1.36175005e-08 2.12249758e-09]
Approximate Hessian = [[ 805.93202374 -394.16671315]
[-394.16671315 192.99451721]]
Exit flag = 1
Warning (max evals): Objective has been called MAXFUN times
******************************
3.7 Example: Global Optimization
The following example shows how to use the global optimization features of Py-BOBYQA. Here, we try to minimize the Freudenstein and Roth function (problem 2 in J.J. Moré, B.S. Garbow and K.E. Hillstrom, Testing Unconstrained Optimization Software, ACM Trans. Math. Software 7:1 (1981), 17-41). This function has two local minima, one of which is global.
Note that Py-BOBYQA only implements a heuristic method, so there are no guarantees it will find a global minimum.
However, by using the seek_global_minimum flag, it is more likely to escape local minima if there are better
values nearby.
# Py-BOBYQA example: globally minimize the Freudenstein and Roth function
from __future__ import print_function
import numpy as np
import pybobyqa
# Define the objective function
# This function has a local minimum f = 48.98
# at x = np.array([11.41, -0.8968])
# and a global minimum f = 0 at x = np.array([5.0, 4.0])
def freudenstein_roth(x):
    r1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
    r2 = -29.0 + x[0] + ((1.0 + x[1]) * x[1] - 14.0) * x[1]
    return r1 ** 2 + r2 ** 2

# Define the starting point
x0 = np.array([5.0, -20.0])

# Define bounds (required for global optimization)
lower = np.array([-30.0, -30.0])
upper = np.array([30.0, 30.0])

# Set random seed (for reproducibility)
np.random.seed(0)
print("First run - search for local minimum only")
print("")
soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
                      bounds=(lower, upper))
print(soln)

print("")
print("")
print("Second run - search for global minimum")
print("")

soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
                      bounds=(lower, upper),
                      seek_global_minimum=True)
print(soln)
The output of this is:
First run - search for local minimum only
****** Py-BOBYQA Results ******
Solution xmin = [ 11.41277906 -0.89680525]
Objective value f(xmin) = 48.98425368
Needed 203 objective evaluations (at 203 points)
Approximate gradient = [ -1.61348180e-06 -3.61662651e-07]
Approximate Hessian = [[ 132.10265455 -45.5426821 ]
[ -45.5426821 976.15808779]]
Exit flag = 0
Success: rho has reached rhoend
******************************
Second run - search for global minimum
****** Py-BOBYQA Results ******
Solution xmin = [ 5. 4.]
Objective value f(xmin) = 9.734692105e-19
Needed 500 objective evaluations (at 500 points)
Did a total of 4 runs
Approximate gradient = [ 4.28964221e-08 4.58344260e-07]
Approximate Hessian = [[ 4.06992486 61.15006935]
[ 61.15006935 3728.06826545]]
Exit flag = 1
Warning (max evals): Objective has been called MAXFUN times
******************************
As we can see, the seek_global_minimum flag helped Py-BOBYQA escape the local minimum from the first
run, and find the global minimum.
3.8 References
CHAPTER
FOUR
ADVANCED USAGE
This section describes different optional user parameters available in Py-BOBYQA.
In the last section (Using Py-BOBYQA), we introduced pybobyqa.solve(), which has the optional input
user_params. This is a Python dictionary of user parameters. We will now go through the settings which can
be changed in this way. More details are available in the paper [CFMR2018].
The default values, used if no override is given, in some cases vary depending on whether objfun has stochastic noise; that is, whether evaluating objfun(x) several times at the same x gives the same result or not. Whether or not this is the case is determined by the objfun_has_noise input to pybobyqa.solve() (and not by inspecting objfun, for instance). Similarly, the default values depend on the input flag seek_global_minimum, i.e. if a global minimum is desired.
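For example, to save diagnostic information and relax the slow-progress termination test, one could build a dictionary such as the following (parameter names are from the lists below; the chosen values are purely illustrative) and pass it via pybobyqa.solve(objfun, x0, user_params=user_params):

```python
user_params = {
    'logging.save_diagnostic_info': True,  # record per-iteration data
    'logging.save_xk': True,               # include x_k in that data
    'slow.thresh_for_slow': 1e-6,          # illustrative threshold
    'slow.max_slow_iters': 100,            # illustrative iteration cap
}
```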
4.1 General Algorithm Parameters
general.rounding_error_constant - Internally, all interpolation points are stored with respect to a base point x_b; that is, we store {y_t - x_b}, which reduces the risk of roundoff errors. We shift x_b to x_k when ‖s_k‖ ≤ const * ‖x_k - x_b‖, where 'const' is this parameter. Default is 0.1.
general.safety_step_thresh - Threshold for when to call the safety step, ‖s_k‖ ≤ γ_S * ρ_k. Default is γ_S = 0.5.
general.check_objfun_for_overflow - Whether to cap the value of f(x) when it is large enough that an OverflowError would otherwise be encountered when evaluating f(x). Default is True.
4.2 Logging and Output
logging.n_to_print_whole_x_vector - If printing all function evaluations to screen/log file, the maximum len(x) for which the full vector x should be printed also. Default is 6.
logging.save_diagnostic_info - Flag to save diagnostic information at each iteration. Default is False.
logging.save_poisedness - If saving diagnostic information, whether to include the Λ-poisedness of Y_k in the diagnostic information. This is the most computationally expensive piece of diagnostic information. Default is True.
logging.save_xk - If saving diagnostic information, whether to include the full vector x_k. Default is False.
4.3 Initialization of Points
init.random_initial_directions - Build the initial interpolation set using random directions (as
opposed to coordinate directions). Default is True.
init.random_directions_make_orthogonal - If building the initial interpolation set with random directions, whether or not these should be orthogonalized. Default is True.
init.run_in_parallel - If using random directions, whether or not to ask for all objfun to be evaluated
at all points without any intermediate processing. Default is False.
4.4 Trust Region Management
tr_radius.eta1 - Threshold for unsuccessful trust region iteration, η_1. Default is 0.1.
tr_radius.eta2 - Threshold for very successful trust region iteration, η_2. Default is 0.7.
tr_radius.gamma_dec - Ratio to decrease Δ_k by in unsuccessful iterations, γ_dec. Default is 0.5 for smooth problems or 0.98 for noisy problems (i.e. objfun_has_noise = True).
tr_radius.gamma_inc - Ratio to increase Δ_k by in very successful iterations, γ_inc. Default is 2.
tr_radius.gamma_inc_overline - Ratio of ‖s_k‖ to increase Δ_k by in very successful iterations, γ̄_inc. Default is 4.
tr_radius.alpha1 - Ratio to decrease ρ_k by when it is reduced, α_1. Default is 0.1 for smooth problems or 0.9 for noisy problems (i.e. objfun_has_noise = True).
tr_radius.alpha2 - Ratio of ρ_k to decrease Δ_k by when ρ_k is reduced, α_2. Default is 0.5 for smooth problems or 0.95 for noisy problems (i.e. objfun_has_noise = True).
4.5 Termination on Small Objective Value
model.abs_tol - Tolerance on f(x_k); quit if f(x_k) is below this value. Default is 10^-20.
4.6 Termination on Slow Progress
slow.history_for_slow - History used to determine whether the current iteration is ‘slow’. Default is 5.
slow.thresh_for_slow - Threshold for objective decrease used to determine whether the current iteration is 'slow'. Default is 10^-8.
slow.max_slow_iters - Number of consecutive slow successful iterations before termination (or restart).
Default is 20*len(x0).
4.7 Stochastic Noise Information
noise.quit_on_noise_level - Flag to quit (or restart) if all f(y_t) are within the noise level of f(x_k). Default is False for smooth problems or True for noisy problems.
noise.scale_factor_for_quit - Factor of noise level to use in termination criterion. Default is 1.
noise.multiplicative_noise_level - Multiplicative noise level in f. Can only specify one of multiplicative or additive noise levels. Default is None.
noise.additive_noise_level - Additive noise level in f. Can only specify one of multiplicative or additive noise levels. Default is None.
4.8 Interpolation Management
interpolation.precondition - whether or not to scale the interpolation linear system to improve conditioning. Default is True.
interpolation.minimum_change_hessian - whether to solve the underdetermined quadratic interpolation problem by minimizing the Frobenius norm of the Hessian, or of the change in Hessian. Default is True.
4.9 Multiple Restarts
restarts.use_restarts - Whether to do restarts when ρ_k reaches ρ_end, or (optionally) when all points are within noise level of f(x_k). Default is False for smooth problems or True for noisy problems or when seeking a global minimum.
restarts.max_unsuccessful_restarts - Maximum number of consecutive unsuccessful restarts allowed (i.e. restarts which did not reduce the objective further). Default is 10.
restarts.max_unsuccessful_restarts_total - Maximum number of total unsuccessful restarts allowed. Default is 20 when seeking a global minimum, otherwise it is maxfun (i.e. not restricted).
restarts.rhobeg_scale_after_unsuccessful_restart - Factor to increase ρ_beg by after unsuccessful restarts. Default is 1.1 when seeking a global minimum, otherwise it is 1.
restarts.rhoend_scale - Factor to reduce ρ_end by with each restart. Default is 1.
restarts.use_soft_restarts - Whether to use soft or hard restarts. Default is True.
restarts.soft.num_geom_steps - For soft restarts, the number of points to move. Default is 3.
restarts.soft.move_xk - For soft restarts, whether to preserve x_k, or move it to the best new point evaluated. Default is True.
restarts.hard.use_old_fk - If using hard restarts, whether or not to recycle the objective value at the
best iterate found when performing a restart. This saves one objective evaluation. Default is True.
restarts.soft.max_fake_successful_steps - The maximum number of successful steps in a given run where the new (smaller) objective value is larger than the best value found in a previous run. Default is maxfun, the input to pybobyqa.solve().
restarts.auto_detect - Whether or not to automatically determine when to restart. This is an extra
condition, and restarts can still be triggered by small trust region radius, etc. Default is True.
restarts.auto_detect.history - How many iterations of data on model changes and trust region radii to store. There are two criteria used: trust region radius decreases (no increases over the history, more decreases than no changes), and change in model gradient and Hessian (consistently increasing trend as measured by slope and correlation coefficient of line of best fit). Default is 30.
restarts.auto_detect.min_chg_model_slope - Minimum rate of increase of log(||g_k - g_{k-1}||) and log(||H_k - H_{k-1}||_F) over the past iterations to cause a restart. Default is 0.015.
restarts.auto_detect.min_correl - Minimum correlation of the data sets (k, log(||g_k - g_{k-1}||)) and (k, log(||H_k - H_{k-1}||_F)) required to cause a restart. Default is 0.1.
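To make the auto-detection criterion concrete, here is an illustrative computation (not the library's own code) of the slope and correlation coefficient of the line of best fit through data of the form (k, log ||g_k - g_{k-1}||), the quantities compared against min_chg_model_slope and min_correl. The function name and inputs are assumptions for illustration.

```python
import math

def fit_slope_and_correl(log_changes):
    """Least-squares slope and Pearson correlation of (k, log_changes[k]);
    an illustrative stand-in for the quantities checked against
    restarts.auto_detect.min_chg_model_slope and .min_correl."""
    n = len(log_changes)
    ks = range(n)
    mean_k = sum(ks) / n
    mean_y = sum(log_changes) / n
    cov = sum((k - mean_k) * (y - mean_y) for k, y in zip(ks, log_changes))
    var_k = sum((k - mean_k) ** 2 for k in ks)
    var_y = sum((y - mean_y) ** 2 for y in log_changes)
    slope = cov / var_k
    correl = cov / math.sqrt(var_k * var_y)
    return slope, correl

# A consistently increasing trend in log ||g_k - g_{k-1}|| suggests the model is
# degrading; a restart is indicated when slope >= 0.015 and correl >= 0.1:
slope, correl = fit_slope_and_correl([math.log(x) for x in [1e-3, 2e-3, 5e-3, 1e-2, 3e-2]])
print(slope >= 0.015 and correl >= 0.1)  # True
```

Requiring both a minimum slope and a minimum correlation guards against triggering restarts from a few noisy outliers in the model-change history.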
4.10 References
CHAPTER FIVE
DIAGNOSTIC INFORMATION
In Using Py-BOBYQA, we saw that the output of Py-BOBYQA returns a container which includes diagnostic information about the progress of the algorithm (soln.diagnostic_info). This object is a Pandas DataFrame, with one row per iteration of the algorithm. In this section, we explain the meaning of each type of output (the columns of the DataFrame).
To save this information to a CSV file, use:
# Previously: define objfun and x0
# Turn on diagnostic information
user_params = {'logging.save_diagnostic_info': True}
# Call Py-BOBYQA
soln = pybobyqa.solve(objfun, x0, user_params=user_params)
# Save diagnostic info to CSV
soln.diagnostic_info.to_csv("myfile.csv")
Depending on exactly how Py-BOBYQA terminates, the last row of results may not be fully populated.
5.1 Current Iterate
xk - Best point found so far (current iterate). This is only saved if user_params['logging.save_xk'] = True.
fk - The value of f at the current iterate.
5.2 Trust Region
rho - The lower bound on the trust region radius, rho_k.
delta - The trust region radius, delta_k.
norm_sk - The norm of the trust region step, ||s_k||.
5.3 Model Interpolation
npt - The number of interpolation points.
interpolation_error - The sum of squares of the interpolation errors from the interpolated model.
interpolation_condition_number - The condition number of the matrix in the interpolation linear
system.
interpolation_change_g_norm - The norm of the change in model gradient at this iteration, ||g_k - g_{k-1}||.
interpolation_change_H_norm - The Frobenius norm of the change in model Hessian at this iteration, ||H_k - H_{k-1}||_F.
poisedness - The smallest value of Λ for which the current interpolation set Y_k is Λ-poised in the current trust region. This is the most expensive piece of information to compute, and is only computed if user_params['logging.save_poisedness'] = True.
max_distance_xk - The maximum distance from any interpolation point to the current iterate.
norm_gk - The norm of the model gradient, ||g_k||.
5.4 Iteration Count
nruns - The number of times the algorithm has been restarted.
nf - The number of objective evaluations so far (see soln.nf).
nx - The number of points at which the objective has been evaluated so far (see soln.nx).
nsamples - The total number of objective evaluations used for all current interpolation points.
iter_this_run - The number of iterations since the last restart.
iters_total - The total number of iterations so far.
5.5 Algorithm Progress
iter_type - A text description of what type of iteration we had (e.g. Successful, Safety, etc.)
ratio - The ratio of actual to predicted objective reduction in the trust region step.
slow_iter - Equal to 1 if the current iteration is successful but slow, 0 if it is successful but not slow, and -1 if it was not successful.
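Since soln.diagnostic_info is an ordinary Pandas DataFrame, it can be sliced and summarized with standard Pandas operations. As a sketch, using a toy stand-in frame (the values below are invented for illustration; in practice the frame comes from soln.diagnostic_info), one might count iteration types and flag slow iterations:

```python
import pandas as pd

# Toy stand-in for soln.diagnostic_info, using the column names from this
# chapter (values are invented for illustration).
df = pd.DataFrame({
    "iter_type": ["Successful", "Successful", "Safety", "Successful"],
    "ratio": [0.9, 0.2, float("nan"), 0.7],
    "slow_iter": [0, 1, -1, 0],
})

# Count each iteration type:
counts = df["iter_type"].value_counts()
print(counts["Successful"])  # 3

# Flag successful-but-slow iterations (slow_iter == 1):
slow = df[df["slow_iter"] == 1]
print(len(slow))  # 1
```

A long run of rows with slow_iter equal to 1 is exactly what the slow.max_slow_iters termination test in Advanced Usage looks for.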
CHAPTER SIX
VERSION HISTORY
This section lists the different versions of Py-BOBYQA and the updates between them.
6.1 Version 1.0 (6 Feb 2018)
Initial release of Py-BOBYQA
6.2 Version 1.0.1 (20 Feb 2018)
Minor bug fix to trust region subproblem solver (the output crvmin is calculated correctly) - this has minimal
impact on the performance of Py-BOBYQA.
6.3 Version 1.0.2 (20 Jun 2018)
Extra optional input args, which passes additional arguments through to objfun (pull request from logangrado).
Bug fixes: default parameters for reduced initialization cost regime, returning correct value from safety steps,
retrieving dependencies during installation.
6.4 Version 1.1 (24 Dec 2018)
Extra parameters to control the trust region radius over multiple restarts, designed for global optimization.
New input flag seek_global_minimum to set sensible default parameters for global optimization. New
example script to demonstrate this functionality.
Bug fix: default trust region radius when scaling variables within bounds.
Initially released as version 1.1a0 on 17 Jul 2018.
CHAPTER SEVEN
ACKNOWLEDGEMENTS
This software was developed under the supervision of Coralia Cartis, and was supported by the EPSRC Centre For Doctoral Training in Industrially Focused Mathematical Modelling (EP/L015803/1) in collaboration with the Numerical Algorithms Group.
BIBLIOGRAPHY
[CFMR2018] C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report, University of Oxford, (2018).
[Powell2009] M. J. D. Powell, The BOBYQA algorithm for bound constrained optimization without derivatives, technical report DAMTP 2009/NA06, University of Cambridge, (2009).