Usage#
We provide below basic information on how to use COBYQA. For more details on the signature of the minimize function, please refer to the API documentation.
How to use COBYQA#
COBYQA provides a minimize function, which is the entry point to the solver.
It solves unconstrained, bound-constrained, linearly constrained, and nonlinearly constrained optimization problems.
We provide below simple examples on how to use COBYQA.
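As a quick preview before the examples, the following sketch shows the general call pattern of minimize on a purely bound-constrained problem. The quadratic objective and the bounds are hypothetical and chosen only for illustration; the exact signature of minimize is given in the API documentation.

from cobyqa import minimize
from scipy.optimize import Bounds

def fun(x):
    # Hypothetical quadratic objective, used only for illustration; its
    # unconstrained minimizer (2, 2) lies outside the bounds below.
    return (x[0] - 2.0) ** 2.0 + (x[1] - 2.0) ** 2.0

x0 = [0.5, 0.5]
bounds = Bounds([0.0, 0.0], [1.0, 1.0])  # 0 <= x1 <= 1 and 0 <= x2 <= 1
res = minimize(fun, x0, bounds=bounds)
print(res.x)  # should be close to [1. 1.]

Bound, linear, and nonlinear constraints are supplied through the bounds and constraints keyword arguments using the scipy.optimize classes Bounds, LinearConstraint, and NonlinearConstraint, as the examples below demonstrate.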
Examples#
Example of unconstrained optimization#
Let us first minimize the Rosenbrock function implemented in scipy.optimize, defined as

\[
f(x) = \sum_{i = 1}^{n - 1} \left[ 100 (x_{i + 1} - x_i^2)^2 + (1 - x_i)^2 \right]
\]

for \(x \in \mathbb{R}^n\). To solve the problem using COBYQA, run:
from cobyqa import minimize
from scipy.optimize import rosen
x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
res = minimize(rosen, x0)
print(res.x)
This should display the desired output [1. 1. 1. 1. 1.].
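The minimizer of the Rosenbrock function is the all-ones vector, at which the function value is zero. Continuing from the snippet above, a quick sanity check is the following (the tolerance is a loose, illustrative choice):

import numpy as np

print(np.allclose(res.x, 1.0, atol=1e-4))  # should print True
print(rosen(res.x))                        # should be close to 0.0, the optimal value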
Example of linearly constrained optimization#
To see how bound and linear constraints are handled using minimize, let us solve Example 16.4 of [UU1], defined as

\[
\begin{aligned}
\min_{x \in \mathbb{R}^2} \quad & (x_1 - 1)^2 + (x_2 - 2.5)^2\\
\text{s.t.} \quad & -x_1 + 2 x_2 \le 2,\\
& x_1 + 2 x_2 \le 6,\\
& x_1 - 2 x_2 \le 2,\\
& x_1 \ge 0, \ x_2 \ge 0.
\end{aligned}
\]

To solve the problem using COBYQA, run:
import numpy as np
from cobyqa import minimize
from scipy.optimize import Bounds, LinearConstraint
def fun(x):
    return (x[0] - 1.0) ** 2.0 + (x[1] - 2.5) ** 2.0
x0 = [2.0, 0.0]
bounds = Bounds([0.0, 0.0], np.inf)
constraints = LinearConstraint([[-1.0, 2.0], [1.0, 2.0], [1.0, -2.0]], -np.inf, [2.0, 6.0, 2.0])
res = minimize(fun, x0, bounds=bounds, constraints=constraints)
print(res.x)
This should display the desired output [1.4 1.7].
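At the returned point \((1.4, 1.7)\), the first linear constraint is active. Continuing from the snippet above, the constraint values and the objective can be evaluated directly to check the encoding of the problem (the values in the comments are what one should expect, up to rounding):

A = np.array([[-1.0, 2.0], [1.0, 2.0], [1.0, -2.0]])  # same matrix as above
print(A @ res.x)   # approximately [ 2.   4.8 -2. ], at most the upper bounds [2. 6. 2.]
print(fun(res.x))  # approximately 0.8, the optimal value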
Example of nonlinearly constrained optimization#
To see how nonlinear constraints are handled, we solve Problem (F) of [UU2], defined as

\[
\begin{aligned}
\min_{x \in \mathbb{R}^2} \quad & -x_1 - x_2\\
\text{s.t.} \quad & x_1^2 - x_2 \le 0,\\
& x_1^2 + x_2^2 \le 1.
\end{aligned}
\]

To solve the problem using COBYQA, run:
import numpy as np
from cobyqa import minimize
from scipy.optimize import NonlinearConstraint
def fun(x):
    return -x[0] - x[1]
x0 = [1.0, 1.0]
constraints = NonlinearConstraint(lambda x: [
    x[0] ** 2.0 - x[1],
    x[0] ** 2.0 + x[1] ** 2.0,
], -np.inf, [0.0, 1.0])
res = minimize(fun, x0, constraints=constraints)
print(res.x)
This should display the desired output [0.7071 0.7071].
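The displayed values correspond to the analytical solution \(x_1 = x_2 = \sqrt{2} / 2 \approx 0.7071\), at which the constraint \(x_1^2 + x_2^2 \le 1\) is active. Continuing from the snippet above, this can be checked directly (the tolerance is a loose, illustrative choice):

print(fun(res.x))          # approximately -1.4142, that is, -sqrt(2), the optimal value
print(np.sqrt(2.0) / 2.0)  # 0.7071..., the analytical value of both variables
print(np.allclose(res.x, np.sqrt(2.0) / 2.0, atol=1e-3))  # should print True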
Finally, to see how to supply linear and nonlinear constraints simultaneously, we solve Problem (G) of [UU2], defined as

\[
\begin{aligned}
\min_{x \in \mathbb{R}^3} \quad & x_3\\
\text{s.t.} \quad & 5 x_1 - x_2 + x_3 \ge 0,\\
& -5 x_1 - x_2 + x_3 \ge 0,\\
& x_1^2 + x_2^2 + 4 x_2 \le x_3.
\end{aligned}
\]

To solve the problem using COBYQA, run:
import numpy as np
from cobyqa import minimize
from scipy.optimize import LinearConstraint, NonlinearConstraint
def fun(x):
    return x[2]
def cub(x):
    return x[0] ** 2.0 + x[1] ** 2.0 + 4.0 * x[1] - x[2]
x0 = [1.0, 1.0, 1.0]
constraints = [
    LinearConstraint([
        [5.0, -1.0, 1.0],
        [-5.0, -1.0, 1.0],
    ], [0.0, 0.0], np.inf),
    NonlinearConstraint(cub, -np.inf, 0.0),
]
res = minimize(fun, x0, constraints=constraints)
print(res.x)
This should display the desired output [0. -3. -3.].
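At the returned point, all three constraints of Problem (G) are active and the optimal value is \(-3\). Continuing from the snippet above, this can be verified with the functions already defined (the values in the comments are what one should expect, up to rounding):

A = np.array([[5.0, -1.0, 1.0], [-5.0, -1.0, 1.0]])  # same matrix as above
print(A @ res.x)   # approximately [0. 0.]: both linear constraints are active
print(cub(res.x))  # approximately 0.0: the nonlinear constraint is active
print(fun(res.x))  # approximately -3.0, the optimal value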
References
[UU1] J. Nocedal and S. J. Wright. Numerical Optimization. Springer Ser. Oper. Res. Financ. Eng. Springer, New York, NY, USA, second edition, 2006. doi:10.1007/978-0-387-40065-5.
[UU2] M. J. D. Powell. A direct search optimization method that models the objective and constraint functions by linear interpolation. In S. Gomez and J.-P. Hennart, editors, Advances in Optimization and Numerical Analysis, volume 275 of Math. Appl., pages 51–67. Springer, Dordrecht, Netherlands, 1994. doi:10.1007/978-94-015-8330-5_4.