subproblems.linearly_constrained_tangential_step(grad, hess_prod, xl, xu, aub, bub, aeq, delta, debug, **kwargs)

Minimize approximately a quadratic function subject to bound and linear constraints in a trust region.

This function solves approximately

\[\begin{split}\begin{aligned} \min_{d \in \mathbb{R}^n} & \quad g^{\mathsf{T}}d + \frac{1}{2} d^{\mathsf{T}}Hd\\ \text{s.t.} & \quad l \le d \le u,\\ & \quad A_{\text{ub}}d \le b_{\text{ub}},\\ & \quad A_{\text{eq}}d = 0,\\ & \quad \lVert d \rVert \le \Delta, \end{aligned}\end{split}\]

using an active-set variation of the truncated conjugate gradient method.

Parameters:
grad : numpy.ndarray, shape (n,)

Gradient \(g\) as shown above.


hess_prod : callable

Product of the Hessian matrix \(H\) with any vector.

hess_prod(d) -> numpy.ndarray, shape (n,)

Returns the product \(Hd\).

xl : numpy.ndarray, shape (n,)

Lower bounds \(l\) as shown above.

xu : numpy.ndarray, shape (n,)

Upper bounds \(u\) as shown above.

aub : numpy.ndarray, shape (m_linear_ub, n)

Coefficient matrix \(A_{\text{ub}}\) as shown above.

bub : numpy.ndarray, shape (m_linear_ub,)

Right-hand side \(b_{\text{ub}}\) as shown above.

aeq : numpy.ndarray, shape (m_linear_eq, n)

Coefficient matrix \(A_{\text{eq}}\) as shown above.


delta : float

Trust-region radius \(\Delta\) as shown above.


debug : bool

Whether to perform debugging tests during the execution.

Returns:
numpy.ndarray, shape (n,)

Approximate solution \(d\).

Other Parameters:
improve : bool, optional

If True, a solution generated by the truncated conjugate gradient method that lies on the boundary of the trust region is refined by moving along the trust-region boundary within the two-dimensional subspace spanned by that solution and the gradient of the quadratic function at that solution (default is True).
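The refinement that improve enables can be pictured with a coarse sketch: restrict the model to the circle of radius \(\Delta\) inside the plane spanned by the current boundary solution and the model gradient there, and keep the best point found. The grid search below stands in for the exact two-dimensional minimization the solver performs, it ignores the bound and linear constraints, and the function name improve_on_boundary and its parameters are illustrative, not part of this API.

```python
import numpy as np

def improve_on_boundary(d, grad, hess_prod, delta, num=64):
    """Search the trust-region boundary restricted to span{d, g_d} for a
    lower model value, where g_d = grad + H d is the model gradient at d.

    Illustrative sketch only: a coarse grid search over the boundary circle.
    """
    g_d = grad + hess_prod(d)
    # Orthonormal basis (u, v) of span{d, g_d}.
    u = d / np.linalg.norm(d)
    w = g_d - (g_d @ u) * u
    if np.linalg.norm(w) < 1e-12:
        return d  # d and g_d are collinear; no plane to explore.
    v = w / np.linalg.norm(w)
    best = d
    best_val = grad @ d + 0.5 * (d @ hess_prod(d))
    # Walk the circle of radius delta in the (u, v) plane.
    for theta in np.linspace(0.0, 2.0 * np.pi, num, endpoint=False):
        cand = delta * (np.cos(theta) * u + np.sin(theta) * v)
        val = grad @ cand + 0.5 * (cand @ hess_prod(cand))
        if val < best_val:
            best, best_val = cand, val
    return best
```

For the model with \(g = (-1, 0)\), \(H = I\), and \(\Delta = 1\), starting from the boundary point \((0, 1)\), the sketch recovers the boundary minimizer \((1, 0)\).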


It is assumed that the origin is feasible with respect to the bound constraints xl and xu and the linear constraints, and that delta is finite and positive.
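The active-set machinery is internal, but the truncated conjugate gradient core that this routine builds on can be sketched as a plain Steihaug-Toint iteration for the unconstrained trust-region model, with the bound and linear constraints omitted. The function name truncated_cg and its tolerance defaults below are illustrative, not part of this API.

```python
import numpy as np

def boundary_step(d, p, delta):
    """Largest t >= 0 with ||d + t p|| = delta (positive root of a quadratic)."""
    a = p @ p
    b = 2.0 * (d @ p)
    c = d @ d - delta ** 2
    return (-b + np.sqrt(b ** 2 - 4.0 * a * c)) / (2.0 * a)

def truncated_cg(grad, hess_prod, delta, tol=1e-8, max_iter=100):
    """Steihaug-Toint truncated CG: approximately minimize
    g^T d + 0.5 d^T H d subject to ||d|| <= delta."""
    d = np.zeros_like(grad)
    r = grad.copy()  # gradient of the model at d
    p = -r           # search direction
    for _ in range(max_iter):
        if np.linalg.norm(r) <= tol:
            break
        Hp = hess_prod(p)
        curv = p @ Hp
        if curv <= 0.0:
            # Negative curvature: follow p to the trust-region boundary.
            return d + boundary_step(d, p, delta) * p
        alpha = (r @ r) / curv
        if np.linalg.norm(d + alpha * p) >= delta:
            # The full CG step leaves the region: truncate at the boundary.
            return d + boundary_step(d, p, delta) * p
        d = d + alpha * p
        r_new = r + alpha * Hp
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d
```

For \(g = (-1, 0)\) and \(H = I\), a large radius yields the unconstrained minimizer \((1, 0)\); shrinking the radius below the minimizer's norm truncates the step at the boundary.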