# cobyqa.subproblems.bound_constrained_tangential_step#

subproblems.bound_constrained_tangential_step(grad, hess_prod, xl, xu, delta, debug, **kwargs)[source]#

Minimize approximately a quadratic function subject to bound constraints in a trust region.

This function solves approximately

$$\begin{aligned} \min_{d \in \mathbb{R}^n} & \quad g^{\mathsf{T}} d + \frac{1}{2} d^{\mathsf{T}} H d\\ \text{s.t.} & \quad l \le d \le u,\\ & \quad \lVert d \rVert \le \Delta, \end{aligned}$$

using an active-set variation of the truncated conjugate gradient method.
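As a rough illustration of the underlying iteration, here is a minimal sketch of the truncated (Steihaug-Toint) conjugate gradient method for the trust-region part of the problem alone; the actual solver layers an active-set strategy for the bound constraints $$l \le d \le u$$ on top of this, so this sketch is not the implementation.

```python
import numpy as np

def truncated_cg(grad, hess_prod, delta, max_iter=None, tol=1e-8):
    """Truncated CG sketch for min g^T d + 0.5 d^T H d s.t. ||d|| <= delta.

    Bound constraints are omitted here; the real solver handles them
    with an active-set variation of this iteration.
    """
    n = grad.size
    if max_iter is None:
        max_iter = n
    d = np.zeros(n)
    r = grad.copy()  # residual, i.e. gradient of the model at d
    p = -r           # conjugate search direction
    for _ in range(max_iter):
        Hp = hess_prod(p)
        curv = p @ Hp
        if curv <= 0.0:
            # Negative curvature: follow p to the trust-region boundary.
            return _to_boundary(d, p, delta)
        alpha = (r @ r) / curv
        if np.linalg.norm(d + alpha * p) >= delta:
            # Full CG step leaves the trust region: stop on the boundary.
            return _to_boundary(d, p, delta)
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d

def _to_boundary(d, p, delta):
    # Largest t >= 0 with ||d + t p|| = delta (positive root of a quadratic).
    a = p @ p
    b = 2.0 * (d @ p)
    c = d @ d - delta ** 2
    t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return d + t * p
```

When the trust region is inactive, the iteration reduces to plain CG on $$Hd = -g$$; when a step would leave the ball, it is cut back to the boundary.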

Parameters:

grad : numpy.ndarray, shape (n,)

Gradient $$g$$ as shown above.

hess_prod : callable

Product of the Hessian matrix $$H$$ with any vector.

hess_prod(d) -> numpy.ndarray, shape (n,)

Returns the product $$Hd$$.

xl : numpy.ndarray, shape (n,)

Lower bounds $$l$$ as shown above.

xu : numpy.ndarray, shape (n,)

Upper bounds $$u$$ as shown above.

delta : float

Trust-region radius $$\Delta$$ as shown above.

debug : bool

Whether to perform debugging tests during execution.
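For illustration, the hess_prod callable can wrap an explicit Hessian matrix or evaluate the product matrix-free; the matrix H below is a made-up example, not part of the API.

```python
import numpy as np

# Hypothetical 2x2 Hessian for illustration only.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])

def hess_prod(d):
    # Product with an explicitly stored matrix.
    return H @ d

def hess_prod_matrix_free(d):
    # Same product without storing H, written out entry by entry.
    return np.array([4.0 * d[0] + d[1], d[0] + 3.0 * d[1]])
```

Either form is acceptable as long as it maps a vector of shape (n,) to the product $$Hd$$ of shape (n,).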

Returns:
numpy.ndarray, shape (n,)

Approximate solution $$d$$.

Other Parameters:
improve : bool, optional

If True, a solution generated by the truncated conjugate gradient method that lies on the boundary of the trust region is improved by moving along that boundary within the two-dimensional subspace spanned by the solution and the gradient of the quadratic function at the solution (default is True).
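The boundary-improvement idea can be sketched as follows: starting from a point d on the trust-region boundary, search the circle of radius $$\Delta$$ in the plane spanned by d and the model gradient $$g + Hd$$ for a lower model value. The grid search below is a simplification for illustration; the actual step does not work this way internally.

```python
import numpy as np

def improve_on_boundary(d, grad, hess_prod, delta, n_angles=64):
    """Grid-search sketch of the boundary improvement step.

    Assumes ||d|| = delta. Searches the circle ||x|| = delta in the
    plane spanned by d and the model gradient g + Hd.
    """
    q = lambda x: grad @ x + 0.5 * (x @ hess_prod(x))
    s = grad + hess_prod(d)         # gradient of the model at d
    e1 = d / np.linalg.norm(d)
    s_perp = s - (s @ e1) * e1      # component of s orthogonal to d
    norm_perp = np.linalg.norm(s_perp)
    if norm_perp < 1e-12 * np.linalg.norm(s):
        return d                    # d and s are collinear: nothing to span
    e2 = s_perp / norm_perp
    best, q_best = d, q(d)
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        x = delta * (np.cos(theta) * e1 + np.sin(theta) * e2)
        qx = q(x)
        if qx < q_best:
            best, q_best = x, qx
    return best
```

Since the candidate at angle zero is d itself, the returned point never has a larger model value than the input.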

Notes

This function implements Algorithm 6.2 of [1]. It is assumed that the origin is feasible with respect to the bound constraints xl and xu, and that delta is finite and positive.

References



[1] T. M. Ragonneau. “Model-Based Derivative-Free Optimization Methods and Software.” Ph.D. thesis. Hong Kong: Department of Applied Mathematics, The Hong Kong Polytechnic University, 2022.