cobyqa.subsolvers.tangential_byrd_omojokun
- cobyqa.subsolvers.tangential_byrd_omojokun(grad, hess_prod, xl, xu, delta, debug, **kwargs)
Approximately minimize a quadratic function subject to bound constraints in a trust region.
This function solves approximately
\[\begin{split}\min_{s \in \R^n} \quad \transpose{g} s + \frac{1}{2} \transpose{s} H s \quad \text{s.t.} \quad \left\{ \begin{array}{l} \xl \le s \le \xu,\\ \lVert s \rVert \le \Delta, \end{array} \right.\end{split}\]
using an active-set variation of the truncated conjugate gradient method.
- Parameters:
- grad : numpy.ndarray, shape (n,)
Gradient \(g\) as shown above.
- hess_prod : callable
Product of the Hessian matrix \(H\) with any vector.
hess_prod(s) -> numpy.ndarray, shape (n,)
returns the product \(H s\).
- xl : numpy.ndarray, shape (n,)
Lower bounds \(\xl\) as shown above.
- xu : numpy.ndarray, shape (n,)
Upper bounds \(\xu\) as shown above.
- delta : float
Trust-region radius \(\Delta\) as shown above.
- debug : bool
Whether to perform debugging tests during execution.
- Returns:
- numpy.ndarray, shape (n,)
Approximate solution \(s\).
- Other Parameters:
- improve : bool, optional
If True, and the solution returned by the truncated conjugate gradient method lies on the trust-region boundary, it is improved by moving along the boundary within the two-dimensional space spanned by the solution and the gradient of the quadratic function at the solution (default is True).
Notes
This function implements Algorithm 6.2 of [1]. It is assumed that the origin is feasible with respect to the bound constraints and that delta is finite and positive.
References
[1] T. M. Ragonneau. Model-Based Derivative-Free Optimization Methods and Software. PhD thesis, The Hong Kong Polytechnic University, Hong Kong, China, 2022.
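The sketch below shows one way to call this subsolver on a small quadratic model. It relies only on the signature documented above; the gradient, Hessian product, bounds, and trust-region radius are illustrative values, and options documented under Other Parameters (such as improve) could likewise be passed through **kwargs.

```python
import numpy as np

from cobyqa.subsolvers import tangential_byrd_omojokun

# Illustrative quadratic model: gradient g and Hessian H = diag(2, 4).
grad = np.array([1.0, -1.0])

def hess_prod(s):
    # Return the product H s for the diagonal Hessian above.
    return np.array([2.0, 4.0]) * s

# Bound constraints on the step; the origin must be feasible (xl <= 0 <= xu).
xl = np.array([-2.0, -2.0])
xu = np.array([2.0, 2.0])

# Trust-region radius, assumed finite and positive.
delta = 1.0

# Approximate solution of the bound- and trust-region-constrained subproblem.
step = tangential_byrd_omojokun(grad, hess_prod, xl, xu, delta, False)
print(step)
```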