nlps_knitro
- nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn, opt)
- nlps_knitro(problem)
Nonlinear programming (NLP) solver based on Artelys Knitro.

    [X, F, EXITFLAG, OUTPUT, LAMBDA] = ...
        NLPS_KNITRO(F_FCN, X0, A, L, U, XMIN, XMAX, GH_FCN, HESS_FCN, OPT)
    [X, F, EXITFLAG, OUTPUT, LAMBDA] = NLPS_KNITRO(PROBLEM)

A wrapper function providing a standardized interface for using Artelys
Knitro to solve the following NLP (nonlinear programming) problem:
Minimize a function F(X) beginning from a starting point X0, subject to
optional linear and nonlinear constraints and variable bounds.

    min F(X)
     X

subject to

    G(X) = 0            (nonlinear equalities)
    H(X) <= 0           (nonlinear inequalities)
    L <= A*X <= U       (linear constraints)
    XMIN <= X <= XMAX   (variable bounds)

Inputs (all optional except F_FCN and X0):
    F_FCN : handle to function that evaluates the objective function,
        its gradients and Hessian for a given value of X. If there are
        nonlinear constraints, the Hessian information is provided by
        the HESS_FCN function passed in the 9th argument and is not
        required here. Calling syntax for this function:
            [F, DF, D2F] = F_FCN(X)
    X0 : starting value of optimization vector X
    A, L, U : define the optional linear constraints. Default values
        for the elements of L and U are -Inf and Inf, respectively.
    XMIN, XMAX : optional lower and upper bounds on the X variables,
        defaults are -Inf and Inf, respectively.
    GH_FCN : handle to function that evaluates the optional nonlinear
        constraints and their gradients for a given value of X.
        Calling syntax for this function is:
            [H, G, DH, DG] = GH_FCN(X)
        where the columns of DH and DG are the gradients of the
        corresponding elements of H and G, i.e. DH and DG are
        transposes of the Jacobians of H and G, respectively.
    HESS_FCN : handle to function that computes the Hessian of the
        Lagrangian for given values of X, lambda and mu, where lambda
        and mu are the multipliers on the equality and inequality
        constraints, G and H, respectively. The calling syntax for
        this function is:
            LXX = HESS_FCN(X, LAM)
        where lambda = LAM.eqnonlin and mu = LAM.ineqnonlin.
    OPT : optional options structure with the following fields, all of
        which are also optional (default values shown in parentheses);
        see also the sketch following this input list
        verbose (0) - controls level of progress output displayed
            0 = no progress output
            1 = some progress output
            2 = verbose progress output
        knitro_opt - options struct for Artelys Knitro; the value of
            verbose above overrides these options
            opts - struct of other values to be passed directly to
                Artelys Knitro via an options file
            tol_x - termination tolerance on x
            tol_f - termination tolerance on f
            maxit - maximum number of iterations
    PROBLEM : The inputs can alternatively be supplied in a single
        PROBLEM struct with fields corresponding to the input
        arguments described above:
            f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn, opt
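For instance, solver tolerances and an iteration limit can be requested
through the knitro_opt field of OPT. A minimal sketch, using the field
names documented above; the numeric values are purely illustrative:

    opt = struct( ...
        'verbose',    1, ...            %% some progress output
        'knitro_opt', struct( ...
            'tol_x', 1e-10, ...         %% termination tolerance on x (illustrative)
            'tol_f', 1e-6,  ...         %% termination tolerance on f (illustrative)
            'maxit', 200 ...            %% iteration limit (illustrative)
        ));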
Outputs:
    X : solution vector
    F : final objective function value
    EXITFLAG : exit flag
        1 = converged
        negative values = Artelys Knitro specific failure codes
            (see KNITROMATLAB documentation for details)
            https://www.artelys.com/docs/knitro/3_referenceManual/knitromatlabReference.html#return-codes-exit-flags
    OUTPUT : KNITROMATLAB output struct
        (see KNITROMATLAB documentation for details)
        https://www.artelys.com/docs/knitro/3_referenceManual/knitromatlabReference.html#output-structure-fields
        iterations - number of iterations performed
        funcCount - number of function evaluations
        constrviolation - maximum of constraint violations
        firstorderopt - measure of first-order optimality
    LAMBDA : struct containing the Lagrange and Kuhn-Tucker multipliers
        on the constraints, with fields:
        eqnonlin - nonlinear equality constraints
        ineqnonlin - nonlinear inequality constraints
        mu_l - lower (left-hand) limit on linear constraints
        mu_u - upper (right-hand) limit on linear constraints
        lower - lower bound on optimization variables
        upper - upper bound on optimization variables

Note that the calling syntax is almost identical to that of FMINCON from
MathWorks' Optimization Toolbox. The main difference is that the linear
constraints are specified with A, L, U instead of A, B, Aeq, Beq. The
functions for evaluating the objective function, constraints and Hessian
are identical.
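For example, fmincon-style linear constraints can be expressed in the
A, L, U form by stacking the inequality and equality rows, giving the
equality rows identical lower and upper limits. A sketch with
hypothetical constraint data (the objective handle f_fcn and starting
point x0 are placeholders):

    %% hypothetical data:  x1 + x2 <= 4  (inequality),  x1 - x2 = 1  (equality)
    Ai = [1  1];    bi = 4;
    Ae = [1 -1];    be = 1;
    A = [ Ai; Ae ];
    l = [ -Inf; be ];           %% inequality row is unbounded below
    u = [ bi;   be ];           %% equality row has l == u
    x = nlps_knitro(f_fcn, x0, A, l, u);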
Calling syntax options:
    [x, f, exitflag, output, lambda] = ...
        nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn, opt);

    x = nlps_knitro(f_fcn, x0);
    x = nlps_knitro(f_fcn, x0, A, l);
    x = nlps_knitro(f_fcn, x0, A, l, u);
    x = nlps_knitro(f_fcn, x0, A, l, u, xmin);
    x = nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax);
    x = nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax, gh_fcn);
    x = nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn);
    x = nlps_knitro(f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn, opt);
    x = nlps_knitro(problem);
            where problem is a struct with fields:
                f_fcn, x0, A, l, u, xmin, xmax, gh_fcn, hess_fcn, opt
            all fields except 'f_fcn' and 'x0' are optional
    x = nlps_knitro(...);
    [x, f] = nlps_knitro(...);
    [x, f, exitflag] = nlps_knitro(...);
    [x, f, exitflag, output] = nlps_knitro(...);
    [x, f, exitflag, output, lambda] = nlps_knitro(...);

Example: (problem from https://en.wikipedia.org/wiki/Nonlinear_programming)
    function [f, df, d2f] = f2(x)
    f = -x(1)*x(2) - x(2)*x(3);
    if nargout > 1              %% gradient is required
        df = -[x(2); x(1)+x(3); x(2)];
        if nargout > 2          %% Hessian is required
            d2f = -[0 1 0; 1 0 1; 0 1 0];   %% actually not used since
        end                                 %%   'hess_fcn' is provided
    end

    function [h, g, dh, dg] = gh2(x)
    h = [ 1 -1 1; 1 1 1] * x.^2 + [-2; -10];
    dh = 2 * [x(1) x(1); -x(2) x(2); x(3) x(3)];
    g = []; dg = [];

    function Lxx = hess2(x, lam, cost_mult)
    if nargin < 3, cost_mult = 1; end
    mu = lam.ineqnonlin;
    Lxx = cost_mult * [0 -1 0; -1 0 -1; 0 -1 0] + ...
            [2*[1 1]*mu 0 0; 0 2*[-1 1]*mu 0; 0 0 2*[1 1]*mu];

    problem = struct( ...
        'f_fcn',    @(x)f2(x), ...
        'gh_fcn',   @(x)gh2(x), ...
        'hess_fcn', @(x, lam, cost_mult)hess2(x, lam, cost_mult), ...
        'x0',       [1; 1; 0], ...
        'opt',      struct('verbose', 2) ...
    );
    [x, f, exitflag, output, lambda] = nlps_knitro(problem);
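The same problem can also be solved by passing the arguments
individually rather than as a struct; a sketch, assuming empty matrices
are accepted as placeholders for the unused linear constraints and
bounds, as is conventional for these solver wrapper functions, with the
exit flag then checked against the documented value 1 for convergence:

    [x, f, exitflag] = nlps_knitro(@f2, [1; 1; 0], [], [], [], [], [], ...
                                    @gh2, @hess2, struct('verbose', 2));
    if exitflag == 1
        fprintf('nlps_knitro converged to f = %g\n', f);
    else
        fprintf('nlps_knitro did not converge (exitflag = %d)\n', exitflag);
    end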
See also nlps_master(), knitro_nlp, knitromatlab.