Methods Used by the Optimization Package
This help page provides an overview of the methods used by the Optimization package. The exports in the Optimization package select the methods according to criteria described below. Some exports allow the specification of a particular method using the method option. When using an Optimization export, set infolevel[Optimization] to 1 or higher to display the name of the method being used.
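For example, the following sketch (the objective and constraints are hypothetical, chosen only for illustration) raises the info level before a solve so that the selected method is reported:

```maple
with(Optimization):
infolevel[Optimization] := 1:
# A small illustrative linear program; with infolevel set,
# the solver reports which method it selects
LPSolve(x + y, {x + 2*y <= 4, 4*x + y >= 2}, assume = nonnegative);
```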
The methods used by the LPSolve, QPSolve, NLPSolve and LSSolve commands (for linear programs, quadratic programs, nonlinear programs, and least-squares problems respectively) are described here. These methods are also used by the Minimize, Maximize, and Interactive commands, which call the specialized commands based on the input received.
In the following descriptions, bounded signifies having explicit bounds, while constrained signifies having general linear or nonlinear constraints. The Optimization commands treat simple bounds and general constraints differently: for better performance, specify bounds separately rather than including them in the constraint set. The final objective function value should be the same in either case, but constrained and simply bounded nonlinear programs are solved using different methods.
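As an illustration of this distinction, the two calls below solve the same hypothetical problem; the first passes the bounds separately in range form (the preferred, more efficient form), while the second folds them into the constraint set:

```maple
with(Optimization):
# Bounds passed separately in range form (preferred)
NLPSolve((x - 3)^2 + 1, x = 0 .. 10);
# The same bounds given as general constraints
# (handled by a method for constrained problems)
NLPSolve((x - 3)^2 + 1, {x >= 0, x <= 10});
```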
For more information about the various options mentioned below, see the Optimization/Options help page.
Linear and Quadratic Programs
Nonlinear Programs
Least-Squares Problems
<Text-field style="Heading 2" layout="Heading 2" bookmark="bkmrk0">Linear and Quadratic Programs</Text-field>
Linear and quadratic programs are solved by the LPSolve and QPSolve commands respectively. The method option is available for continuous linear programs, but not for quadratic programs or integer linear programs.
Continuous programs are solved using one of two methods. The active-set method, featuring a feasibility phase followed by an optimality phase, is available for both linear and quadratic continuous programs.
For linear continuous problems in particular, a sparse interior point method is also available. The method may be specified using either the method=activeset or the method=interiorpoint option. If no method is specified, a heuristic is used to choose one. In general, the interior point method is more efficient for large, sparse problems. If the input is given in Matrix form and the constraint matrices have the storage=sparse option set, the interior point method is used. Otherwise, the heuristic is based on the number of variables, the number of constraints, and the density of the constraint coefficient matrices.
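A sketch of both ways of obtaining the interior point method, using a hypothetical problem (in the second call, the sparse storage of the constraint matrix is what triggers the interior point choice):

```maple
with(Optimization):
# Explicit request via the method option, in algebraic form
LPSolve(2*x + 5*y, {x + y >= 1, x - y <= 3},
        assume = nonnegative, method = interiorpoint);
# Matrix form: minimize c.x subject to A.x <= b;
# the sparse constraint matrix selects the interior point method
c := Vector([2, 5]):
A := Matrix([[-1, -1], [1, -1]], storage = sparse):
b := Vector([-1, 3]):
LPSolve(c, [A, b], assume = nonnegative);
```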
Integer programs, accepted by the LPSolve command, are solved with a branch-and-bound algorithm. The continuous subproblems are solved with the active-set method. Integer programs are specified using the assume, binaryvariables or integervariables option.
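For example, on a hypothetical integer program, either of the following restricts variables to integer values:

```maple
with(Optimization):
# All variables restricted to nonnegative integers
LPSolve(3*x + 2*y, {2*x + y <= 10, x + 3*y <= 15},
        assume = nonnegint, maximize);
# Only x restricted to integer values
LPSolve(3*x + 2*y, {2*x + y <= 10, x + 3*y <= 15},
        assume = nonnegative, integervariables = [x], maximize);
```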
<Text-field style="Heading 2" layout="Heading 2" bookmark="bkmrk1">Nonlinear Programs</Text-field>
Nonlinear programs are solved by the NLPSolve command. This command accepts the method option. If the optimization problem does not satisfy the conditions of the requested method, an error is issued.
The methods used by the NLPSolve command are described below. They are listed in order of increasing generality. The methods listed first apply to the most restricted problem classes.
Quadratic Interpolation -- This method is available for univariate problems with no general constraints but with finite bounds. To select it, use the method=quadratic option. The method assumes the objective function has a continuous first derivative but does not require that derivatives be explicitly provided. Any initial point provided through the initialpoint option is ignored.
Branch-and-Bound (Global Search) -- This method is available for univariate problems with no general constraints but with finite bounds. To select it, use the method=branchandbound option. This method differs from others in the Optimization package in that a global search is performed. It initially uses a branch-and-bound algorithm, which assumes Lipschitz continuity. The solution is further refined with a local search.
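The contrast between the two univariate methods can be sketched with a hypothetical multimodal objective; quadratic interpolation returns a local minimum, while branch-and-bound searches the whole interval:

```maple
with(Optimization):
f := x*sin(3*x) + x/2:
NLPSolve(f, x = 0 .. 10, method = quadratic);       # local search
NLPSolve(f, x = 0 .. 10, method = branchandbound);  # global search, then local refinement
```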
Modified Newton -- The modified Newton method is available for unconstrained or finitely bounded problems. To select it, use the method=modifiednewton option. The method uses the gradient of the objective function, which is computed automatically with algebraic or operator forms but must be provided explicitly with Matrix form.
Nonlinear Simplex -- The nonlinear Simplex, or Nelder-Mead, method is available for problems having no bounds or general constraints. To select it, use the method=nonlinearsimplex option. This method, which does not require derivatives, tends to be slow but robust, so it is appropriate for objective functions that are prone to inaccuracies. For best performance, the optimization problem should be scaled so that the solution values are of order unity.
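For instance, applying the method to the Rosenbrock function (a hypothetical, unconstrained test problem) requires no derivative information:

```maple
with(Optimization):
# Nelder-Mead needs no derivatives; an initial point guides the search
NLPSolve((1 - x)^2 + 100*(y - x^2)^2,
         initialpoint = {x = -1, y = 1}, method = nonlinearsimplex);
```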
Preconditioned Conjugate Gradient (PCG) -- The preconditioned limited-memory quasi-Newton conjugate gradient method is available for problems having no bounds or general constraints. To select it, use the method=pcg option. The method uses the gradient of the objective function, which is computed automatically with algebraic or operator forms but must be provided explicitly with Matrix form.
Sequential Quadratic Programming (SQP) -- The SQP method is available for arbitrary unconstrained or constrained nonlinear programs. To select it, use the method=sqp option. The method uses derivatives automatically computed by Maple or explicitly provided through the options. If these are not available, numerical derivatives are computed.
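A minimal sketch of the SQP method on a hypothetical problem with a nonlinear constraint:

```maple
with(Optimization):
# The nonlinear constraint x*y >= 1/8 requires a general constrained method
NLPSolve(x^2 + y^2, {x + y >= 1, x*y >= 1/8}, method = sqp);
```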
If the method option is not provided, the method is chosen by NLPSolve according to the following rules. If the optimization problem is univariate and unconstrained except for finite bounds, quadratic interpolation is used. If the problem is unconstrained and the gradient of the objective function is available, the PCG method is used. Otherwise, the SQP method is used.
<Text-field style="Heading 2" layout="Heading 2" bookmark="bkmrk2">Least-Squares Problems</Text-field>
Least-squares problems are solved by the LSSolve command. When the residuals in the objective function and the constraints are all linear, then an active-set method is used. When the problem is nonlinear, the method option can be used to specify a particular method. If the optimization problem does not satisfy the conditions of the requested method, then an error is issued.
The methods used by the LSSolve command for nonlinear problems are described below.
Gauss-Newton and Modified Newton -- This method, based on a combined Gauss-Newton and modified Newton algorithm, is available for problems having no bounds or general constraints. It does not require derivatives. To select it, use the method=modifiednewton option.
Sequential Quadratic Programming (SQP) -- The SQP method is available for arbitrary unconstrained or constrained nonlinear least-squares problems. To select it, use the method=sqp option. The method uses derivatives automatically computed by Maple or explicitly provided through the options. If these are not available, numerical derivatives are computed.
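The two least-squares methods might be exercised as follows, on a hypothetical residual list [x - 1, x - y, y - 3]:

```maple
with(Optimization):
# Unconstrained: eligible for the Gauss-Newton/modified Newton method
LSSolve([x - 1, x - y, y - 3], method = modifiednewton);
# Adding a general constraint requires the SQP method
LSSolve([x - 1, x - y, y - 3], {x + y <= 2}, method = sqp);
```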
If the method option is not provided, the method is chosen by LSSolve according to the following rules. If the optimization problem has no bounds or general constraints, then the combined Gauss-Newton and modified Newton method is used. Otherwise, the SQP method is used.
See Also
Optimization, Optimization/Options