Consider the problem of finding a set of values [x1, x2] that solves

    minimize  f(x) = exp(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
      x
    subject to
        x1*x2 - x1 - x2 - 1.5 <= 0
       -x1*x2 - 10            <= 0
To solve this two-dimensional problem:
• Write an M-file, obj_fun.m, that specifies the objective function and its gradient.
• Create another M-file, con_fun.m, that specifies the nonlinear constraints and their gradients.
• Invoke the constrained minimization routine, fmincon.
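The two M-files might be sketched as follows; the gradient expressions are worked out by hand from f and the constraints above, and the guard on nargout lets the same files work whether or not gradients are requested:

```matlab
% obj_fun.m -- objective function and its gradient
function [f, g] = obj_fun(x)
f = exp(x(1)) * (4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
if nargout > 1                          % gradient requested ('GradObj','on')
    g = [f + exp(x(1)) * (8*x(1) + 4*x(2));   % df/dx1
         exp(x(1)) * (4*x(1) + 4*x(2) + 2)];  % df/dx2
end

% con_fun.m -- nonlinear inequality constraints c(x) <= 0 and their gradients
function [c, ceq, Gc, Gceq] = con_fun(x)
c   = [x(1)*x(2) - x(1) - x(2) - 1.5;   % first constraint
       -x(1)*x(2) - 10];                % second constraint
ceq = [];                               % no equality constraints
if nargout > 2                          % gradients requested ('GradConstr','on')
    Gc   = [x(2) - 1,  -x(2);           % column i is the gradient of c(i)
            x(1) - 1,  -x(1)];
    Gceq = [];
end
```

In practice each function goes in its own file, named after the function as the comments indicate.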
Next, set the solver options and invoke the constrained optimization routine:
% ------ Solve the optimization problem using the SQP method --------
options = optimset('Display','off', ...
                   'LargeScale','off', ...
                   'GradObj','on', ...
                   'Hessian','off', ...
                   'GradConstr','on', ...
                   'TolCon',1e-8, ...
                   'TolFun',1e-8, ...
                   'TolX',1e-8);
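With the options in hand, the full call might look like the following sketch; the empty matrices stand in for the unused linear-constraint and bound arguments, and x0 = [-1, 1] is the initial guess used later in this section:

```matlab
x0 = [-1, 1];                           % initial guess for [x1, x2]
[x, fval, exitflag, output] = ...
    fmincon('obj_fun', x0, [], [], [], [], [], [], 'con_fun', options);
```

The fourth output, output, is the structure examined below.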
The output structure gives more details about the optimization. For fmincon, it includes the
number of iterations in iterations, the number of function evaluations in funcCount, the final
step-size in stepsize, a measure of first-order optimality in firstorderopt, and the type of
algorithm used in algorithm:
output =
       iterations: 12
        funcCount: 35
         stepsize: 1
    firstorderopt: []
        algorithm: 'medium-scale: Quasi-Newton line search'
When more than one local minimum exists, the initial guess for the vector [x1, x2] affects both the number of function evaluations and the solution point found. In the example above, x0 is initialized to [-1, 1].
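Because the starting point determines which local minimum is reached, one simple way to probe for other minima is to re-run the solver from several starting points; a minimal sketch (the particular starting points are illustrative, not from the original example):

```matlab
starts = [-1 1; 10 10; -10 -10];        % a few illustrative starting points
for k = 1:size(starts, 1)
    x = fmincon('obj_fun', starts(k,:), [], [], [], [], [], [], ...
                'con_fun', options);
    fprintf('x0 = [%g %g]  ->  x = [%g %g]\n', starts(k,:), x);
end
```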
The variable options can be passed to fmincon to change characteristics of the optimization algorithm, as in

x = fmincon('obj_fun', x0, [], [], [], [], [], [], 'con_fun', options);
The variable options is a structure that contains values for termination tolerances and algorithm choices. An options structure can be created using the optimset function:
options = optimset('Display','off',...
'LargeScale','off', ...
'GradObj','on',...
'Hessian','off',...
'GradConstr','on');
In this example we have turned off the default selection of the large-scale algorithm, so the medium-scale algorithm is used. The 'Display' option controls how much command-line output is shown during the optimization iterations. When 'GradObj' is 'on', the solver uses the gradient supplied by the objective function; when 'Hessian' is 'off', the Hessian matrix is approximated numerically.