2. BASIC PRINCIPLES
2.1 Introduction
2.2 Gradient Information
4. ONE-DIMENSIONAL OPTIMIZATION
4.3 Fibonacci Search
4.4 Golden-Section Search
4.5 Quadratic Interpolation Method
7. QUASI-NEWTON METHODS
7.1 Introduction
7.2 The Basic Quasi-Newton Approach
8. MINIMAX METHODS
8.1 Introduction
8.2 Problem Formulation
8.3 Minimax Algorithms
• equality constraints
• inequality constraints
• NEOS Guide
www-fp.mcs.anl.gov/OTC/Guide/
• Conjugate Gradient
• Quasi-Newton
• Minimax
• Lagrange Multiplier
• Simplex
• Primal-Dual
• Quadratic programming
• Semi-definite programming
Types of minima
[Figure: plot of f(x) over the feasible region of x, showing weak and strong local minima and the strong global minimum.]
xL      xU      Range
0       1       1
1/2     1       1/2
1/2     3/4     1/4
1/2     5/8     1/8
9/16    5/8     1/16
9/16    19/32   1/32
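The halving pattern in the table can be reproduced with a dichotomous search. A minimal sketch in Python, assuming the objective is unimodal on [xL, xU]; the quadratic test function is an illustration, not from the slides:

```python
def dichotomous_search(f, xl, xu, eps=1e-6, tol=1e-4):
    """Shrink the bracket [xl, xu] by roughly half per iteration.

    Two evaluations just left and right of the midpoint decide which
    half of the bracket can be discarded.
    """
    while xu - xl > tol:
        mid = 0.5 * (xl + xu)
        if f(mid - eps) < f(mid + eps):
            xu = mid + eps   # minimum lies in the left half
        else:
            xl = mid - eps   # minimum lies in the right half
    return 0.5 * (xl + xu)

# Illustrative unimodal function with minimum at x = 0.6
xstar = dichotomous_search(lambda x: (x - 0.6) ** 2, 0.0, 1.0)
```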
Optimization Methods
Search methods
[Figure: bracketing search methods. Dichotomous search repeatedly halves [xL, xU]. Fibonacci search shrinks the successive intervals Ik, Ik+1, …, Ik+5 according to the Fibonacci numbers 1, 1, 2, 3, 5, 8, …. Golden-Section Search divides intervals by the constant K = 1.6180.]
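The Golden-Section reduction can be sketched as follows; the bracket and test function are illustrative assumptions:

```python
def golden_section_search(f, a, b, tol=1e-6):
    """Golden-section search: each step divides the bracket [a, b] by
    K = 1.6180..., reusing one interior point and its f-value per iteration."""
    inv_k = (5 ** 0.5 - 1) / 2           # 1/K = 0.6180...
    c = b - inv_k * (b - a)
    d = a + inv_k * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                      # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_k * (b - a)
            fc = f(c)
        else:                            # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_k * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Illustrative: minimum of (x - 2)^2 on [0, 5]
xmin = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```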
1-D search
As an example, consider a 1-D function (assume from now on that we do not know its actual expression).
Gradient descent
• Given a starting location x0, examine df/dx and move in the downhill direction to generate a new estimate, x1 = x0 + δx.
• Update x and repeat.
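Since the function's expression is assumed unknown, df/dx can be estimated numerically. A minimal sketch; the learning rate and test function are illustrative assumptions:

```python
def gradient_descent_1d(f, x0, lr=0.1, h=1e-6, iters=200):
    """Move downhill along a numerically estimated df/dx."""
    x = x0
    for _ in range(iters):
        dfdx = (f(x + h) - f(x - h)) / (2.0 * h)   # central difference
        x = x - lr * dfdx                          # x_{k+1} = x_k + delta_x
    return x

# Illustrative function with minimum at x = 3
x1 = gradient_descent_1d(lambda x: (x - 3.0) ** 2, x0=0.0)
```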
Newton method
1. Start at x0, k = 0.
2. Compute the Newton direction pk at xk from the gradient and second derivative.
3. Update xk+1 = xk + αk pk, set k = k + 1, and repeat until convergence.
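For a twice-differentiable 1-D function, the steps above look like the sketch below; the test function f(x) = x² + eˣ and the unit step αk = 1 are assumptions for illustration:

```python
import math

def newton_minimize(fprime, fsecond, x0, alpha=1.0, tol=1e-10, max_iter=50):
    """Newton's method for 1-D minimization:
    p_k = -f'(x_k) / f''(x_k),  x_{k+1} = x_k + alpha_k * p_k."""
    x = x0
    for _ in range(max_iter):
        p = -fprime(x) / fsecond(x)
        x = x + alpha * p
        if abs(p) < tol:          # step small enough: stationary point found
            break
    return x

# Illustrative: f(x) = x^2 + e^x, minimized where f'(x) = 2x + e^x = 0
xstar = newton_minimize(lambda x: 2 * x + math.exp(x),
                        lambda x: 2 + math.exp(x), x0=0.0)
```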
The second derivative of the Gaussian-shaped objective is

\nabla^2 G(x) = \frac{1}{\sqrt{2\pi}\,\sigma^3}\left[\frac{x^2}{\sigma^2} - 1\right]\exp\left(-\frac{x^2}{2\sigma^2}\right)

and at a stationary point the gradient vanishes:

g(x^*) = \nabla G(x^*) = 0
Quadratic functions
Examples of quadratic functions:
• Case 1: all eigenvalues positive, Hessian positive definite: a unique minimum.
• Case 2: eigenvalues have different signs, Hessian indefinite: a saddle point.
• Case 3: one eigenvalue is zero, Hessian positive semidefinite: a parabolic cylinder.
Optimization for quadratic functions
Assume that H is positive definite; then the stationary point is the unique global minimum. Prove it!
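A sketch of the proof, assuming the quadratic has the form f(x) = ½xᵀHx + xᵀb + c (this notation is an assumption, not fixed by the slides):

```latex
\nabla f(\mathbf{x}) = H\mathbf{x} + \mathbf{b} = \mathbf{0}
\quad\Rightarrow\quad
\mathbf{x}^{*} = -H^{-1}\mathbf{b},
\qquad
f(\mathbf{x}^{*} + \boldsymbol{\delta})
  = f(\mathbf{x}^{*}) + \tfrac{1}{2}\boldsymbol{\delta}^{T} H \boldsymbol{\delta}
  > f(\mathbf{x}^{*}) \;\; \forall\, \boldsymbol{\delta} \neq \mathbf{0}.
```

The last inequality uses exactly the positive definiteness of H, so x* is the unique global minimum.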
Steepest descent
[Figure: steepest-descent iterates on a test function with minimum at [1, 1].]
Conjugate gradient
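For a quadratic with positive-definite H, conjugate gradient converges in at most n steps. A minimal pure-Python sketch that minimizes ½xᵀHx − bᵀx by solving Hx = b; the 2×2 system is an illustrative assumption:

```python
def conjugate_gradient(H, b, x0, tol=1e-12):
    """Linear CG for symmetric positive definite H (list-of-lists matrix)."""
    n = len(b)
    matvec = lambda v: [sum(H[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    r = [bi - hv for bi, hv in zip(b, matvec(x))]    # residual b - Hx
    p = list(r)                                      # first search direction
    rs = dot(r, r)
    for _ in range(n):                               # at most n steps
        if rs < tol:
            break
        Hp = matvec(p)
        alpha = rs / dot(p, Hp)                      # exact line search step
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * hpi for ri, hpi in zip(r, Hp)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # H-conjugate direction
        rs = rs_new
    return x

# Illustrative 2x2 SPD system: solution is (1/11, 7/11)
x_cg = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```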
Quasi-Newton Method
• Set H0 = I, k = 0; compute g0 = ∇f(x0).
• Set dk = −Hk gk.
• Line search: find the value αk of α that minimizes f(xk + α dk); set δk = αk dk and xk+1 = xk + δk.
• If ‖δk‖ < ε, stop with x* = xk+1 and f* = f(xk+1). Else compute γk = gk+1 − gk and update according to the rank-one formula

  Hk+1 = Hk + (δk − Hk γk)(δk − Hk γk)ᵀ / [(δk − Hk γk)ᵀ γk]

  where δk and γk are the step and gradient changes; then set k = k + 1 and return to the direction step.
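A minimal Python sketch of this loop, using the rank-one update and a simple backtracking line search in place of an exact one; the quadratic test function, its analytic gradient, and the safeguards are illustrative assumptions:

```python
def quasi_newton(f, grad, x0, eps=1e-8, max_iter=200):
    """Quasi-Newton with H0 = I and the rank-one inverse-Hessian update."""
    n = len(x0)
    H = [[float(i == j) for j in range(n)] for i in range(n)]   # H0 = I
    x, g = list(x0), grad(x0)
    for _ in range(max_iter):
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        if sum(di * gi for di, gi in zip(d, g)) >= 0:   # safeguard: ensure descent
            d = [-gi for gi in g]
        a, fx = 1.0, f(x)                               # backtracking line search
        while f([xi + a * di for xi, di in zip(x, d)]) > fx and a > 1e-12:
            a *= 0.5
        delta = [a * di for di in d]                    # delta_k = alpha_k d_k
        x_new = [xi + di for xi, di in zip(x, delta)]
        g_new = grad(x_new)
        if max(abs(di) for di in delta) < eps:          # ||delta_k|| < eps: stop
            return x_new
        gamma = [gn - gi for gn, gi in zip(g_new, g)]   # gamma_k = g_{k+1} - g_k
        Hg = [sum(H[i][j] * gamma[j] for j in range(n)) for i in range(n)]
        u = [di - hgi for di, hgi in zip(delta, Hg)]    # delta_k - H_k gamma_k
        denom = sum(ui * gi for ui, gi in zip(u, gamma))
        if abs(denom) > 1e-12:                          # skip ill-defined updates
            for i in range(n):
                for j in range(n):
                    H[i][j] += u[i] * u[j] / denom      # rank-one update
        x, g = x_new, g_new
    return x

# Illustrative quadratic with minimum at (1, -0.5)
f_q = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2
grad_q = lambda x: [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)]
x_qn = quasi_newton(f_q, grad_q, [0.0, 0.0])
```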
Non-linear least squares
• Consider f(x) = ½ Σᵢ rᵢ(x)², with residuals rᵢ and Jacobian Jᵢⱼ = ∂rᵢ/∂xⱼ.
• Hence the gradient is ∇f(x) = J(x)ᵀ r(x).
• For the Hessian it holds that ∇²f(x) = JᵀJ + Σᵢ rᵢ(x) ∇²rᵢ(x).
• Dropping the second-order term gives the Gauss-Newton approximation: ∇²f(x) ≈ JᵀJ.
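As an illustration of the approximation, a Gauss-Newton iteration for a one-parameter model rᵢ(a) = exp(a·xᵢ) − yᵢ; the model and the synthetic data are assumptions, not the slides' example:

```python
import math

def gauss_newton_fit(xs, ys, a0, iters=25):
    """Gauss-Newton for a scalar parameter a in the model y = exp(a*x):
    step = (J^T J)^{-1} J^T r, i.e. the J^T J Hessian approximation."""
    a = a0
    for _ in range(iters):
        r = [math.exp(a * x) - y for x, y in zip(xs, ys)]   # residuals
        J = [x * math.exp(a * x) for x in xs]               # dr_i / da
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        a -= Jtr / JtJ                                      # Gauss-Newton step
    return a

# Synthetic noise-free data generated with a = 0.5
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(0.5 * x) for x in xs]
a_hat = gauss_newton_fit(xs, ys, a0=0.2)
```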
[Figures: CG, Newton, Quasi-Newton, and Gauss-Newton.]
Simplex
Constrained Optimization
Minimize f(x)
Subject to:
• Equality constraints: gᵢ(x) = 0
• Inequality constraints: hⱼ(x) ≤ 0
[Figure: at f(x) = 3, gradients of constraints and objective function are linearly independent.]
3D Example
[Figure: at f(x) = 1, gradients of constraints and objective function are linearly dependent.]
Inequality constraints
[Figure: objective contours with f3 > f2 > f1 and an inequality-constrained feasible region.]
Minimize f(x)
subject to: gᵢ(x) = 0 and hⱼ(x) ≤ 0.
We check if the KKT conditions are satisfied at a candidate point x*:
• Stationarity: ∇f(x*) + Σᵢ λᵢ ∇gᵢ(x*) + Σⱼ μⱼ ∇hⱼ(x*) = 0
• Primal feasibility: gᵢ(x*) = 0 and hⱼ(x*) ≤ 0
• Dual feasibility: μⱼ ≥ 0
• Complementary slackness: μⱼ hⱼ(x*) = 0 for all j
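As a worked illustration (the problem is an assumption, not the slides' example), minimize f(x) = x₁² + x₂² subject to x₁ + x₂ = 1:

```latex
L(\mathbf{x}, \lambda) = x_1^2 + x_2^2 + \lambda (x_1 + x_2 - 1)
\quad\Rightarrow\quad
\frac{\partial L}{\partial x_i} = 2 x_i + \lambda = 0
\;\Rightarrow\;
x_1 = x_2 = -\tfrac{\lambda}{2}.
```

Feasibility x₁ + x₂ = 1 then gives λ = −1, so x* = (½, ½). With no inequality constraints, stationarity and primal feasibility are the only KKT conditions to check, and both hold at x*.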
2. Minimize
subject to:
Prove it!
• QP can solve LP.
• If the LP minimizer exists, it must be one of the vertices of the feasible region.
• A fast method that considers vertices is the Simplex method.
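The vertex property can be checked directly on a small 2-D LP by brute-force vertex enumeration: intersect every pair of constraint boundaries, keep the feasible points, and take the cheapest. The constraint set and cost vector below are illustrative assumptions; the Simplex method visits vertices far more economically:

```python
from itertools import combinations

def lp_by_vertex_enumeration(A, b, c):
    """Minimize c.x over {x in R^2 : A x <= b} by enumerating candidate vertices."""
    best_val, best_x = None, None
    for i, j in combinations(range(len(b)), 2):
        det = A[i][0] * A[j][1] - A[i][1] * A[j][0]
        if abs(det) < 1e-12:
            continue                                   # parallel boundaries
        x = (b[i] * A[j][1] - A[i][1] * b[j]) / det    # Cramer's rule
        y = (A[i][0] * b[j] - b[i] * A[j][0]) / det
        if all(A[k][0] * x + A[k][1] * y <= b[k] + 1e-9 for k in range(len(b))):
            val = c[0] * x + c[1] * y
            if best_val is None or val < best_val:
                best_val, best_x = val, (x, y)
    return best_val, best_x

# Illustrative LP: minimize -3x - 2y  s.t.  x >= 0, y >= 0, x + y <= 4, x + 2y <= 6
A = [[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0], [1.0, 2.0]]
b = [0.0, 0.0, 4.0, 6.0]
val, vertex = lp_by_vertex_enumeration(A, b, [-3.0, -2.0])
```

The minimizer lands on a vertex of the feasible polygon, consistent with the vertex property stated above.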