
Conditions (33) and (34) can be summarized as follows:

x ≥ 0; f′(x) ≤ 0 and x·f′(x) = 0 (35)

Figure 11: the boundary solution; the global maximum is at the point x = 0.

For the minimisation problem the analogous conditions are


x ≥ 0; f′(x) ≥ 0 and x·f′(x) = 0 (36)
We can use conditions (35) as follows:
a) from the equation f′(x) = 0 we find x = x*, where f′(x*) = 0; if x* ≥ 0, then x* is a possible solution of the maximisation problem (32);
b) we check the boundary solution x = 0: if f is decreasing at this point, then x = 0 is a possible solution of problem (32);
c) if there is more than one possible solution, we compute the values of f at all points found by rules a) and b); the point with the largest of these values is the optimal solution of the maximisation problem (32) (a short computational sketch follows below).
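A minimal computational sketch of rules a)-c) using sympy; the two test functions below are illustrative choices of my own, not examples from the text:

```python
# Rules a)-c) for conditions (35): collect interior stationary points with x >= 0,
# always keep the boundary candidate x = 0, then compare the values of f.
import sympy as sp

x = sp.symbols('x', real=True)

def maximise_on_halfline(f):
    df = sp.diff(f, x)
    candidates = [sp.Integer(0)]                        # rule b): boundary point
    for root in sp.solve(sp.Eq(df, 0), x):              # rule a): f'(x) = 0
        if root.is_real and root >= 0:
            candidates.append(root)
    best = max(candidates, key=lambda c: f.subs(x, c))  # rule c): compare values
    return best, f.subs(x, best)

print(maximise_on_halfline(8*x - x**2))    # interior solution: (4, 16)
print(maximise_on_halfline(-x**2 - 2*x))   # boundary solution: (0, 0)
```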

Theorem 12. Let f be a strictly globally concave function.
I. If there exists only one solution x* > 0 of the equation f′(x) = 0, then x* is the global maximum point of f.
II. If the equation f′(x) = 0 has no positive solution, then x = 0 is optimal or the optimal solution of (32) does not exist.

Now we study the maximisation of a one-variable decision function with one inequality constraint:
max y = f(x)
g(x) ≤ ḡ (37)
x ≥ 0

Definition 4. The constraint g(x) ≤ ḡ of the problem (37) is called binding if it changes the maximal value of f(x) or the value of the optimal solution x* for x ≥ 0. If these values do not change, then the constraint is called slack.

From Figure 12 we see that for the constraint level h̄ the solutions of the problems (32) and (37) coincide (in both cases the optimal solution is x* = a and the maximal value of f is f(a)). However, for the constraint level ḡ the optimal solution of the problem (37) is x* = b with the maximal value f(b). Thus the constraint g(x) ≤ ḡ is binding and the constraint g(x) ≤ h̄ is slack. Consequently, the binding constraint holds as the equality g(b) = ḡ, while the slack constraint holds as the strict inequality g(a) < h̄.
For solving the problem (37) we also use the Lagrange function

G(x, λ, ḡ) = f(x) - λ[g(x) - ḡ]

as follows:
1. If the constraint g(x) ≤ ḡ is binding, then the maximum points and the maximal values of G are also the maximum points and the maximal values of f; then f = G (because g(x*) = ḡ). In this case λ > 0. Indeed, λ indicates approximately how much the value of f changes if ḡ increases by one unit. Since increasing ḡ enlarges the set of values of the variable x satisfying the constraint g(x) ≤ ḡ, the value of the objective function cannot decrease and therefore λ > 0.
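A small numeric illustration of this shadow-price reading of λ; the toy problem below (f(x) = 10x - x² with the constraint x ≤ ḡ) is an assumed example, not one taken from the text:

```python
# For g_bar < 5 the constraint is binding, so x* = g_bar and lambda = f'(g_bar);
# relaxing g_bar slightly raises the optimal value by roughly lambda per unit.

def f(x):
    return 10*x - x**2

def solve(g_bar):
    x_star = min(5.0, g_bar)        # the unconstrained maximiser is x = 5
    lam = max(0.0, 10 - 2*x_star)   # lambda = f'(x*) when the constraint binds, else 0
    return x_star, lam, f(x_star)

g_bar = 3.0
_, lam, value = solve(g_bar)
_, _, value_eps = solve(g_bar + 0.01)

print(lam)                          # 4.0
print((value_eps - value) / 0.01)   # about 3.99, close to lambda
```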

Figure 12: the binding and slack constraints.


2. If the constraint g(x) ≤ ḡ is slack, i.e. actually g(x*) < ḡ, then for f = G it is necessary that λ = 0.
Summarizing these observations, we can assert that the values of x and λ maximizing the objective function must satisfy one of the following conditions:
1) λ > 0 and the constraint g(x) ≤ ḡ is binding, i.e. g(x*) = ḡ,
2) λ = 0 and the constraint g(x) ≤ ḡ is slack, i.e. g(x*) < ḡ.
Conditions 1) and 2) can be summarized as follows:

λ ≥ 0; G′_λ ≥ 0; λG′_λ = 0 (38)
Similarly to (35) we get that G has a maximum with respect to x if
x ≥ 0; G′_x ≤ 0; xG′_x = 0 (39)
Conditions (38) and (39) are called the Kuhn-Tucker system or Kuhn-Tucker conditions for the maximisation problem (37).
The Kuhn-Tucker conditions for the minimisation problem
min y = f(x)
g(x) ≥ ḡ (40)
x ≥ 0
are
λ ≥ 0; G′_λ ≤ 0; λG′_λ = 0
x ≥ 0; G′_x ≥ 0; xG′_x = 0
However, for solving problem (40) it is also possible to use another method: first transform the given minimisation problem into a maximisation problem, using the relation min f(x) = -max(-f(x)), and then solve the obtained maximisation problem
max (-f(x))
g(x) ≥ ḡ
x ≥ 0
Applying the Kuhn-Tucker system we have four different possibilities:
(a) x = 0, λ = 0: the boundary solution, the slack constraint,
(b) x = 0, λ > 0: the boundary solution, the binding constraint,
(c) x > 0, λ = 0: the interior solution, the slack constraint,
(d) x > 0, λ > 0: the interior solution, the binding constraint.
In the case (d) the given problem is equivalent to the equality constrained optimisation problem (the sketch below illustrates the four cases for a concrete example).
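The following sketch enumerates the four possibilities for an assumed toy problem, max f(x) = 6x - x² subject to x ≤ 2, x ≥ 0, and checks which case yields a point satisfying the whole Kuhn-Tucker system (38)-(39); the problem and the code are illustrative additions, not part of the text:

```python
# G(x, lam) = f(x) - lam*(x - 2), so G'_x = 6 - 2x - lam and G'_lam = 2 - x.

def G_x(x, lam):
    return 6 - 2*x - lam

def G_lam(x, lam):
    return 2 - x

def kt_ok(x, lam, tol=1e-9):
    """Full Kuhn-Tucker system (38)-(39) for the maximisation problem."""
    return (x >= -tol and lam >= -tol
            and G_x(x, lam) <= tol and abs(x * G_x(x, lam)) <= tol
            and G_lam(x, lam) >= -tol and abs(lam * G_lam(x, lam)) <= tol)

# candidate points produced by the four cases
cases = {
    "(a) x=0, lam=0": (0.0, 0.0),
    "(b) x=0, lam>0": (0.0, 6.0),   # lam from G'_x = 0 at x = 0
    "(c) x>0, lam=0": (3.0, 0.0),   # x from G'_x = 0 at lam = 0
    "(d) x>0, lam>0": (2.0, 2.0),   # x from G'_lam = 0, lam from G'_x = 0
}
for name, (x, lam) in cases.items():
    print(name, "satisfies the Kuhn-Tucker system:", kt_ok(x, lam))
# Only case (d) passes here: the constraint is binding and the problem reduces
# to the equality-constrained one, as stated above.
```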
The Kuhn-Tucker system can also be generalized to the several-variable problem:
max f(x_1, …, x_n)
g_j(x_1, …, x_n) ≤ ḡ_j, j = 1, 2, …, m (41)
x_i ≥ 0, i = 1, …, n.
Then the Kuhn-Tucker system for the problem (41) is
λ_j ≥ 0; G′_λj ≥ 0; λ_jG′_λj = 0, j = 1, …, m
x_i ≥ 0; G′_xi ≤ 0; x_iG′_xi = 0, i = 1, …, n
where the Lagrange function is
G(x, λ) = f(x) + λ_1(ḡ_1 - g_1(x)) + … + λ_m(ḡ_m - g_m(x)); λ = (λ_1, …, λ_m), x = (x_1, …, x_n)

The necessary conditions for the Kuhn-Tucker system are:
1. all functions g_j are linear,
or
2. all functions g_j are convex for x ≥ 0 and there exists x such that g_j(x) < ḡ_j, j = 1, 2, …, m.

Remark 1. If the necessary conditions for the Kuhn-Tucker system are fulfilled, then we can look for the optimal solution only within the set of solutions of the Kuhn-Tucker system; in other words, if x* is an optimal solution of the optimisation problem, then x* is a solution of the Kuhn-Tucker system.

The sufficient conditions for the Kuhn-Tucker system are: the function f is concave and the functions g_j are convex for x ≥ 0.

Remark 2. If the sufficient conditions for the Kuhn-Tucker system are fulfilled, then the solution of the Kuhn-Tucker system is the optimal solution of the maximisation problem.

Solving an inequality constrained optimisation problem, we start from the corresponding equality constrained problem, assuming for a moment that the inequalities hold as equalities. Afterwards we consider the other possibilities.
Example 30. Solve the problem
max z = xy(9 - x - y)
x + y ≤ 5,
x, y ≥ 0.
Solution. First we notice that the constraint is linear and therefore the necessary conditions for the Kuhn-Tucker system are satisfied. As the domain of the points (x, y) defined by the constraints x + y ≤ 5, x, y ≥ 0 is bounded and closed and f is continuous, the extreme points of f exist. The Lagrange function is
G = xy(9 - x - y) + λ(5 - x - y).
As
G′_x = y(9 - x - y) - xy - λ
G′_y = x(9 - x - y) - xy - λ
G′_λ = 5 - x - y,
the Kuhn-Tucker system consists of the relations
x[y(9 - x - y) - xy - λ] = 0,  y(9 - x - y) - xy - λ ≤ 0
y[x(9 - x - y) - xy - λ] = 0,  x(9 - x - y) - xy - λ ≤ 0
λ(5 - x - y) = 0,  5 - x - y ≥ 0,  x, y, λ ≥ 0.
Now we need to solve the system of equations consisting of the three equality constraints of the Kuhn-Tucker system:
x[y(9 - x - y) - xy - λ] = 0
y[x(9 - x - y) - xy - λ] = 0
λ(5 - x - y) = 0
This system is equivalent to eight simpler systems of equations:
1) y(9 - x - y) - xy - λ = 0,  x(9 - x - y) - xy - λ = 0,  5 - x - y = 0
2) y(9 - x - y) - xy - λ = 0,  x(9 - x - y) - xy - λ = 0,  λ = 0
3) x[y(9 - x - y) - xy - λ] = 0,  y = 0,  5 - x - y = 0
4) x[y(9 - x - y) - xy - λ] = 0,  y = 0,  λ = 0
5) x = 0,  x(9 - x - y) - xy - λ = 0,  5 - x - y = 0
6) x = 0,  x(9 - x - y) - xy - λ = 0,  λ = 0
7) x = 0,  y = 0,  5 - x - y = 0
8) x = 0,  y = 0,  λ = 0

However, we do not need to solve all eight systems. We started with the Lagrange method for the equality-constrained maximisation problem; hence we assume that x, y > 0 and the constraint is binding. Then we get the system 1). From the first and the second equation of this system we have
λ = y(9 - x - y) - xy = x(9 - x - y) - xy
y(9 - x - y) = x(9 - x - y)
x + y = 9 or x = y.

The first equality x + y = 9 does not satisfy the constraint x + y ≤ 5. The second equality x = y gives
5 - 2x = 0, x* = 2,5, y* = 2,5
by the third equation of 1). Now from the first equation of 1) we find λ* = 3,75. We see that the Kuhn-Tucker conditions are satisfied: x*, y*, λ* > 0 and all the derivatives are equal to 0. As λ* > 0, the constraint is binding. We calculate z(2,5; 2,5) = 25. Further we notice that x* = 0 or y* = 0 in the systems 3)-8), and z = 0 < z(2,5; 2,5) = 25 at all points satisfying these systems. From the system 2) we have

y(9 - x - y) - xy = 0
x(9 - x - y) - xy = 0

The solution of this system with both x and y different from zero is x = y = 3, but this solution does not satisfy the constraint x + y ≤ 5. Therefore the optimal solution is x* = y* = 2,5.
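As a cross-check of Example 30, the same problem can be solved numerically; the sketch below uses scipy's SLSQP solver (minimising -z), which is an added illustration, not the method used in the text:

```python
# Maximise z = x*y*(9 - x - y) subject to x + y <= 5, x, y >= 0.
import numpy as np
from scipy.optimize import minimize

z = lambda v: v[0] * v[1] * (9 - v[0] - v[1])

res = minimize(lambda v: -z(v),
               x0=np.array([1.0, 1.0]),
               bounds=[(0, None), (0, None)],
               constraints=[{"type": "ineq", "fun": lambda v: 5 - v[0] - v[1]}])

print(res.x)     # approximately [2.5, 2.5]
print(-res.fun)  # approximately 25, in agreement with z(2,5; 2,5) = 25
```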

Remark 3. For solving Example 30 it is not possible to use Remark 2, because the function z = xy(9 - x - y) is not globally concave:
f″_xx = -2y,  f″_yy = -2x,  f″_xy = 9 - 2x - 2y,  Δ = 4xy - (9 - 2x - 2y)²;
thus the sign of Δ is not uniquely determined for all x, y ≥ 0.
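A quick symbolic verification of Remark 3 with sympy (a sketch added for illustration):

```python
# The Hessian determinant of z = x*y*(9 - x - y) changes sign on x, y >= 0,
# so z is not globally concave there.
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = x*y*(9 - x - y)

H = sp.hessian(z, (x, y))
delta = sp.simplify(H.det())
print(delta)                     # 4*x*y - (9 - 2*x - 2*y)**2, up to rearrangement
print(delta.subs({x: 3, y: 3}))  # 36 - 9 = 27 > 0 (locally concave region)
print(delta.subs({x: 0, y: 0}))  # -81 < 0, so the sign of delta is not fixed
```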

Example 31. Solve the problem
max z = xy(9 - x - y)
x + y ≤ 8,
x, y ≥ 0.
Solution. The Lagrange function is
G = xy(9 - x - y) + λ(8 - x - y)
and, assuming for a moment that the variables x and y are positive and the constraint is binding, we get the system
G′_x = y(9 - x - y) - xy - λ = 0
G′_y = x(9 - x - y) - xy - λ = 0
G′_λ = 8 - x - y = 0
Solving this system, we get x* = y* = 4 and λ* = -12. But the negative λ* contradicts the Kuhn-Tucker conditions. For the other cases with non-zero λ we get z = 0, since x* = 0 or y* = 0. We examine the case x > 0, y > 0 and λ = 0. Then we have
y(9 - x - y) - xy = 0
x(9 - x - y) - xy = 0
Solving this system, we get x* = y* = 3. This solution satisfies the constraint G′_λ ≥ 0, because x* + y* = 3 + 3 = 6 ≤ 8. The other candidates for λ = 0 are the points where x or y is equal to 0; but in these cases f(x, y) = 0 < f(3, 3) = 27. Thus the optimal solution is x* = y* = 3.
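The same kind of numerical cross-check for Example 31 (again a sketch, not the method used in the text): with the looser constraint x + y ≤ 8 the interior stationary point (3, 3) becomes feasible.

```python
import numpy as np
from scipy.optimize import minimize

z = lambda v: v[0] * v[1] * (9 - v[0] - v[1])

res = minimize(lambda v: -z(v),
               x0=np.array([1.0, 1.0]),
               bounds=[(0, None), (0, None)],
               constraints=[{"type": "ineq", "fun": lambda v: 8 - v[0] - v[1]}])

print(res.x, -res.fun)  # approximately [3. 3.] 27.0; the constraint is slack,
                        # consistent with lambda = 0 found above
```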
Example 32 (see [3], 313). Suppose that rice and fish are the only foods available. It is necessary to obtain at least 1000 calories, 1,25 grams of sodium (Na), 25 grams of protein and 50 grams of carbohydrates. The following table gives the quantity of each of these components provided by one portion of rice and one portion of fish:

                      rice    fish
Energy (cal)           70     120
Sodium (Na, g)          0     0,25
Protein (g)             3      15
Carbohydrates (g)      26       0

A portion of rice costs 7 cents and a portion of fish 30 cents. Find the minimum-cost way of satisfying all the requirements, find the shadow prices and explain what they mean.
Solution. Let x be the number of rice portions and y the number of fish portions. Then we can formulate this problem mathematically as follows:
min z = 7x + 30y,
s.t. 70x + 120y ≥ 1000 (En)
0,25y ≥ 1,25 (Na)
3x + 15y ≥ 25 (Prot)
26x ≥ 50 (Ch)
x, y ≥ 0.

For solving we can use the simplex method or the graphical method. We apply the graphical method. The feasible set is the shaded area in Figure 13.

Figure 13: the feasible set of the problem posed in Example 32.

The lowest level curve of the objective function having a common point with the feasible set (denoted by A in Figure 13) is 7x + 30y = 190. The level curves 7x + 30y = C with C > 190 lie above this level curve. Therefore the optimal solution is the point A. The coordinates of A can be found from the system
70x + 120y = 1000
0,25y = 1,25
Thus y* = 5 and x* = 40/7 ≈ 5,7.
For finding the shadow prices we use the Kuhn-Tucker conditions. As all the constraints are linear, the optimal solution must be a solution of the Kuhn-Tucker system. The Lagrange function of our problem is
G = 7x + 30y - λ_1(70x + 120y - 1000) - λ_2(0,25y - 1,25) - λ_3(3x + 15y - 25) - λ_4(26x - 50)
and the Kuhn-Tucker system is
G′_x = 7 - 70λ_1 - 3λ_3 - 26λ_4 ≥ 0
G′_y = 30 - 120λ_1 - 0,25λ_2 - 15λ_3 ≥ 0
x, y, λ_1, λ_2, λ_3, λ_4 ≥ 0
xG′_x = x(7 - 70λ_1 - 3λ_3 - 26λ_4) = 0 (42)
yG′_y = y(30 - 120λ_1 - 0,25λ_2 - 15λ_3) = 0 (43)
G′_λ1 = 1000 - 70x - 120y ≤ 0,  λ_1(1000 - 70x - 120y) = 0
G′_λ2 = 1,25 - 0,25y ≤ 0,  λ_2(1,25 - 0,25y) = 0
G′_λ3 = 25 - 3x - 15y ≤ 0,  λ_3(25 - 3x - 15y) = 0
G′_λ4 = 50 - 26x ≤ 0,  λ_4(50 - 26x) = 0
As the protein and carbohydrate constraints are slack (see Figure 13), we can take λ_3 = λ_4 = 0. As x*, y* > 0, we have from (42) and (43) that
7 - 70λ_1 = 0
30 - 120λ_1 - 0,25λ_2 = 0

From this system we get λ_1* = 0,1 and λ_2* = 72; λ_3* = λ_4* = 0. These values are called the shadow prices of the components of the food: λ_1* = 0,1 shows that increasing the energy requirement by 1 cal costs an additional 0,1 cents; λ_2* = 72 means that increasing the Na requirement by one unit (gram) costs an additional 72 cents; λ_3* = λ_4* = 0 shows that increasing the protein or carbohydrate requirements causes no additional cost.
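A small numeric check of the corner point A and of the shadow prices (a sketch; it simply solves the two linear systems written above with numpy):

```python
import numpy as np

# corner A: 70x + 120y = 1000, 0.25y = 1.25
x_star, y_star = np.linalg.solve([[70, 120], [0, 0.25]], [1000, 1.25])
print(x_star, y_star)        # about 5.714 (= 40/7) and 5.0
print(7*x_star + 30*y_star)  # minimal cost 190 cents

# shadow prices from the reduced equations 7 - 70*l1 = 0, 30 - 120*l1 - 0.25*l2 = 0
l1, l2 = np.linalg.solve([[70, 0], [120, 0.25]], [7, 30])
print(l1, l2)                # 0.1 and 72.0
```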

Example 33 (see [3], 313-314). A small company manufactures two different types of computers: the basic model (X) and the modified model (Y). The basic model brings in a revenue of p(x) = (100 - x)$ per unit and the modified model (500 - y)$ per unit, where x and y are the quantities of the basic and the modified models respectively. The firm employs five workers who assemble the computers and two workers who run diagnostic tests on the computers before they are sold. Each employee works 40 hours per week. It takes 1 hour to assemble the basic model and three hours to assemble the modified model. It takes 1 hour to run the diagnostic tests on the modified model, but only 0,5 hours on the basic model. How many computers of each type should the firm make to maximize the profit subject to the constraints on assembly and testing time, if the basic model is produced at a cost of 10$ per unit while the modified model is produced at a cost of 25$ per unit?
Solution. The revenue function is (500 - y)y + (100 - x)x and the cost function is 25y + 10x, thus the profit function is
K = (500 - y)y + (100 - x)x - 25y - 10x = 475y + 90x - y² - x²
and the numbers of computers produced in the first week must satisfy the constraints
x + 3y ≤ 200
0,5x + y ≤ 80.
Thus we have to solve the problem
max K = 475y + 90x - y² - x²
x + 3y ≤ 200 (44)
0,5x + y ≤ 80, (45)
x, y ≥ 0.
The Lagrange function is
G = 475y + 90x - y² - x² - λ_1(x + 3y - 200) - λ_2(0,5x + y - 80)
and the Kuhn-Tucker system is
G′_x = 90 - 2x - λ_1 - 0,5λ_2 ≤ 0,  xG′_x = 0
G′_y = 475 - 2y - 3λ_1 - λ_2 ≤ 0,  yG′_y = 0,  x, y, λ_1, λ_2 ≥ 0
G′_λ1 = 200 - x - 3y ≥ 0,  λ_1G′_λ1 = 0
G′_λ2 = 80 - 0,5x - y ≥ 0,  λ_2G′_λ2 = 0
First we show that the sufficient conditions for the Kuhn-Tucker system are fulfilled. For this purpose we find that K′_x = 90 - 2x, K′_y = 475 - 2y, K″_xy = K″_yx = 0 and
K″_xx = K″_yy = -2 < 0, Δ = 4 > 0 for every (x, y).
Hence K(x, y) is a strictly concave function everywhere in its domain by Theorem 8. As all the constraints of our problem are linear, the functions in these constraints are also convex. Thus the sufficient conditions for the Kuhn-Tucker system are fulfilled. This means that once we have found one solution of the Kuhn-Tucker system, we do not need to look for other solutions, because this solution is the unique optimal solution of our problem.
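A short symbolic confirmation of the strict concavity of K (a sketch added for illustration):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
K = 475*y + 90*x - y**2 - x**2

H = sp.hessian(K, (x, y))
print(H)        # Matrix([[-2, 0], [0, -2]])
print(H.det())  # 4 > 0, with K''_xx = -2 < 0, so K is strictly concave everywhere
```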
Now we assume that x, y, λ_1, λ_2 > 0, i.e. all the constraints of our problem are binding. Then G′_x = G′_y = G′_λ1 = G′_λ2 = 0, or
90 - 2x - λ_1 - 0,5λ_2 = 0
475 - 2y - 3λ_1 - λ_2 = 0
200 - x - 3y = 0
80 - 0,5x - y = 0
Solving this system, we have x = 80, y = 40, λ_1 = 535 > 0, λ_2 = -1210 < 0. We see that this solution does not satisfy the Kuhn-Tucker system. Thus λ_1 and λ_2 cannot both be positive at the same time, i.e. both constraints cannot be binding. Consequently we need to analyze the following cases:
I. λ_2* = 0 and λ_1* > 0, i.e. the constraint (44) is binding and the constraint (45) is slack,
II. λ_1 = λ_2 = 0, i.e. both constraints are slack,
III. λ_1 = 0 and λ_2 > 0, i.e. the constraint (44) is slack and the constraint (45) is binding.
In the case I it is not difficult to see that the Kuhn-Tucker conditions are not fulfilled for x = y = 0, because then G′_λ1 = 200 ≠ 0, while λ_1 > 0 requires λ_1G′_λ1 = 0. In addition, in the case I we get the following possibilities:
1) x, y > 0:  90 - 2x - λ_1 = 0,  475 - 2y - 3λ_1 = 0,  200 - x - 3y = 0
2) x = 0, y > 0:  475 - 2y - 3λ_1 = 0,  200 - 3y = 0
3) y = 0, x > 0:  90 - 2x - λ_1 = 0,  200 - x = 0.
For the possibility 1) we have x = -43/4 < 0, thus the Kuhn-Tucker conditions are not fulfilled. For the possibility 2) we get y = 200/3, λ_1 = 1025/9. Hence we have obtained the solution
x = 0, y = 200/3 = 66 2/3 (46)
satisfying all the constraints of the Kuhn-Tucker system. Therefore we do not need to analyze the possibility 3) or the cases II and III. If a fractional solution were feasible, the optimal solution would be given by (46). However, the context of our problem requires that the solution be an integer. Thus it is possible to produce 66 modified computers, and the remaining 2/3 means that 2 hours can be used for assembling the basic computers, i.e. we can produce 2 basic computers.
The value λ_1* = 1025/9 means that by increasing the working hours for assembling computers by 1 hour we gain 1025/9 $ of additional profit, while increasing the working hours for testing by 1 hour brings no additional profit.
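As a cross-check of Example 33, the continuous (non-integer) optimum can be recomputed numerically; the sketch below again uses scipy's SLSQP solver and is an added illustration, not part of the original solution:

```python
# Maximise K = 475y + 90x - y**2 - x**2 under both time constraints.
import numpy as np
from scipy.optimize import minimize

K = lambda v: 475*v[1] + 90*v[0] - v[1]**2 - v[0]**2

res = minimize(lambda v: -K(v),
               x0=np.array([10.0, 10.0]),
               bounds=[(0, None), (0, None)],
               constraints=[{"type": "ineq", "fun": lambda v: 200 - v[0] - 3*v[1]},
                            {"type": "ineq", "fun": lambda v: 80 - 0.5*v[0] - v[1]}])

print(res.x)     # approximately [0, 66.67], i.e. x* = 0, y* = 200/3
print(K(res.x))  # approximately 27222.2
# Rounding to whole computers as in the text gives y = 66 and x = 2.
```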

Problems

38. Solve with the help of the Kuhn-Tucker system:

a) max z = 10 - x² - y² - 2y + 2x + 2,  x, y ≥ 0,  x + y ≤ 5
b) max z = 1 + xy - 0,5x² - y² - 2y,  x, y ≥ 0,  6x + 2y ≤ 2
c) max z = 4x - x² - y² - 2y - 1,  x, y ≥ 0,  x + 3y ≤ 3
d) min z = 2x² + y² - xy - y - x,  x, y ≥ 0,  2x + y ≥ 1
e) min z = x² + y² + 4x - 8y + 20,  x, y ≥ 0,  x + 2y ≥ 1
f) max z = 6x + 3y,  x, y ≥ 0,  3x + 2y ≤ 3,  4x + y ≤ 2
g) max z = 50x + 10y + 2,  x, y ≥ 0,  x - y ≤ 3,  5x + 2y ≤ 20

17. Dynamic programming

Dynamic programming deals with programming problems in which the values of the variables are found step by step. There exist cases where, after finding the value of one variable, it is possible to obtain additional information for finding the next variables. Such problems arise in the programming of stochastic processes changing in time (the additional information may be certain observed values of randomly changing variables). Then it is not reasonable to fix the values of all variables at the starting point of the process, because later we have more information.
Such step-by-step solving may also be useful for completely determined processes, because finding the values of a great number of variables at once is often technically unsolvable or very troublesome. Reformulating the problem so that we can treat it as a step-by-step problem, where in each step we must find the values of a smaller number of variables, we obtain a sequence of solvable problems. The first method for solving dynamic programming problems belongs to Bellman.

The principle of Bellman. Let (y_1, y_2, …, y_n) be the optimal program for the n-step process. The optimal program has the property that for each i = 1, 2, …, n-1 the tuple (y_{i+1}, …, y_n) is the optimal program for the (n - i)-step process.

We consider the following problem, presented in [10], 448-453. Let there be given a quantity x_1 ≥ 0, which can be divided into two parts by the decision of the plan-maker (the quantities of capital for two branches of production, for example):
y_1 ≥ 0 and x_1 - y_1 ≥ 0.
The revenue which we get from the part y_1 is defined by the function g(y_1) and the revenue from the part x_1 - y_1 by the function h(x_1 - y_1).

Step I. The best dividing program is the one for which the total revenue
S_1(x_1, y_1) = g(y_1) + h(x_1 - y_1)
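Since the text breaks off here, the following sketch only illustrates step I for assumed revenue functions g and h (both are invented for the example); it maximises S_1(x_1, y_1) over a grid of admissible values of y_1:

```python
import numpy as np

g = lambda y: np.sqrt(y)   # assumed revenue from the first branch
h = lambda r: 0.4 * r      # assumed revenue from the second branch

def best_split(x1, steps=1000):
    """Maximise S1(x1, y1) = g(y1) + h(x1 - y1) over 0 <= y1 <= x1 on a grid."""
    y = np.linspace(0.0, x1, steps + 1)
    s = g(y) + h(x1 - y)
    i = int(np.argmax(s))
    return y[i], s[i]

y1_star, s1_star = best_split(10.0)
print(y1_star, s1_star)  # optimal y1 is about 1.56, where g'(y1) = h'(x1 - y1)
```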

