
Operations Research

Dynamic Programming Application

COLOGNE UNIVERSITY OF APPLIED SCIENCES


March 9, 2015
Mohammad Aminul Islam (MatrNo: 01110381212)

Contents

1 Introduction
2 Basic Concept about Dynamic Programming
3 Application (Shortest Path Calculation)
  3.1 Greedy Algorithm
  3.2 Brute Force
  3.3 Dynamic Programming
4 Conclusion
References


1 Introduction
Dynamic programming is a method for solving a difficult problem by breaking it down into smaller subproblems of the same kind. To solve a given problem with dynamic programming we divide the problem, solve the individual parts and combine their solutions. Dynamic programming overcomes the overlapping-subproblem and time-consumption issues that arise, for example, in shortest path calculation between two points. Its practical applications are huge. It introduces techniques such as memoization and subdivision into subproblems, which have a big impact on algorithm design, programming and many other complex problems. Dynamic programming was introduced by Richard Bellman in the 1940s for finding the best decisions one after another. Today every programmer and designer considers it a standard technique for problem solving.1

2 Basic Concept about Dynamic Programming


Now we will look at an example, calculating Fibonacci numbers with a naive recursive algorithm and with dynamic programming, to get a basic idea of the concept.
The Fibonacci sequence is
1, 1, 2, 3, 5, 8, 13, 21, ...
In mathematical terms the Fibonacci numbers are defined by
F(n) = F(n-1) + F(n-2), with F(1) = 1 and F(2) = 1.
Naive recursive algorithm:

fib(n):
    if n <= 2: f = 1
    else: f = fib(n - 1) + fib(n - 2)
    return f

This is not a good algorithm because it takes exponential time: the running time satisfies T(n) = T(n - 1) + T(n - 2) + O(1), which itself grows like the Fibonacci numbers.
Dynamic programming (memoized version):

memo = {}
fib(n):
    if n in memo: return memo[n]
    if n <= 2: f = 1
    else: f = fib(n - 1) + fib(n - 2)
    memo[n] = f
    return f

Here the exponential time problem disappears, because every value fib(n) is computed only once.2

Why is dynamic programming efficient?

Figure 1: Recursion tree of the Fibonacci computation

According to the picture, F(n-3) occurs twice in the recursion tree. With the naive recursive algorithm we would have to calculate the same value twice, which is a waste of time. With dynamic programming we do not need to calculate the same value again, because it is already stored in memory; we only have to look up the result.2
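To make the difference concrete, here is a small runnable Python sketch of both versions. The call counter and the use of functools.lru_cache are only there for illustration and are not part of the pseudocode above.

from functools import lru_cache

calls = 0

def fib_naive(n):
    # Recomputes the same subproblems over and over (exponential time).
    global calls
    calls += 1
    if n <= 2:
        return 1
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each value is computed once and then read back from the cache.
    if n <= 2:
        return 1
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(25), calls)   # 75025 after 150049 recursive calls
print(fib_memo(25))           # 75025 after computing only 25 distinct values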


3 Application (Shortest path calculation)


As an application of dynamic programming we will calculate the shortest path in a multistage graph. We will also work through the same problem with the other algorithms.

Figure 2: Multistage graph

According to the figure there are 5 stages and 12 nodes. We will calculate the shortest path from node 1 to node 12 using the greedy algorithm, brute force and dynamic programming. The weight of the edge from node 1 to node 2 is 9, and the weights from node 1 to nodes 3, 4 and 5 are 7, 3 and 2 respectively. Node 1 stands on stage one, nodes 2, 3, 4 and 5 on stage two, nodes 6, 7 and 8 on stage three, nodes 9, 10 and 11 on stage four, and the destination node 12 on stage five.3
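For the code sketches in the following subsections, the graph of Figure 2 can be encoded as a Python dictionary that maps every node to its successors and edge weights. Only the stage-one weights are stated explicitly in the text above; the remaining weights are our reading of Figure 2 and should be treated as assumptions, chosen so that they reproduce the costs worked out in Section 3.3.

# Adjacency dictionary for the multistage graph of Figure 2.
# Stage 1: {1}, stage 2: {2, 3, 4, 5}, stage 3: {6, 7, 8},
# stage 4: {9, 10, 11}, stage 5: {12}.
# Weights beyond stage one are assumed from the figure, not quoted from the text.
graph = {
    1: {2: 9, 3: 7, 4: 3, 5: 2},
    2: {6: 4, 7: 2, 8: 1},
    3: {6: 2, 7: 7},
    4: {8: 11},
    5: {7: 11, 8: 8},
    6: {9: 6, 10: 5},
    7: {9: 4, 10: 3},
    8: {10: 5, 11: 6},
    9: {12: 4},
    10: {12: 2},
    11: {12: 5},
    12: {},
}

The greedy, brute force and dynamic programming sketches below all operate on this graph dictionary.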



3.1 Greedy Algorithm


Now we will solve the shortest path problem with the greedy algorithm. To move from one stage to the next we always select the smallest outgoing edge.

Figure 3: Shortest path according to the greedy algorithm

According to the greedy choice, from stage one to stage two we select the shortest edge, which has weight 2. From stage two to stage three the chosen edge has weight 8, from stage three to stage four it has weight 5, and from stage four to stage five it has weight 2. The total weight is 17.
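A minimal sketch of this greedy strategy, reusing the graph dictionary from Section 3 (so all weights beyond stage one are the assumed ones noted there):

def greedy_path(graph, source=1, target=12):
    # At every node follow the locally cheapest outgoing edge,
    # without looking ahead to later stages.
    path, total, node = [source], 0, source
    while node != target:
        nxt = min(graph[node], key=graph[node].get)  # cheapest edge from here
        total += graph[node][nxt]
        path.append(nxt)
        node = nxt
    return path, total

print(greedy_path(graph))  # ([1, 5, 8, 10, 12], 17) -- not optimal, see Figure 4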


Figure 4: Opposite of the greedy algorithm

Figure 4 shows the complete opposite of the greedy choice. From stage one to stage two this path takes the biggest edge, of weight 9, which is exactly the opposite of what the greedy algorithm does. From stage two to stage three the weight is 2, from stage three to stage four it is 3, and from stage four to stage five it is 2. The total weight is 16. Although it contradicts the greedy choice, this is the shortest path. So the greedy algorithm fails here.


3.2 Brute Force


Now we will calculate the shortest path with the help of brute force. With brute force we have to examine every possible combination of edges from node 1 to node 12.

Figure 5: All possible combination tree


As Figure 5 shows, this requires a lot of calculation, which means wasted time. In addition there is an overlapping problem: the same partial paths have to be calculated again and again.
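A brute force sketch that enumerates every path from node 1 to node 12 over the same graph dictionary and keeps the cheapest one. With five stages there are only 16 candidate paths, but their number grows multiplicatively with every additional stage:

def all_paths(graph, node, target, path=None, cost=0):
    # Recursively enumerate every path from `node` to `target`, with its cost.
    path = (path or []) + [node]
    if node == target:
        yield path, cost
        return
    for nxt, weight in graph[node].items():
        yield from all_paths(graph, nxt, target, path, cost + weight)

paths = list(all_paths(graph, 1, 12))
best = min(paths, key=lambda pc: pc[1])
print(len(paths))  # 16 candidate paths are examined
print(best)        # ([1, 2, 7, 10, 12], 16) -- one of the two optimal paths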


3.3 Dynamic Programming


Now we will solve this problem by using dynamic programming.

Figure 6: Multistage graph.

Let c(j, l) be the weight of the edge from node j to node l, and let Cost(i, j) be the length of the shortest path from node j in stage i to the destination node 12. Then

Cost(i, j) = min over all successors l of node j of ( c(j, l) + Cost(i + 1, l) )

We are going to solve this recurrence by the backward method, starting from the stage next to the destination.
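The recurrence translates almost directly into a memoized Python function over the graph dictionary from Section 3; because every node belongs to exactly one stage, the stage index i can stay implicit in the node number.

memo = {}

def cost(j, target=12):
    # Cost(i, j): shortest distance from node j to the target node,
    # computed backward and stored in `memo` so it is evaluated only once.
    if j == target:
        return 0
    if j not in memo:
        memo[j] = min(w + cost(l, target) for l, w in graph[j].items())
    return memo[j]

print(cost(1))  # 16, the length of the shortest path from node 1 to node 12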
Cost(3, 6), for example, means the cost from node 6 in the third stage.

Third stage:
Cost(3, 6) = min(c(6, 9) + Cost(4, 9), c(6, 10) + Cost(4, 10)) = 7
Cost(3, 7) = min(c(7, 9) + Cost(4, 9), c(7, 10) + Cost(4, 10)) = 5
Cost(3, 8) = min(c(8, 10) + Cost(4, 10), c(8, 11) + Cost(4, 11)) = 7


Figure: Third stage minimum weight

Second stage:
Cost(2, 2) = min(4 + 7, 2 + 5, 1 + 7) = 7
Cost(2, 3) = min(2 + 7, 7 + 5) = 9
Cost(2, 4) = min(11 + 7) = 18
Cost(2, 5) = min(11 + 5, 8 + 7) = 15

Figure: Second stage minimum path


First stage:
Cost(1, 1) = min(9 + 7, 7 + 9, 3 + 18, 2 + 15) = 16

Figure: Shortest path

The length of the shortest path is 16. We solved the problem by the backward method; there is no overlapping computation and no unnecessary delay. We found two shortest paths in our multistage graph. Every stage was calculated only once and the result was saved in memory, so whenever the result was needed again we simply looked it up.
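The two shortest paths can also be recovered from the memoized costs by walking forward from node 1 and only following edges that lie on a shortest path. This sketch reuses the graph dictionary and the cost function from the earlier snippets; the concrete node sequences it prints depend on the assumed weights.

def optimal_paths(node, target=12):
    # Enumerate every path whose total length equals the optimal cost,
    # by only taking edges that are part of some shortest path.
    if node == target:
        yield [target]
        return
    for nxt, weight in graph[node].items():
        if weight + cost(nxt) == cost(node):   # edge lies on a shortest path
            for rest in optimal_paths(nxt, target):
                yield [node] + rest

print(list(optimal_paths(1)))
# With the assumed weights: [[1, 2, 7, 10, 12], [1, 3, 6, 10, 12]],
# i.e. the two shortest paths of length 16 found above.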

4 Conclusion
Finally, we can say that dynamic programming is a very effective way to solve problems of this kind. Whenever we design something we should consider time and cost, and the dynamic programming concept can provide such a solution.


References
1. Wikipedia: http://en.wikipedia.org/wiki/Dynamic_programming#History
2. MIT: https://www.youtube.com/watch?v=OQ5jsbhAv_M
3. Shortest path: https://www.youtube.com/watch?v=m5Y-4TsXsJ0
4. https://www.youtube.com/watch?v=WN3Rb9wVYDY

