6.046J Lecture 1
Prof. Shafi Goldwasser
Prof. Erik Demaine

Welcome to Introduction to Algorithms, Spring 2004
Handouts
1. Course Information
2. Course Calendar
3. Problem Set 1
4. Akra-Bazzi Handout
Course information
1. Staff
2. Prerequisites
3. Lectures & Recitations
4. Handouts
5. Textbook (CLRS)
6. Website
8. Extra Help
9. Registration
10. Problem sets
11. Describing algorithms
12. Grading policy
13. Collaboration policy
Insertion sort
Pseudocode (input: an array A[1 . . n]):

INSERTION-SORT (A, n)
    for j ← 2 to n
        do key ← A[j]
           i ← j − 1
           while i > 0 and A[i] > key
               do A[i+1] ← A[i]
                  i ← i − 1
           A[i+1] ← key

[Figure: the array A, with a sorted prefix to the left of position j; key = A[j] is inserted into the sorted prefix.]
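As a runnable sketch (not part of the original slides), the pseudocode above translates to Python as follows; the only change is 0-based indexing in place of the slides' A[1 . . n]:

```python
def insertion_sort(a):
    """Sort list a in place, mirroring INSERTION-SORT(A, n) above."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift elements of the sorted prefix that exceed key one slot right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

For example, `insertion_sort([8, 2, 4, 9, 3, 6])` yields `[2, 3, 4, 6, 8, 9]`.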
[Example: the slides animate insertion sort on a sample array, step by step, until it is fully sorted. Animation frames omitted.]
Running time
The running time depends on the input: an
already sorted sequence is easier to sort.
Major Simplifying Convention:
Parameterize the running time by the size of
the input, since short sequences are easier to
sort than long ones.
T_A(n) = time of algorithm A on inputs of length n
Generally, we seek upper bounds on the
running time, to have a guarantee of
performance.
Kinds of analyses
Worst-case: (usually)
T(n) = maximum time of algorithm
on any input of size n.
Average-case: (sometimes)
T(n) = expected time of algorithm
over all inputs of size n.
Need assumption of statistical
distribution of inputs.
Best-case: (NEVER)
Cheat with a slow algorithm that
works fast on some input.
Machine-independent time
What is insertion sort's worst-case time?
BIG IDEAS:
Ignore machine-dependent constants; otherwise it is impossible to verify and to compare algorithms.
Look at the growth of T(n) as n → ∞.
"Asymptotic Analysis"
Θ-notation
DEF: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }
Basic manipulations:
Drop low-order terms; ignore leading constants.
Example: 3n^3 + 90n^2 − 5n + 6046 = Θ(n^3)
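An illustrative numerical check (not from the slides): for f(n) = 3n^3 + 90n^2 − 5n + 6046, the ratio f(n)/n^3 settles near the leading constant 3 as n grows, showing that the low-order terms wash out:

```python
def f(n):
    # The polynomial from the slide's example.
    return 3 * n**3 + 90 * n**2 - 5 * n + 6046

# The ratio f(n)/n^3 approaches 3, the leading coefficient.
for n in [10, 100, 1000, 10000]:
    print(n, f(n) / n**3)
```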
Asymptotic performance
When n gets large enough, a Θ(n^2) algorithm always beats a Θ(n^3) algorithm.
[Figure: T(n) versus n; past some crossover point n0, the Θ(n^2) curve stays below the Θ(n^3) curve.]
Asymptotic analysis is a useful tool to help structure our thinking toward better algorithm design.
We shouldn't ignore asymptotically slower algorithms, however. Real-world design situations often call for a careful balancing of engineering objectives.
Analysis of insertion sort: inserting A[j] costs Θ(j) in the worst case, so
T(n) = Σ (j = 2 to n) Θ(j) = Θ(n^2)   [arithmetic series]
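A quick numerical check of the arithmetic series (a sketch, not part of the slides): the sum of j from 2 to n has the closed form n(n+1)/2 − 1, whose dominant term n^2/2 is Θ(n^2):

```python
def series(n):
    # Sum of j for j = 2 .. n, the worst-case step count shape above.
    return sum(range(2, n + 1))

n = 1000
# Closed form: n(n+1)/2 - 1, which is Theta(n^2).
assert series(n) == n * (n + 1) // 2 - 1
```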
Example 2: Integer
Multiplication
Let X = AB and Y = CD (high and low halves), where A, B, C, and D are n/2-bit integers; that is, X = 2^(n/2) A + B and Y = 2^(n/2) C + D.
Simple method: XY = (2^(n/2) A + B)(2^(n/2) C + D) = 2^n AC + 2^(n/2)(AD + BC) + BD
Running-time recurrence:
T(n) < 4T(n/2) + 100n
Solution: T(n) = Θ(n^2)
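The simple method can be sketched as follows (my own illustration, not the slides' code; it assumes n is a power of two for simplicity). Note the four recursive half-size multiplications, which is exactly what drives the T(n) < 4T(n/2) + 100n recurrence:

```python
def multiply(x, y, n):
    """Multiply two n-bit non-negative integers by the simple
    divide-and-conquer scheme: four recursive n/2-bit products.
    Assumes n is a power of two (a simplifying assumption)."""
    if n == 1:
        return x * y
    half = n // 2
    a, b = x >> half, x & ((1 << half) - 1)  # X = 2^(n/2)*A + B
    c, d = y >> half, y & ((1 << half) - 1)  # Y = 2^(n/2)*C + D
    ac = multiply(a, c, half)
    ad = multiply(a, d, half)
    bc = multiply(b, c, half)
    bd = multiply(b, d, half)
    # XY = 2^n*AC + 2^(n/2)*(AD + BC) + BD
    return (ac << n) + ((ad + bc) << half) + bd
```

For example, `multiply(13, 11, 4)` returns `143`. (Later lectures reduce the four products to three, giving Karatsuba's subquadratic algorithm.)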
[Merge sort slides: the merge step is animated on two sorted lists; at each step the smaller of the two head elements (e.g. 12 vs 20, 11 vs 13) moves to the output. Animation frames omitted.]
Analysis of merge sort:
T(n) = Θ(1) if n = 1;
T(n) = 2T(n/2) + Θ(n) if n > 1.
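A runnable sketch of merge sort matching this recurrence (one standard formulation; the slides' own pseudocode is not in this excerpt). The merge loop repeatedly takes the smaller head of the two sorted halves:

```python
def merge_sort(a):
    """Sort a list: split in half, sort each half, merge.
    Cost: 2T(n/2) for the halves plus Theta(n) for the merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    # Merge step: compare the two heads, output the smaller.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One side is exhausted; append the remainder of the other.
    return merged + left[i:] + right[j:]
```

For example, `merge_sort([20, 13, 7, 2, 12, 11, 9, 1])` yields `[1, 2, 7, 9, 11, 12, 13, 20]`.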
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.

Expanding T(n) yields a tree: the root costs cn; its two children cost cn/2 each; the four grandchildren cost cn/4 each; and so on, down to the Θ(1) leaves.
Level 0: cn → level total cn
Level 1: cn/2 + cn/2 → level total cn
Level 2: cn/4 + cn/4 + cn/4 + cn/4 → level total cn
...
Height h = lg n; #leaves = n, contributing Θ(n).
Each of the lg n levels sums to cn, so Total = Θ(n lg n).
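The recursion-tree total can be checked numerically (my own sketch, not from the slides): unrolling T(n) = 2T(n/2) + cn with T(1) = 0 and c = 1 gives exactly n·lg n for powers of two, matching the lg n levels of cost cn each:

```python
def T(n):
    # Unroll the merge-sort recurrence directly, with T(1) = 0 and c = 1.
    return 0 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k, the tree argument predicts T(n) = n * lg n = n * k.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k
```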
Conclusions
Θ(n lg n) grows more slowly than Θ(n^2).
Therefore, merge sort asymptotically beats insertion sort in the worst case.
In practice, merge sort beats insertion sort for n > 30 or so.