
Intro to Algorithm Complexity

Analysis

Copyright 2011 by Pearson Education, Inc. All rights reserved

Analysis of Algorithms
Formally measure how fast a program/algorithm runs.
Efficiency of an algorithm can be measured in terms of:
- Execution time (time complexity): finding the total running time, commonly estimated by counting the instructions executed by the algorithm
- The amount of memory required (space complexity)

Time Complexity
Time complexity (T(n)): a measure of the amount of time required to execute an algorithm.
Factors that should not affect time complexity analysis:
- The programming language chosen to implement the algorithm
- The quality of the compiler
- The speed of the computer on which the algorithm is to be executed

Time Complexity Analysis (TCA)

Complexity of an algorithm (T(n))
Objectives of time complexity analysis:
- To determine the feasibility of an algorithm
- To compare different algorithms before deciding which one to implement

TCA is a tool used to explain how an algorithm behaves as its input grows larger.
E.g.: An algorithm takes 1 second to run for an input of size 1000. How will it behave if we double the input size?
- Run just as fast? Half as fast? Four times slower?

TCA - Motivation
If you build an algorithm for a web application that works well for 1000 users and you measure its running time, then using algorithm complexity analysis you can get a good idea of what will happen once you have 2000 users instead!
Conclusion - If you have measured your algorithm's behavior for a small input, you can get a good idea of how it will behave for larger inputs.

Time Complexity
Three possible cases in algorithm analysis:
Worst-case
- An upper bound on the running time for any input of a given size
- A determination of the maximum amount of time that an algorithm requires to solve problems of size n
Average-case
- Assume all inputs of a given size are equally likely
- A determination of the average amount of time that an algorithm requires to solve problems of size n
Best-case
- A lower bound on the running time
- A determination of the minimum amount of time that an algorithm requires to solve problems of size n

Time Complexity Example

Sequential search in a list of size n:
- Worst-case: find the target after comparing it with every element, or fail to find it at all. (n comparisons)
- Best-case: find the target in the first position of the list. (1 comparison)
- Average-case: depends on the probability (p) that the target will be found; if the target is equally likely to be in any position, about n/2 comparisons.
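The three cases above can be made concrete with a minimal sketch of sequential search; the `comparisons` counter is a hypothetical addition (not part of a standard search) used only to make the costs visible:

```java
public class SequentialSearch {
    // Number of element comparisons made by the most recent call (for illustration).
    public static int comparisons = 0;

    // Returns the index of target in a, or -1 if it is absent.
    public static int search(int[] a, int target) {
        comparisons = 0;
        for (int i = 0; i < a.length; i++) {
            comparisons++;          // one comparison per element examined
            if (a[i] == target) {
                return i;           // best case: 1 comparison (target is first)
            }
        }
        return -1;                  // worst case: n comparisons (target absent or last)
    }
}
```

Searching {5, 3, 8, 2} for 5 costs 1 comparison (best case); searching for 9 costs 4 comparisons (worst case for n = 4).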

Big-O Notation
Time complexity of an algorithm can be represented by Big-O notation (O = "order of").
Big-O notation is denoted as O(f(n)), where f(n) is the algorithm's growth-rate function / instruction-counting function.

Time complexity (Big-O)   Running time
O(1)                      Constant
O(n)                      Linear
O(n²)                     Quadratic
O(log n)                  Logarithmic

f(n)
This function f(n) gives the number of instructions that would be needed by an algorithm in the worst case.
Example:
If an algorithm requires 6n + 4 instructions, you may say that the algorithm has f(n) = 6n + 4.

f(n)
Two terms are found in the previous function:
f(n) = 6n + 4, where 6n is the bigger term.

In complexity analysis, we only care about what happens to the function as the program input (n) grows large.
So we will drop all terms that grow slowly and keep only the ones that grow fast as n becomes larger.
f(n)
Two terms are found in the previous function:
f(n) = 6n + 4, where 6n is the dominant term.

First, drop the constant 4: f(n) = 6n
Next, drop the constant multiplier in front of n: f(n) = n (done! :)
Therefore, the asymptotic behavior of f(n) = 6n + 4 is described by f(n) = n,
and the above algorithm is a linear-time algorithm: O(n).

Big-O Analysis in General

With independent nested loops:
The number of iterations of the inner loop is independent of the number of iterations of the outer loop.

The outer loop executes n/2 times. For each of those times, the inner loop executes n² times, so the body of the inner loop is executed (n/2) · n² = n³/2 times.
The algorithm is O(n³).
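A minimal sketch of such independent nested loops, with a hypothetical counter added to verify the (n/2) · n² = n³/2 figure:

```java
public class IndependentLoops {
    // Counts how many times the innermost body runs for input size n.
    public static long countBodyExecutions(int n) {
        long count = 0;
        for (int i = 0; i < n / 2; i++) {          // outer loop: n/2 iterations
            for (int j = 0; j < n * n; j++) {      // inner loop: n^2 iterations each time
                count++;                           // body runs (n/2) * n^2 = n^3/2 times
            }
        }
        return count;
    }
}
```

For n = 4 the body runs 2 · 16 = 32 = 4³/2 times, confirming the O(n³) count.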

Big-O Analysis in General

With dependent nested loops:
The number of iterations of the inner loop depends on a value from the outer loop.

When j is 1, the inner loop executes 3 times; when j is 2, the inner loop executes 3·2 times; when j is n, the inner loop executes 3·n times.
In all, the inner loop executes
3 + 6 + 9 + … + 3n
= 3(1 + 2 + 3 + … + n)
= 3n²/2 + 3n/2 times.
The algorithm is O(n²).
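A sketch of a dependent nested loop matching this analysis, again with an illustrative counter:

```java
public class DependentLoops {
    // Inner loop runs 3*j times on outer iteration j, so the total is
    // 3(1 + 2 + ... + n) = 3n^2/2 + 3n/2, hence O(n^2).
    public static long countBodyExecutions(int n) {
        long count = 0;
        for (int j = 1; j <= n; j++) {
            for (int k = 0; k < 3 * j; k++) {  // 3*j iterations, depends on j
                count++;
            }
        }
        return count;
    }
}
```

For n = 4 the body runs 3 + 6 + 9 + 12 = 30 = 3·4²/2 + 3·4/2 times.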

Summation Series Formula

The sum of the values from 1 to n is given by the formula:

Σ (i = 1 to n) i = 1 + 2 + 3 + … + n = n(n+1)/2

Sum of First n Natural Numbers

Write down the terms of the sum in forward and reverse orders; there are n terms in each row. Adding the two rows column by column, each of the n columns sums to n + 1, so twice the sum is n(n + 1), and the sum itself is n(n + 1)/2.

Big-O Analysis in General

Three algorithms for computing the sum 1 + 2 + . . . + n for an integer n > 0.

Algo_A:
T(n) = n
= O(n)

Algo_B:
When i is 1, the inner loop executes 1 time (computing 1); when i is 2, it executes 2 times (computing 1 + 2); when i is 3, it executes 3 times (computing 1 + 2 + 3); when i is n, it executes n times (computing 1 + 2 + 3 + … + n).
In all, the inner loop executes
1 + 2 + 3 + … + n = n²/2 + n/2 times.
The algorithm is O(n²).

Algo_C
Algorithms whose solutions are independent of the size of the problem's inputs are said to have constant time complexity: O(1).
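The three approaches can be sketched as follows; the method names are illustrative, not taken from the slides:

```java
public class SumAlgorithms {
    // Algo_A: a single loop, T(n) = n, so O(n).
    public static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

    // Algo_B: dependent nested loops; the inner body runs
    // 1 + 2 + ... + n = n^2/2 + n/2 times, so O(n^2).
    public static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= i; j++) {
                sum += 1;            // executed exactly 1 + 2 + ... + n times
            }
        }
        return sum;
    }

    // Algo_C: the closed-form formula, independent of n, so O(1).
    public static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }
}
```

All three return the same value; only their running times differ, e.g. sumA(100) = sumB(100) = sumC(100) = 5050.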


Big-O Notation Examples

All these expressions are O(n):
n, 3n, 61n + 5, 22n − 5, …

All these expressions are O(n²):
n², 9n², 18n² + 4n − 53, …

All these expressions are O(n log n):
n(log n), 5n(log 99n), 18 + (4n − 2)(log(5n + 3)), …

Growth Rates
A growth function shows the relationship between the size of the problem (n) and the time it takes to solve the problem.

[Graph: worstTime(n) versus n for O(2ⁿ), O(n²), O(n log n), O(n), O(log n), and O(1)]

Constant :

Run in constant time.

The algorithm requires the same fixed number of steps regardless of the size of the task.

Big-O Analysis in General

The complexity of an algorithm is determined by the complexity of the most frequently executed statements. If one set of statements has a running time of O(n³) and the rest are O(n), then the complexity of the algorithm is O(n³).
Algorithms whose solutions are independent of the size of the problem's inputs are said to have constant time complexity.
All basic statements (assignments, reads, writes, conditional testing, library calls) run in constant time.
Constant time complexity is denoted as O(1).

Array Based Implementation

Adding an entry to a bag: an O(1) method.

Array Based Implementation

Searching for an entry: O(1) best case, O(n) worst or average case.
Thus an O(n) method overall.
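These costs can be illustrated with a simplified, hypothetical array bag; the class and method names here are illustrative, not the textbook's exact API:

```java
public class ArrayBag {
    private final int[] items;
    private int size = 0;

    public ArrayBag(int capacity) {
        items = new int[capacity];
    }

    // O(1): place the new entry in the next free slot; no shifting needed.
    public boolean add(int entry) {
        if (size == items.length) {
            return false;              // bag is full
        }
        items[size++] = entry;
        return true;
    }

    // O(1) best case (entry is first), O(n) worst or average case:
    // may have to scan every entry.
    public boolean contains(int entry) {
        for (int i = 0; i < size; i++) {
            if (items[i] == entry) {
                return true;
            }
        }
        return false;
    }
}
```

The add is O(1) because the bag is unordered: a new entry can simply go at the end, with no need to keep elements sorted or shifted.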

A Linked Implementation
Adding an entry to a bag: an O(1) method.

A Linked Implementation
Searching a bag for a given entry: O(1) best case, O(n) worst case.
Thus an O(n) method overall.

Sorting Algorithms and their Complexity Analysis

Objectives
Sort an array into ascending order using:
- Insertion sort
- Selection sort
- Merge sort

Assess the efficiency of a sort, and discuss the relative efficiencies of the various methods.

Insertion Sort
Arranging things into either ascending or descending order is called sorting.
When a book is found taller than the one to its right:
- Remove the book to the right
- Slide the taller book to the right
- Insert the shorter book into that spot
Compare the shorter book just moved with the one to its left:
- Make an exchange if needed
- Continue

Insertion Sort

The placement of the third book during an insertion sort

Insertion Sort

An insertion sort of books

Copyright 2012 by Pearson Education, Inc. All rights reserved

An insertion sort of an array of integers into ascending order

Analysis: Insertion Sort

Algorithm:
for k = 1 to n-1                       // index 0: nothing to compare
    temp = a[k]
    i = k
    while (i > 0) && (a[i-1] > temp)   // shift larger elements right
        a[i] = a[i-1]
        i--
    end while
    a[i] = temp                        // store temp in the gap
end for

In the worst case, for each k we do k shifts inside the inner while-loop, for a total of 1 + 2 + … + (n−1) = n(n−1)/2 shifts, so insertion sort is O(n²) in the worst case.
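The pseudocode above translates directly to Java; this is a minimal sketch that mirrors the shift-then-store structure:

```java
public class InsertionSort {
    // Sorts a into ascending order. Invariant: before iteration k,
    // the prefix a[0..k-1] is already sorted.
    public static void sort(int[] a) {
        for (int k = 1; k < a.length; k++) {
            int temp = a[k];
            int i = k;
            while (i > 0 && a[i - 1] > temp) { // shift larger elements right
                a[i] = a[i - 1];
                i--;
            }
            a[i] = temp;                        // drop temp into the gap
        }
    }
}
```

Note the store is `a[i] = temp`, not `a[i+1] = temp`: when the while-loop stops, `i` is exactly the gap left by the shifting.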

Selection Sort
Example of sorting books by height:
- Take all the books off the shelf
- Select the shortest, replace it on the shelf
- Continue until all the books are back on the shelf

Alternative:
- Look down the shelf, select the shortest
- Swap the first book with the selected shortest
- Move to the second slot, repeat the process

Selection Sort

Before and after exchanging the shortest book and the first book

Assume we have an unsorted collection of n elements in an array or list called container.

A selection sort of an array of integers into ascending order

After n−1 repetitions of this process, the last item has automatically fallen into place.
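The "swap the shortest into the next slot" alternative can be sketched as:

```java
public class SelectionSort {
    // Sorts a into ascending order: repeatedly select the smallest
    // remaining element and swap it into the next unsorted slot.
    public static void sort(int[] a) {
        for (int first = 0; first < a.length - 1; first++) {
            int smallest = first;
            for (int i = first + 1; i < a.length; i++) { // scan the unsorted part
                if (a[i] < a[smallest]) {
                    smallest = i;
                }
            }
            int temp = a[first];                          // swap into place
            a[first] = a[smallest];
            a[smallest] = temp;
        }
    }
}
```

The inner scan always examines every remaining element regardless of the input's order, which is why selection sort is O(n²) in the best case as well as the worst.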

Merge Sort
- Divide the array into two halves
- Sort the two parts
- Merge them into one sorted array

Uses the strategy of divide and conquer:
- Divide the problem up into two or more distinct, smaller tasks

Merge Sort Algorithm

The effect of the recursive calls and the merges during a merge sort:
- Recursively sort the 1st half of the input array
- Recursively sort the 2nd half of the input array
- Merge the two sorted sub-lists into one
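The three steps above can be sketched as follows; this is a minimal recursive implementation, not the textbook's exact code:

```java
public class MergeSort {
    // Sorts a into ascending order using merge sort.
    public static void sort(int[] a) {
        if (a.length > 1) {
            sort(a, 0, a.length - 1);
        }
    }

    private static void sort(int[] a, int first, int last) {
        if (first < last) {
            int mid = (first + last) / 2;
            sort(a, first, mid);          // recursively sort 1st half
            sort(a, mid + 1, last);       // recursively sort 2nd half
            merge(a, first, mid, last);   // merge the two sorted halves
        }
    }

    private static void merge(int[] a, int first, int mid, int last) {
        int[] temp = new int[last - first + 1];
        int i = first, j = mid + 1, t = 0;
        while (i <= mid && j <= last) {   // take the smaller front element
            temp[t++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        }
        while (i <= mid) temp[t++] = a[i++];   // copy leftovers from 1st half
        while (j <= last) temp[t++] = a[j++];  // copy leftovers from 2nd half
        System.arraycopy(temp, 0, a, first, temp.length);
    }
}
```

Each merge pass over all n elements is O(n), and the halving gives log₂ n levels of recursion, which is where the O(n log₂ n) bound in the table below comes from.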

Efficiency of Sorting Algorithms

Sorting          Best Case       Worst Case
Insertion sort   O(n)            O(n²)
Selection sort   O(n²)           O(n²)
Merge sort       O(n log₂ n)     O(n log₂ n)

The time efficiencies of three sorting algorithms, expressed in Big-O notation.
