
Lower and Upper Bound Theory

Prof. Dr. Adnan YAZICI


Dept. of Computer Engineering
Middle East Technical Univ.
Ankara - TURKEY

Lower and Upper Bound Theory
How fast can we sort?
Lower-Bound Theory can help us answer this
question.
• Most sorting algorithms are comparison sorts: they use
only comparisons to determine the relative order of elements.
Examples: insertion sort, merge sort, quicksort, heapsort.
• The best worst-case running time that we have seen for
comparison sorting is O(nlgn).
Is O(nlgn) the best we can do?
Lower and Upper Bound Theory
• The lower bound, L(n), is a property of the specific
problem (e.g., sorting, MST, matrix multiplication), not of
any particular algorithm solving that problem.
• Lower bound theory says that no algorithm can
do the job in fewer than L(n) time units for
arbitrary inputs.
• For example, every comparison-based sorting
algorithm must take at least L(n) = nlgn time in
the worst case.
• L(n) is the minimum over all possible
algorithms, of the maximum complexity.
Lower and Upper Bound Theory
• Upper bound theory says that for arbitrary inputs we can
always solve the problem (e.g., sort) in at most U(n) time.
The worst-case running time of one of the known algorithms
gives us an upper bound.
• U(n) is the minimum, over all known algorithms, of the
maximum complexity.
• For example, for finding the maximum element in an
unsorted array, U(n) is O(n).
• Both upper and lower bounds are minima over the
maximum complexity of inputs of size n.
• Improving an upper bound means finding a new
algorithm with better worst-case performance.
• The ultimate goal is to make these two functions
coincide. When this is done, the optimal algorithm will
have L(n) = U(n).
Lower Bounds
A lower bound is an estimate of the minimum amount of work
needed to solve a given problem. Examples:
• the number of multiplications needed to multiply two n-by-n matrices
• the number of comparisons needed to find the largest element in a set of n numbers
• the number of comparisons needed to sort an array of size n
• the number of comparisons necessary for searching in a sorted array
• the number of multiplications needed to multiply two n-digit integers
Lower Bounds (cont.)
• A lower bound can be
  – an exact count
  – an efficiency class (Ω)
• Tight lower bound: there exists an algorithm
with the same efficiency as the lower bound

  Problem                              Lower bound   Tightness
  sorting                              Ω(nlogn)      yes
  searching in a sorted array          Ω(logn)       yes
  element uniqueness                   Ω(nlogn)      yes
  n-digit integer multiplication       Ω(n)          unknown
  multiplication of n-by-n matrices    Ω(n²)         unknown
Methods for Establishing Lower Bounds
There are a few standard techniques for finding lower bounds:
• trivial lower bounds
• information theory
• decision trees
• problem reduction
• adversary arguments
Trivial Lower Bound Method

1) Trivial Lower Bounds: For many problems it is possible to
easily observe that a lower bound equal to the number of inputs
(or possibly the number of outputs) of the problem exists.

• The method is based on counting the number of items that must
be processed as input and generated as output.

• Note that any algorithm must, at least, read its inputs and write
its outputs.

Example-1: Multiplication of a pair of n×n matrices
• requires that 2n² inputs be examined and
• n² outputs be computed, and
• the lower bound for this matrix multiplication problem is
therefore Ω(n²).
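As an illustrative Python sketch of this counting argument (not a multiplication algorithm), the function below only tallies the input entries that must be read and the output entries that must be written, which already forces Ω(n²) basic steps no matter how the products are computed.

def trivial_matrix_multiplication_bound(n):
    # Counting argument only: any algorithm must examine the 2*n*n entries
    # of the two input matrices and produce the n*n entries of the result.
    inputs_to_read = 2 * n * n
    outputs_to_write = n * n
    return inputs_to_read + outputs_to_write  # lower bound on basic steps, Omega(n^2)

# trivial_matrix_multiplication_bound(10) == 300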
Trivial Lower Bound Method
Example: Finding the maximum (or minimum) of an
unordered array requires examining each input,
so it is Ω(n).

• A simple counting argument shows that any
comparison-based algorithm for finding the maximum
element in a list of size n must perform at
least n-1 comparisons for any input; the scan sketched
below matches this bound exactly.
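As an illustrative Python sketch, the familiar left-to-right scan attains this bound with exactly n-1 comparisons, so the trivial lower bound is tight here.

def find_max(a):
    # Scan for the maximum using exactly len(a) - 1 comparisons,
    # matching the trivial lower bound of n - 1.
    best = a[0]
    comparisons = 0
    for v in a[1:]:
        comparisons += 1
        if v > best:
            best = v
    return best, comparisons

# find_max([3, 7, 2, 9, 4]) returns (9, 4): four comparisons for five elements.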

Comment:
• This method may or may not be useful.
• Be careful in deciding how many elements must
actually be processed.
Information Theory
• The information-theoretic method establishes
lower bounds by computing the limitations on the
information gained by a basic operation and then
showing how much information is required
before a given problem is solved.
• This is used to show that any possible algorithm
for solving a problem must do some minimal
amount of work.
• The most useful principle of this kind is that the
outcome of a comparison between two items
yields one bit of information.
Lower and Upper Bound Theory
Example: For the problem of searching an
ordered list of n = 4 elements <a,b,c,d> for the
position of a particular item:

[Figure: decision tree whose root comparison splits <a,b,c,d>
into <a,b> (on ≤) and <c,d> (on >); a second comparison in each
half identifies one of the four positions a, b, c, d.]

For any outcome, at least 2 comparisons, and therefore
2 bits of information, are necessary.
Information Theory
Example: The lower bound of the problem of
searching an ordered list with n elements for
the position of a particular item is Θ(lgn).
Proof: There are n possible outcomes, one for each
position in the list.
– A comparison between two items yields one bit
of information.
– Unique identification of one index among n
possibilities therefore requires lgn bits.
– Hence at least lgn comparisons are necessary;
a binary search matching this bound is sketched below.
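As an illustrative Python sketch (counting each three-way comparison of x with an array element as one basic operation), binary search meets this bound with at most ⌈lg(n+1)⌉ comparisons on a sorted list.

def binary_search(a, x):
    # Binary search on a sorted list a; returns (index, comparisons) or (-1, comparisons).
    lo, hi = 0, len(a) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1            # one three-way comparison of x with a[mid]
        if x == a[mid]:
            return mid, comparisons
        elif x < a[mid]:
            hi = mid - 1
        else:
            lo = mid + 1
    return -1, comparisons

# On a sorted list of n = 1023 elements, an unsuccessful search uses 10 = lg(1024) comparisons.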
Information Theory
Example-2: Using information theory to derive the lower bound
for the comparison-based sorting problem (here for n = 3).

[Figure: decision tree for sorting three input elements; each
comparison has outcomes ≤ and >, and the six leaves are the
permutations <1,2,3>, <2,1,3>, <1,3,2>, <3,1,2>, <2,3,1>, <3,2,1>.]

Figure: There are 3! = 6 possible permutations of the n
input elements, so lgn! bits, and hence lgn!
comparisons, are required for sorting n items.
Information Theory
Example: Using information theory, the lower bound
for the comparison-based sorting problem is
Θ(nlgn).
Proof:
– If we only know that the input is orderable, then
there are n! possible outcomes – each of the n!
permutations of n things.
– Within the comparison-swap model, we can only use
comparisons to derive information
– Since each comparison only yields one bit of
information, the question is what the minimum
number of bits of information needed to allow n!
different outcomes is, which is lgn! bits.
Information Theory
• How fast does lgn! grow?
• We can bound n! from above by overestimating every
factor of the product by n, and from below by
underestimating each of its first n/2 factors by n/2:

(n/2)·(n/2)·…·(n/2) ≤ n·(n-1)·…·2·1 ≤ n·n·…·n

(n/2)^(n/2) ≤ n! ≤ n^n
½(nlgn - n) ≤ lgn! ≤ nlgn
½nlgn - ½n ≤ lgn! ≤ nlgn
and, since ½nlgn - ½n ≥ ¼nlgn for n ≥ 4,
¼nlgn ≤ lgn! ≤ nlgn   (for n ≥ 4)

• It follows that lgn! ∈ Θ(nlgn)
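These bounds can be checked numerically; the following Python sketch computes lgn! via the log-gamma function and verifies that it lies between ½nlgn - ½n and nlgn.

from math import lgamma, log, log2

def lg_factorial(n):
    # lg(n!) computed as log2(Gamma(n+1)); avoids forming the huge integer n!.
    return lgamma(n + 1) / log(2)

for n in (10, 100, 1000):
    lower = 0.5 * n * log2(n) - 0.5 * n   # (1/2)(n lg n - n)
    upper = n * log2(n)                   # n lg n
    assert lower <= lg_factorial(n) <= upper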


Decision-tree model

• This method can model the execution of any
comparison-based algorithm.
• One tree for each input size n.
• View the algorithm as splitting whenever it
compares two elements.
• The decision tree contains the comparisons along
all possible instruction traces.
• The running time of the algorithm = the length of
the path taken.

Decision-tree model
• Decision tree is a convenient model of algorithms
involving comparisons in which:
• Internal nodes represent comparisons
• leaves represent outcomes.
• For the searching problem, for example, there must be n
internal nodes in such a tree, corresponding to the n
possible successful positions of x in A.
• Because every leaf in a valid decision tree must be
reachable, the worst-case number of comparisons
done by such a tree is the number of nodes in the
longest path from the root to a leaf in the decision
tree consisting of the comparison nodes.
• Worst-case running time = the length of the longest
path of tree = the height of tree.
Decision-tree Model for Searching

Example: The comparison-based searching problem.
Proof: Consider all possible decision trees that
model algorithms solving the searching problem.
– FIND(n) is bounded below by the length of the
longest path from the root to a leaf in such a tree.

Decision-tree model for Searching
x:A(lg(n+1)/2)
Height=log2n

>
x:A(lg(n+1)/4) x:A(lg(3n+1)/4)

≤ >
> ≤ ….

x:A(1) …. x:A(lg(n+1)/4-1) x:A(lg(n+1)/4 +1) … x:A(n)

≤ ≤ ≤
> > ≤ > >
Return Failure Return Failure Return Failure Return Failure

A decision tree for a search algorithm

Decision-tree Model for Search on Sorted Elements

[Figure: decision tree for search on the sorted list <0,1,2,3,4,5,6>:
the root compares with 3, the second level with 1 and 5, the third
level with 0, 2, 4, 6; each node branches on <, =, >, and the leaves
return the matching element or failure (F). Height = ⌈log₂7⌉ = 3.]

Example decision tree for search on the sorted list <0,1,2,3,4,5,6>

Decision-tree Model for Searching
Example: The comparison-based searching problem on ordered input.
• Consider a decision tree with n leaves and height h. A decision
tree of height h has the largest number of leaves when all of its
leaves are on the last level, so it has at most 2^h leaves.
Hence n ≤ 2^h, which gives FIND(n) = h ≥ lgn, and such a
longest path is Θ(lgn).
• Therefore, the lower bound for comparison-
based searching on ordered input is Θ(lgn).
Decision-tree model for Searching
[Figure: a decision tree for search on an unsorted list: the root
compares x with A(1); on "=" it returns A(1), on "≠" it compares x
with A(2), and so on, down to A(n). Height = n.]

A decision tree for search on an unsorted list

The number of comparisons in the worst case is equal to the
height (h) of the algorithm's decision tree: h ≥ n, so h ∈ Ω(n).

Decision-tree for Search on Unsorted Elements
[Figure: example decision tree for search on the unsorted list
<4,1,5,3,2,0>: x is compared with 4, 1, 5, 3, 2, 0 in turn; each
"=" branch returns the matching position and the final "≠" branch
reports failure (F), so the height equals n.]

Example decision tree for search on the unsorted
list <4,1,5,3,2,0>, that is, h = n.
• Therefore, the lower bound for comparison-
based searching on unordered input is Θ(n).
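As an illustrative Python sketch, sequential search matches this bound: it performs exactly n equality comparisons in the worst case (x absent or in the last position).

def sequential_search(a, x):
    # Linear scan over an unsorted list; the worst case performs exactly
    # len(a) comparisons, matching the height-n decision tree above.
    comparisons = 0
    for i, v in enumerate(a):
        comparisons += 1
        if v == x:
            return i, comparisons
    return -1, comparisons

# sequential_search([4, 1, 5, 3, 2, 0], 0) returns (5, 6).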
Decision-tree model for Sorting
[Figure: decision tree for sorting a1, a2, a3]
  a1<a2?
    yes: a2<a3?
      yes: a1<a2<a3
      no:  a1<a3?
        yes: a1<a3<a2
        no:  a3<a1<a2
    no:  a1<a3?
      yes: a2<a1<a3
      no:  a2<a3?
        yes: a2<a3<a1
        no:  a3<a2<a1

• Each internal node is labelled i:j for i, j ∈ {1,2,...,n}.
• The left subtree shows the subsequent comparisons made if ai ≤ aj.
• The right subtree shows the subsequent comparisons made if ai > aj.

Decision-tree model for sorting

Each permutation represents a distinct sorted order, so the tree
must have at least n! leaves. Since a binary tree of height h has no more
than 2^h leaves, we have n! ≤ 2^h, and hence h ≥ lgn! ∈ Ω(nlgn).
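As an illustrative Python sketch for n = 3, the function below follows the decision tree of the previous slide and sorts three elements with at most ⌈lg 3!⌉ = 3 comparisons, so the bound is met exactly in this case.

def sort3(a1, a2, a3):
    # Sort three values with at most 3 comparisons, mirroring the decision tree.
    if a1 < a2:
        if a2 < a3:
            return a1, a2, a3          # 2 comparisons
        elif a1 < a3:
            return a1, a3, a2          # 3 comparisons
        else:
            return a3, a1, a2          # 3 comparisons
    else:
        if a1 < a3:
            return a2, a1, a3          # 2 comparisons
        elif a2 < a3:
            return a2, a3, a1          # 3 comparisons
        else:
            return a3, a2, a1          # 3 comparisons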
Problem Reduction
• Problem 1 (to be solved) --reduction--> Problem 2 (solvable by Alg. A)
• Problem 2 (solvable by Alg. A) --Alg. A--> solution to Problem 1
• Another elegant means of proving a lower bound on a
problem P is to show that P is at least as hard
as another problem Q with a known lower bound.
• We need to reduce Q to P (not P to Q).
• In other words, we should show that an arbitrary
instance of problem Q can be transformed (in a
reasonably efficient way) into an instance of problem P,
so that any algorithm solving P would also solve Q.
• Then a lower bound for Q is also a lower bound for P.
• Problem Q (LB known) --reduction--> Problem P (whose LB is to be established)
Problem Reduction
Example: Euclidean minimum spanning tree (MST) problem:
Given n points in the Cartesian plane, construct a Euclidean MST
whose vertices are the given points. As a problem with a known LB,
we use the element uniqueness problem (EUP), whose LB is known
to be Ω(nlgn). The reduction is as follows:
• Transform any set x1, x2, …, xn of n real numbers into a set of
points in the Cartesian plane by simply adding 0 as each point's y-
coordinate: (x1,0), (x2,0), …, (xn,0).
• This reduction implies that Ω(nlgn) is a LB for the Euclidean
MST problem as well.
• A solution of the EUP can now be read off from a solution of the
Euclidean MST: let T be any Euclidean MST found for the set of
points (x1,0), (x2,0), …, (xn,0).
• Since T must contain a shortest edge, checking whether T
contains a zero-length edge answers the question about the
uniqueness of the given numbers x1, x2, …, xn.
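A Python sketch of this reduction is given below; euclidean_mst is a hypothetical placeholder standing in for any Euclidean-MST routine that returns the tree as a list of point-pair edges.

import math

def euclidean_mst(points):
    # Hypothetical placeholder for any Euclidean-MST algorithm;
    # assumed to return the MST edges as pairs of points ((x1, y1), (x2, y2)).
    raise NotImplementedError

def all_distinct(xs):
    # Element uniqueness via the MST reduction: map each x_i to (x_i, 0);
    # the numbers are all distinct iff the MST contains no zero-length edge.
    points = [(x, 0.0) for x in xs]
    return all(math.dist(p, q) > 0 for p, q in euclidean_mst(points))

The transformation itself costs only O(n), so an o(nlgn) Euclidean-MST algorithm would yield an o(nlgn) algorithm for element uniqueness, contradicting its Ω(nlgn) lower bound.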
Example for reduction: Convex Hull

• Example: Computing the convex hull in


Computational Geometry is perhaps the most
basic, elementary function on a set of points.
• A Convex Hull is the smallest convex polygon
that contains every point of the set S.
– A polygon P is convex if and only if, for any
two points A and B inside the polygon, the
line segment AB lies entirely inside P.

Example for reduction Convex Hull (cont.)

• One way to visualize a convex hull is to put


a "rubber band" around all the points, and
let it wrap as tight as it can.

• We can think of a rubber


balloon for the three-
dimensional case.

Example for reduction Convex Hull (cont.)
Theorem: The comparison-based sorting problem is
transformable (in linear time) to the convex hull
problem; therefore, the problem of finding the
ordered convex hull of n points in the plane requires
Ω(nlogn) time. That is, the LB of the convex hull
problem is also Ω(nlogn).

Proof: First we introduce a proposition:


Proposition (Lower Bounds via Transformability):
If a problem A is known to require T(n) time and A
is τ(n)- transformable to B (A ∝τ(n)B) then B
requires at least T(n) – O(τ(n)) time.
Example for reduction Convex Hull (cont.)
• Reduction: Given n real numbers x1, x2, ..., xn, all
positive, we determine the corresponding points (x1, x1²),
(x2, x2²), …, (xn, xn²).
• All of these points lie on the parabola y = x², so every
one of them is a vertex of the convex hull.
• Since the LB of the comparison-based sorting problem is known
to be Ω(nlogn), the LB of the convex hull problem is also
Ω(nlogn).
• In other words, when n real numbers x1, x2, ..., xn are given, a
convex hull algorithm can be used to sort these numbers with only
linear overhead.
• The convex hull of this set, in standard form, consists of
the points listed in order around the hull.
• One pass through that list lets us read off the xi in
sorted order (see the sketch below).
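A Python sketch of this reduction is given below; convex_hull is a hypothetical placeholder for any convex-hull routine, assumed to return the hull vertices in counter-clockwise order, and the input numbers are assumed distinct.

def convex_hull(points):
    # Hypothetical placeholder for any convex-hull routine;
    # assumed to return the hull vertices in counter-clockwise order.
    raise NotImplementedError

def sort_via_hull(xs):
    # Lift each x onto the parabola y = x^2; every lifted point is a hull
    # vertex, and walking the hull counter-clockwise from the leftmost
    # vertex visits the points in increasing x, i.e., in sorted order.
    points = [(x, x * x) for x in xs]
    hull = convex_hull(points)
    start = min(range(len(hull)), key=lambda i: hull[i])   # leftmost vertex
    ordered = hull[start:] + hull[:start]
    return [x for x, _ in ordered]

Only linear extra work is done outside the hull computation, so an o(nlogn) convex-hull algorithm would contradict the Ω(nlogn) sorting lower bound.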
Example for reduction Convex Hull (cont.)

[Figure: the points (x1, x1²), …, (x6, x6²) all lie on the parabola
y = x²; reading the hull vertices from left to right along the x-axis
gives x1, x2, …, x6 in sorted order.]
Example for reduction Convex Hull (cont.)

Proposition (Upper Bounds via Transformability): If a
problem B can be solved in T(n) time and problem A is
τ(n)-transformable to B (A ∝τ(n) B), then A can be
solved in at most T(n) + O(τ(n)) time.

Oracles and Adversary Arguments
• Another technique for obtaining LBs consists of making use of an oracle or
adversary arguments.
• Adversary arguments establish (worst-case) lower bounds by dynamically
creating inputs to a given algorithm that force the algorithm to perform
many basic operations.
• In order to derive a good LB, the adversary tries its best to make the
algorithm work as hard as possible.
• We only require that the adversary gives answers that are consistent and
truthful.
• We also assume that the algorithm uses a basic operation when querying the
adversary.
• Clever adversaries force any given algorithm to make many queries before
the algorithm has enough information to output the solution.
• LBs for the worst-case complexity are then obtained by counting the
number of basic operations that the adversary forces the algorithm to
perform.

Oracle and Adversary Arguments
• Adversary argument: a method of proving a lower bound by playing the role of
an adversary that makes the algorithm work as hard as possible by adjusting the input.
• LBs for the worst-case complexity are then obtained by counting the
number of basic operations that the adversary forces the algorithm to
perform.
Example-1: "Guessing" a number between 1 and n with yes/no
questions.
Adversary: always put the number in the larger of the two subsets
generated by the last question.
Each answer then leaves at least half of the remaining candidates alive, so any
strategy needs at least ⌈lgn⌉ questions; the lower bound is Ω(lgn). (If the
questions are restricted to direct guesses of the form "is it k?", the adversary
can simply answer "no" to the first n-1 guesses, forcing n-1 questions and an
Ω(n) bound for that restricted model.)
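As an illustrative Python sketch of this adversary, the function below plays against an arbitrary questioning strategy ask (a function mapping the current candidate set to the subset being asked about) and always keeps the larger half alive.

from math import ceil, log2

def adversary_game(n, ask):
    # The adversary answers each yes/no question so that the larger of the
    # queried subset and its complement survives, so every strategy needs
    # at least ceil(lg n) questions to isolate the number.
    candidates = set(range(1, n + 1))
    questions = 0
    while len(candidates) > 1:
        asked = ask(candidates) & candidates
        questions += 1
        rest = candidates - asked
        candidates = asked if len(asked) >= len(rest) else rest
    return candidates.pop(), questions

# Even an optimal halving strategy cannot beat ceil(lg n):
_, q = adversary_game(1000, lambda c: set(sorted(c)[: len(c) // 2]))
assert q == ceil(log2(1000))   # 10 questions for n = 1000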

Example-2: Merging two sorted lists of size n:
a1 < a2 < … < an and b1 < b2 < … < bn.
• Adversary: answer comparisons so that ai < bj iff i < j.
• The only output consistent with these answers is b1 < a1 < b2 < a2 < … < bn < an,
which requires 2n-1 comparisons of adjacent elements. The lower bound for this
problem is therefore 2n-1 ∈ Ω(n).
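As an illustrative Python sketch, the lists a_i = 2i and b_j = 2j-1 realize exactly the adversary's answers (a_i < b_j iff i < j), and the standard two-pointer merge is then forced to make 2n-1 comparisons.

def merge_count(a, b):
    # Standard merge of two sorted lists that also counts element comparisons.
    i = j = comparisons = 0
    out = []
    while i < len(a) and j < len(b):
        comparisons += 1
        if a[i] < b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:]); out.extend(b[j:])
    return out, comparisons

n = 6
a = [2 * i for i in range(1, n + 1)]       # a_i = 2i
b = [2 * j - 1 for j in range(1, n + 1)]   # b_j = 2j - 1, so a_i < b_j iff i < j
_, c = merge_count(a, b)
assert c == 2 * n - 1                      # 11 comparisons: every adjacent pair is compared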
Oracles and Adversary Arguments
Example: (Merging Problem) Given the sets A(1:m) and B(1:n), where the items
in A and in B are sorted, investigate lower bounds for algorithms that merge these
two sets into a single sorted set.
• Assume that all of the m+n elements are distinct and A(1) < A(2) < … < A(m)
and B(1) < B(2) < … < B(n).
• Elementary combinatorics tells us that there are C(m+n, n) ways that the
A's and B's may interleave while still preserving the ordering within A
and within B.
• Thus, if we use comparison trees as our model for merging algorithms, there
must be C(m+n, n) external nodes, and therefore ⌈log C(m+n, m)⌉
comparisons are required by any comparison-based merging algorithm in the
worst case. (Note that C(m+n, n) = C(m+n, m).)
• If we let MERGE(m,n) be the minimum number of comparisons needed to
merge m items with n items, then we have the inequality
log C(m+n, m) ≤ MERGE(m,n) ≤ m+n-1.
• The upper bound and lower bound can get arbitrarily far apart as m gets
much smaller than n, as the sketch below illustrates.
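The widening gap can be seen numerically with the short Python sketch below.

from math import ceil, comb, log2

def merge_bounds(m, n):
    # Information-theoretic lower bound and trivial upper bound on the number
    # of comparisons needed to merge sorted lists of sizes m and n.
    lower = ceil(log2(comb(m + n, m)))   # lg C(m+n, m)
    upper = m + n - 1
    return lower, upper

# merge_bounds(8, 8)    == (14, 15)   -> nearly tight when m == n
# merge_bounds(1, 1000) == (10, 1000) -> far apart when m << n
#   (a single element can instead be placed by binary insertion in about lg n comparisons)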
