
Welcome to Calculus.

I'm Professor Ghrist.


We're about to begin lecture 9 on Orders
of Growth.
In this lesson, we'll begin a study of
asymptotics.
What happens when a function gets very,
very small or very, very large?
How fast does a function go to zero?
How quickly to infinity?
Quantifying these rates of change will
give us a new language, that of the big O
or asymptotic notation.
>> Let's begin this lesson with a few
examples of functions that grow to
infinity, but at different rates.
Consider, for example, a monomial x to
the n, for n a positive integer, and the
exponential function, e to the x.
Both become infinite as x approaches
infinity, but at what rate?
Well, if we use L'Hospital's Rule, then,
taking the derivative of the numerator,
we get n times x to the n minus 1.
The denominator, of course, remains fixed
at e to the x.
Now, if we continue doing this, applying
L'Hospital's Rule, then eventually, since
n is a positive integer, we will get
something, times x to the 0.
That something is, as you can check, n
factorial.
Whatever it is, it's a constant.
Yet, the denominator remains e to the x.
What is this limit?
It's 0.
What does that mean?
It means that the exponential function
grows faster than any monomial, and
indeed, than any polynomial function.
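Written out, the repeated application of L'Hospital's Rule is the computation

\lim_{x \to \infty} \frac{x^n}{e^x} = \lim_{x \to \infty} \frac{n\,x^{n-1}}{e^x} = \cdots = \lim_{x \to \infty} \frac{n!}{e^x} = 0.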
Likewise, consider, say, the natural
logarithm of x, raised to the nth power,
and compare it with linear growth, the
monomial x.
Then, what happens?
Well, if we apply L'Hospital's Rule, we
get in the numerator n times the log of x
to the n minus first power, times the
derivative of log of x, which is 1 over
x.
The denominator, well, the derivative of
x is simply 1.
So we are left again with an
indeterminate limit.
However, inducting, what are we eventually
going to get?
We'll get a constant in the numerator,
that constant being once again n
factorial, with the denominator remaining x.
This too goes to zero, which means that
linear growth is faster than logarithmic
growth, even logarithmic growth raised to
a power.
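In symbols, with the induction carried through, that is

\lim_{x \to \infty} \frac{(\ln x)^n}{x} = \lim_{x \to \infty} \frac{n (\ln x)^{n-1}}{x} = \cdots = \lim_{x \to \infty} \frac{n!}{x} = 0.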
Now, putting these together, this means that
exponential growth beats polynomial
growth beats logarithmic or even
polylogarithmic growth.
What else is there?
Well, there's an entire world, or
hierarchy, of functional growth as x goes
to infinity, beginning with the constant
functions that don't grow at all, moving
up through logarithmic growth, to linear
growth, quadratic growth, cubic growth,
all the way to exponential growth.
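As a compact summary of that hierarchy, writing f \ll g to mean that the ratio f/g tends to zero as x goes to infinity:

1 \ll \ln x \ll x \ll x^2 \ll x^3 \ll \cdots \ll e^x.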
One question remains.
Is there anything beyond that?
What can grow faster than exponential
growth?
Well, there are indeed worlds beyond that,
one simple example being factorial
growth.
For now, what we want to do is consider
the monomials, x to the n, for n bigger
than 0, and look at their growth, both as
x goes to infinity and as x goes to 0.
Well, it's not exactly growing there.
It's shrinking.
But this is extremely interesting,
because if you look at the hierarchy of
functions, as x is going to zero, we see
that linear is bigger than quadratic, is
bigger than cubic, is bigger than
quartic, et cetera.
That is, x to the n plus 1 is less than x
to the n, as x is going to 0.
That is in contradistinction to what
happens as x goes to infinity, where we
see that linear growth, is much slower
than quadratic growth, is much slower
than cubic, et cetera.
That is, x to the n plus 1 is bigger than
x to the n, as x is getting larger and
larger.
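To make this duality concrete, take the sample values x = 1/10 and x = 10, comparing x, x squared, and x cubed in each case:

x = \tfrac{1}{10}: \quad 0.1 > 0.01 > 0.001, \qquad x = 10: \quad 10 < 100 < 1000.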
All right, what are we going to do with
that? Well, let's take a look at an
example.
The limit of 1 minus 2x plus 3x squared
over 4 minus 5x plus 6x squared.
Let's first consider the limit as x goes
to zero. What does that become?
Well, this one's really simple.
As x is going to 0, all of the higher
order terms in x are small, negligible,
which means that we simply evaluate this
limit and get one fourth, the ratio of
the lowest order terms in x.
On the other hand, if we look at this
same limit as x goes to infinity, we
can't do the same thing.
What we have to do is switch things
around a little bit: factor out the
highest order power of x, in this case,
x squared, and cancel that.
And what we see left over is 3 minus
something small, over 6 minus something
small, giving a value of one half in the
limit.
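In symbols, the two limits are

\lim_{x \to 0} \frac{1 - 2x + 3x^2}{4 - 5x + 6x^2} = \frac{1}{4}, \qquad \lim_{x \to \infty} \frac{1 - 2x + 3x^2}{4 - 5x + 6x^2} = \lim_{x \to \infty} \frac{3 - \frac{2}{x} + \frac{1}{x^2}}{6 - \frac{5}{x} + \frac{4}{x^2}} = \frac{1}{2}.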
Now, if we compare these two, we see that
as x is getting small, it's the lowest
order terms that dominate.
Whereas, if x is going to infinity, it's
the highest order terms that dominate.
This duality is extremely important and
can cause confusion, so always pay
attention to which way the limit is
going.
What we really need is a good language
for controlling and describing this sort
of growth.
This is going to be called big O and it
has two forms.
One as x goes to infinity, the other as x
is going to 0.
This is an extremely important and subtle
definition.
You're going to want to pay attention to
this.
We say that a function f is in big O of x
to the n if, in absolute value, f of x is
less than some constant times x to the n.
Now, we're not saying exactly what that
constant is, it's just some constant.
And we're just getting an upper bound, we
don't actually care about the value of c,
we just want to control the growth.
Now, there are two versions of this.
One, we can say that this inequality has
to hold as x goes to zero, in which case,
big O of x to the n consists of all
functions that approach zero, at least as
fast as x to the n.
On the other hand, if we look at growth
as x goes to infinity, then big O of x to
the n consists of those functions that
approach infinity no faster than x to the
n.
In both cases, this is an upper bound.
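Stated precisely: f is in O(x^n) if there is some constant C > 0 such that

|f(x)| \le C\,|x|^n

for all x sufficiently close to 0, in the first version, or for all x sufficiently large, in the second.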
This language replaces the "higher order
terms" that we have used in the past.
For example, if we look at the expansion
of arctan of x, what is that?
The first term in the Taylor series is x.
All of the other terms are of higher
order in x; specifically, we could say,
big O of x squared.
Or, more specifically, we could say big O
of x cubed.
That is a stronger statement.
It allows us to be more precise.
Or, if we want to expand further, we
could say that arctan of x is x minus x
cubed over 3 plus big O of x to the 5th.
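From the Taylor series, those statements read

\arctan x = x - \frac{x^3}{3} + \frac{x^5}{5} - \cdots = x + O(x^2) = x + O(x^3) = x - \frac{x^3}{3} + O(x^5).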
Likewise, as x goes to infinity, we could
look at a function like x times square
root of 5 plus 3x plus x squared.
This is a complicated looking function.
But really it's just x squared plus some
other stuff in the limit as x goes to
infinity.
That other stuff is itself going to
infinity, but only at a linear rate.
It is in big O of x.
If we wanted to be more specific, we
could use the binomial series, to say
that this function is x squared plus 3
halves x plus something in big O of 1.
That is, something bounded, something in
big O of x to the zero.
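A sketch of that binomial-series step, using \sqrt{1+u} = 1 + \frac{u}{2} + O(u^2) as u goes to 0:

x \sqrt{5 + 3x + x^2} = x^2 \sqrt{1 + \frac{3}{x} + \frac{5}{x^2}} = x^2 \left( 1 + \frac{3}{2x} + O\!\left(\tfrac{1}{x^2}\right) \right) = x^2 + \frac{3}{2}\,x + O(1).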
More generally, instead of talking about
big O of x to the n, we could talk about
big O of some other function, g of x, if
this same inequality holds for some
constant C.
Let's look at a few examples.
In the limit, as x goes to zero, the
function 5x plus 3x squared is in big O
of x, but it's not in big O of x squared.
Sine of x is in big O of x, but not in
big O of x squared.
Log of 1 plus x, minus x, as you can see
from Taylor expanding, is in big O of x
squared, but not in big O of x cubed.
1 minus the cosine of x squared, that is,
1 - cos(x^2), is in big O of x to the
fourth, but not in big O of x to the fifth.
Square root of x is not in big O of x to
the n for any n bigger than 1.
It does not go to zero very quickly at
all.
On the other hand, e to the minus 1 over
x squared is very quickly going to 0,
it's in big O of x to the n for all
positive n.
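The Taylor expansions behind those middle examples are

\sin x = x - \frac{x^3}{6} + O(x^5), \qquad \ln(1+x) - x = -\frac{x^2}{2} + O(x^3), \qquad 1 - \cos(x^2) = \frac{x^4}{2} + O(x^8),

and in each case the leading term pins down exactly which big O class the function lands in.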
As x is going to infinity, we have a
number of interesting examples as well.
Arctan of x is in big O of 1.
It's bounded.
That means it's also in big O of x to the
n for any positive n.
x times the square root of 1 plus x
squared grows quadratically at most, but
it is not in big O of x to the three
halves.
The log of the hyperbolic sine of x,
well, is really in big O of x.
It's got linear growth at most.
It's not in big O of log of x.
It grows too fast for that.
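Indeed, a short computation shows the linear growth directly:

\ln(\sinh x) = \ln \frac{e^x - e^{-x}}{2} = x - \ln 2 + \ln\!\left(1 - e^{-2x}\right) = x - \ln 2 + O\!\left(e^{-2x}\right).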
Hyperbolic cosine of x is in big O of e
to the x.
It has exponential growth, but not
polynomial growth.
Log of x to the 5th can be simplified,
and we see that it's really in big O of
log x, as well as big O of x to the n,
for any positive n.
Lastly, x to the x is not in big O of
constant to the x for any constant.
And it's not in big O of x factorial.
This is growth beyond even the factorial.
Now there is a huge subject of asymptotic
analysis associated with big O.
It has certain rules.
The ones that you need to know for this
course are the following.
If you take something in big O of x to
the m and multiply it by something in big
O of x to the n, you're going to get
something in big O of x to the m plus n,
just like multiplying monomials together.
This is true whether you're going to
zero, or infinity.
On the other hand, if you add two
functions together, something that has x
to the m growth and something that has x
to the n growth, then there's a
difference.
As x goes to 0, your resulting function
is in big O of x to the minimum of m and n.
As x goes to infinity, you're looking at
big O of x to the maximum of m and n.
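In symbols, the rules are

O(x^m) \cdot O(x^n) = O(x^{m+n}) \quad (x \to 0 \text{ or } x \to \infty),

O(x^m) + O(x^n) = O\!\left(x^{\min(m,n)}\right) \ (x \to 0), \qquad O(x^m) + O(x^n) = O\!\left(x^{\max(m,n)}\right) \ (x \to \infty).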
Let's look at this in the context of an
example, as x goes to zero.
Consider e to the x over 1 minus x.
Well, I'm going to write that as the
product of two functions.
e to the x is 1 plus x, plus big O of x
squared.
Using the geometric series, we see that 1
over 1 minus x is also 1 plus x plus big
O of x squared.
Now how do we simplify this?
We pretend that everything in sight is
just polynomials and multiply them
together, even the big O terms.
The first term is 1 plus x, quantity
squared.
The next term is 1 plus x times big O of
x squared.
And, of course, we have two copies of
that, so put a 2 out in front.
And our last term is big O of x squared,
squared.
Alright, now, how do we deal with
multiplying by big O?
Well, collect together the first term,
and then notice that that x squared is in
big O of x squared.
So we can forget about it.
Next, we have 2 times 1 plus x times big
O of x squared.
Multiplying that out to 2 times big O of
x squared plus x times big O of x squared
gives us what?
Well, we don't care about constants with
big O, so forget about that.
And x times big O of x squared is big O
of x cubed.
Lastly, big O of x squared, squared, is
big O of x to the 4th.
Now as x goes to 0, those higher order
terms all fall under big O of x squared,
and we get a simplification.
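The whole computation, written out, is

\frac{e^x}{1-x} = \left(1 + x + O(x^2)\right)\left(1 + x + O(x^2)\right) = (1+x)^2 + 2(1+x)\,O(x^2) + O(x^2)^2 = 1 + 2x + O(x^2).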
If we want to make it more interesting,
say, raising everything to the power
alpha, then we could use the binomial
series to say that this is 1 plus 2
alpha x plus, well, some other terms, but
they're all in big O of x squared.
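That last step in symbols, using the binomial series (1+u)^\alpha = 1 + \alpha u + O(u^2) with u = 2x + O(x^2):

\left( \frac{e^x}{1-x} \right)^{\alpha} = \left(1 + 2x + O(x^2)\right)^{\alpha} = 1 + 2\alpha x + O(x^2).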
This lesson on asymptotics dealt with how
to describe the growth rates of a
function.
And though we're at the end of Chapter 1,
this provides a perfect segue into
Chapter 2, where we'll codify what we
know about rates of change into the
notion of a derivative.
