
Chapter 2

Random Variables
A random variable is a convenient way to express the elements
of Ω as numbers rather than abstract elements of sets.

Definition: Let A_X, A_Y be nonempty families of subsets of X and
Y, respectively. A function f: X → Y is (A_X, A_Y)-measurable if
f⁻¹(A) ∈ A_X for all A ∈ A_Y.

Definition: Random Variable
A random variable X is a measurable function from the probability
space (Ω, Σ, P) into the probability space (χ, A_X, P_X), where χ in
R is the range of X (which is a subset of the real line), A_X is a
Borel field of χ, and P_X is the probability measure on χ induced
by X.
Specifically, X: Ω → χ.
Remarks:
- A random variable X is a function.
- It is a numerical quantity whose value is determined by a random
experiment.
- It takes single elements in Ω and maps them to single points in R.
- P is the probability measure over the sample space and P_X is the
probability measure over the range of the random variable.
- The induced measure P_X is just a way of relating measure on the
real line --the range of X-- back to the original probability measure
over the abstract events in the σ-algebra of the sample space.

Random Variables
Interpretation
The induced measure P_X allows us to relate a measure on the real line
--the range of X-- back to the original probability measure over the
abstract events in the σ-algebra of the sample space:
P_X[A] = P[X⁻¹(A)] = P[{ω : X(ω) ∈ A}].

That is, we take the probability weights associated with events and
assign them to real numbers. Recall that when we deal with
probabilities on some random variable X, we are really dealing with
the P_X measure.

We measure the "size" of the set (using P as our measure) of ω's such
that the random variable X returns values in A.

Random Variables
Interpretation
P is the probability measure over the sample space and P_X is the
probability measure over the range of the random variable.

Thus, we write P[A] (where A is a subset of the range of X) but we
mean P_X[A], which is equivalent to P[{ω : X(ω) ∈ A}].

Notational shortcut: We use P[A] instead of P_X[A].
(This notation can be misleading if there's confusion about whether
A is in the sample space or in the range of X.)
Random Variables
Example: Back to the previous example where two coins are tossed.
We defined the sample space (Ω) as all possible outcomes and the
sigma algebra of all possible subsets of the sample space. A simple
probability measure (P) was applied to the events in the sigma algebra.

Let the random variable X be "number of heads." Recall that X
takes Ω into χ and induces P_X from P. In this example, χ = {0; 1; 2}
and A_X = {∅; {0}; {1}; {2}; {0;1}; {0;2}; {1;2}; {0;1;2}}.

The induced probability measure P_X from the measure defined above
would look like:
Random Variables
Example: Back to the previous example where two coins are tossed.
Prob. of 0 heads = P_X[0] = P[{TT}] = 1/4
Prob. of 1 head = P_X[1] = P[{HT; TH}] = 1/2
Prob. of 2 heads = P_X[2] = P[{HH}] = 1/4
Prob. of 0 or 1 heads = P_X[{0; 1}] = P[{TT; TH; HT}] = 3/4
Prob. of 0 or 2 heads = P_X[{0; 2}] = P[{TT; HH}] = 1/2
Prob. of 1 or 2 heads = P_X[{1; 2}] = P[{TH; HT; HH}] = 3/4
Prob. of 0, 1, or 2 heads = P_X[{0; 1; 2}] = P[{HH; TH; HT; TT}] = 1
Prob. of "nothing" = P_X[∅] = P[∅] = 0

The empty set is simply needed to complete the σ-algebra. Its
interpretation is not important since P[∅] = 0 for any reasonable P.
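The induced measure in this coin example can be cross-checked with a short
script. This is an illustrative sketch, not part of the original slides;
the names `omega` and `P_X` are our own:

```python
from itertools import product
from fractions import Fraction

# Sample space for two fair coin tosses; each outcome has probability 1/4.
omega = ["".join(t) for t in product("HT", repeat=2)]
P = {w: Fraction(1, 4) for w in omega}

def X(w):
    # Random variable: number of heads in the outcome.
    return w.count("H")

def P_X(A):
    # Induced measure: P_X[A] = P[{w : X(w) in A}].
    return sum(P[w] for w in omega if X(w) in A)

print(P_X({0}))      # 1/4, i.e., P[{TT}]
print(P_X({1}))      # 1/2, i.e., P[{HT, TH}]
print(P_X({0, 1}))   # 3/4
print(P_X(set()))    # 0
```

Note how every probability on the right-hand side is computed with P over
Ω; P_X merely repackages those weights on the range {0, 1, 2}.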
Random Variables
Example: Probability Space
One standard probability space is the Borel field over the unit interval
of the real line under the Lebesgue measure λ. That is, ([0, 1]; B; λ).

The Borel field over the unit interval gives us a set of all possible
intervals taken from [0,1]. The Lebesgue measure λ measures the size of
any given interval. For any interval [a, b] in [0,1] with b ≥ a,
λ[[a, b]] = b − a.

This probability space is well known: the uniform distribution, where the
probability of any interval of values is the size of the interval.
Random Variables
1. Two dice are rolled and X is the sum of the two upward faces.
2. A coin is tossed n = 3 times and X is the number of times that a
head occurs.
3. A point is selected at random from a square whose sides are of
length 1. X is the distance of the point from the lower left hand
corner.



4. X is the number of times the price of IBM increases during a
time interval, say a day.
5. Today, the DJ Index is 9,504.17, X is the value of the index in
thirty days.

[Figure: unit square with a random point at distance X from the lower
left corner]
Random Variables
Ω is the sample space - the set of possible outcomes from an
experiment.
- An event A is a set containing outcomes from the sample space.
Σ is a σ-algebra of subsets of the sample space. Think of Σ as the
collection of all possible events involving outcomes chosen from Ω.
P is a probability measure over Σ. P assigns a number between [0,1]
to each event in Σ.

We have functions (random variables) that allow us to look at real
numbers instead of abstract events in Σ.
Random Variables: Summary
For each random variable X, there exists a new probability measure
P_X: P_X[A], where A ⊆ R, simply relates back to P[{ω : X(ω) ∈ A}].

We calculate P_X[A], but we are really interested in the probability
P[{ω : X(ω) ∈ A}],
where A simply represents {ω : X(ω) ∈ A} through the inverse
transformation X⁻¹.

Random Variables: Summary
Definition - The probability function, p(x), of a RV, X.
For any random variable, X, and any real number, x, we define
p(x) = P[X = x] = P[{X = x}]
where {X = x} = the set of all outcomes (event) with X = x.

Definition - The cumulative distribution function (CDF), F(x), of a RV, X.
For any random variable, X, and any real number, x, we define
F(x) = P[X ≤ x] = P[{X ≤ x}]
where {X ≤ x} = the set of all outcomes (event) with X ≤ x.

Random Variables: Probability Function & CDF
Two dice are rolled and X is the sum of the two upward faces.
Sample space S = {2: (1,1); 3: (1,2), (2,1); 4: (1,3), (3,1), (2,2);
5: (1,4), (2,3), (3,2), (4,1); 6; 7; 8; 9; 10; 11; 12}.

[Graph: probability function p(x) for x = 2, ..., 12; p rises from 1/36
at x = 2 to 6/36 at x = 7, then falls back to 1/36 at x = 12]
Probability Function & CDF: Example I
p(2) = P[X = 2] = P[{(1,1)}] = 1/36
p(3) = P[X = 3] = P[{(1,2), (2,1)}] = 2/36
p(4) = P[X = 4] = P[{(1,3), (2,2), (3,1)}] = 3/36
p(5) = 4/36, p(6) = 5/36, p(7) = 6/36, p(8) = 5/36, p(9) = 4/36,
p(10) = 3/36, p(11) = 2/36, p(12) = 1/36,
and p(x) = 0 for all other x.
Note: {X = x} = ∅ for all other x.
Probability function:
Probability Function & CDF: Example I
[Graph: CDF F(x) of the two-dice sum, a step function rising from 0 to 1
over x = 2 to 12]
F(x) =
  0       x < 2
  1/36    2 ≤ x < 3
  3/36    3 ≤ x < 4
  6/36    4 ≤ x < 5
  10/36   5 ≤ x < 6
  15/36   6 ≤ x < 7
  21/36   7 ≤ x < 8
  26/36   8 ≤ x < 9
  30/36   9 ≤ x < 10
  33/36   10 ≤ x < 11
  35/36   11 ≤ x < 12
  1       x ≥ 12

F(x) is a step function.


Probability Function & CDF: Example I
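The pmf and step-function CDF above can be reproduced by enumerating the
36 equally likely outcomes. A sketch (the names `pmf` and `F` are our own):

```python
from fractions import Fraction

# Build p(x) for the sum of two dice by enumerating the 36 outcomes.
pmf = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s = d1 + d2
        pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 36)

def F(x):
    # Step-function CDF: F(x) = sum of p(u) over u <= x.
    return sum(q for u, q in pmf.items() if u <= x)

print(pmf[7])   # 1/6 (= 6/36)
print(F(4))     # 1/6 (= 1/36 + 2/36 + 3/36)
print(F(12))    # 1
```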
A point is selected at random from a square whose sides are of
length 1. X is the distance of the point from the lower left
hand corner.
An event, E, is any subset of the square, S.
P[E] = (area of E)/(area of S) = area of E

[Figure: unit square S containing an event E and a random point at
distance X from the lower left corner]
Probability Function & CDF: Example II
The probability function is given by:
p(x) = P[X = x] = P[{set of all points at a distance x from the lower
left corner}] = 0

Thus p(x) = 0 for all values of x. The probability function for this
example is not very informative.
Probability Function & CDF: Example II
The cumulative distribution function is given by:
F(x) = P[X ≤ x] = P[{set of all points within a distance x of the
lower left corner}]

For 0 ≤ x ≤ 1 this set is a quarter disk of radius x, with area πx²/4.
For 1 < x ≤ √2 the quarter disk spills outside the square, leaving an
area A inside:

F(x) = P[X ≤ x] =
  0         x < 0
  πx²/4     0 ≤ x ≤ 1
  Area A    1 < x ≤ √2
  1         x > √2

[Figure: unit square S with quarter-circle arcs of radius x drawn for
the cases 0 ≤ x ≤ 1 and 1 < x ≤ √2; A is the part of the quarter disk
of radius x lying inside the square]

Computation of Area A (1 < x ≤ √2):
The arc of radius x meets the sides of the square at height √(x² − 1).
The region A consists of two right triangles with legs 1 and √(x² − 1),
plus a circular sector of radius x and angle π/2 − 2θ, where
tan θ = √(x² − 1), i.e., θ = tan⁻¹√(x² − 1):

A = 2 · (1/2)(1)√(x² − 1) + (1/2)x²(π/2 − 2θ)
  = √(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))

Hence
F(x) = P[X ≤ x] =
  0                                         x < 0
  πx²/4                                     0 ≤ x ≤ 1
  √(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))      1 < x ≤ √2
  1                                         x > √2

[Graph: F(x), increasing from 0 at x = 0 to 1 at x = √2]
Definition: Suppose that X is a random variable. Let f(x) denote a
function defined for −∞ < x < ∞ with the following properties:

1. f(x) ≥ 0
2. ∫ f(x) dx = 1 (integrating over the whole real line)
3. P[a ≤ X ≤ b] = ∫_a^b f(x) dx

Then f(x) is called the probability density function of X. The random
variable X is called continuous.
Random Variables: PDF for a Continuous RV

[Graph: a probability density function f(x); the total area under the
curve is 1 and P[a ≤ X ≤ b] is the area under f between a and b]
Random Variables: PDF for a Continuous RV


Thus if X is a continuous random variable with probability density
function, f(x), the cumulative distribution function of X is given by:
F(x) = P[X ≤ x] = ∫_{−∞}^x f(t) dt
Random Variables: CDF for a Continuous RV

Also, because of the fundamental theorem of calculus,
F′(x) = dF(x)/dx = f(x)
CDF and PDF for a Continuous RV: Relation
Example: Deriving a pdf from a CDF
A point is selected at random from a square whose sides are of
length 1. X is the distance of the point from the lower left hand
corner. From before:

F(x) = P[X ≤ x] =
  0                                         x < 0
  πx²/4                                     0 ≤ x ≤ 1
  √(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))      1 < x ≤ √2
  1                                         x > √2
CDF and PDF for a Continuous RV: Relation


Now
f(x) = F′(x) =
  0                                                x ≤ 0 or x ≥ √2
  πx/2                                             0 < x ≤ 1
  d/dx [√(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))]      1 < x < √2
CDF and PDF for a Continuous RV: Relation


To evaluate the derivative in the last branch, write
d/dx [√(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))]
  = x(x² − 1)^(−1/2) + (π/2)x − d/dx [x² tan⁻¹√(x² − 1)]

Also, by the product rule,
d/dx [x² tan⁻¹√(x² − 1)] = 2x tan⁻¹√(x² − 1) + x² d/dx [tan⁻¹√(x² − 1)]
CDF and PDF for a Continuous RV: Relation
Now, using d/du [tan⁻¹ u] = 1/(1 + u²),
d/dx [tan⁻¹√(x² − 1)] = [1/(1 + (x² − 1))] · x(x² − 1)^(−1/2)
  = 1/(x√(x² − 1))
and hence
x² d/dx [tan⁻¹√(x² − 1)] = x/√(x² − 1) = x(x² − 1)^(−1/2).

The x(x² − 1)^(−1/2) terms cancel, leaving
d/dx [√(x² − 1) + x²(π/4 − tan⁻¹√(x² − 1))]
  = (π/2)x − 2x tan⁻¹√(x² − 1)
CDF and PDF for a Continuous RV: Relation
Finally
f(x) = F′(x) =
  0                              x ≤ 0 or x ≥ √2
  πx/2                           0 < x ≤ 1
  (π/2)x − 2x tan⁻¹√(x² − 1)     1 < x < √2

[Graph of f(x): increasing linearly on (0, 1], then decreasing to 0
at √2]
CDF and PDF for a Continuous RV: Relation
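The closed forms for F and f derived above can be sanity-checked
numerically; this sketch assumes the piecewise formulas exactly as written:

```python
import math

SQRT2 = math.sqrt(2)

def F(x):
    # CDF of the distance from the lower left corner, as derived above.
    if x < 0:
        return 0.0
    if x <= 1:
        return math.pi * x * x / 4
    if x <= SQRT2:
        r = math.sqrt(x * x - 1)
        return r + x * x * (math.pi / 4 - math.atan(r))
    return 1.0

def f(x):
    # pdf f(x) = F'(x).
    if x <= 0 or x >= SQRT2:
        return 0.0
    if x <= 1:
        return math.pi * x / 2
    return (math.pi / 2) * x - 2 * x * math.atan(math.sqrt(x * x - 1))

print(round(F(1), 6))       # 0.785398 (= pi/4)
print(round(F(SQRT2), 6))   # 1.0
# f should match the slope of F, e.g., at x = 1.2:
h = 1e-6
print(abs((F(1.2 + h) - F(1.2 - h)) / (2 * h) - f(1.2)) < 1e-4)  # True
```

The central-difference check is a quick way to confirm that the derivative
was carried out correctly at an interior point of (1, √2).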
Discrete and Continuous Random
Variables
A Discrete Random Variable
A random variable X is called discrete if
Σ_x p(x) = Σ_{i=1}^∞ p(x_i) = 1
All the probability is accounted for by values, x, such that p(x) > 0.

For a discrete random variable X the probability distribution is
described by the probability function p(x), which has the following
properties:
1. 0 ≤ p(x) ≤ 1
2. Σ_x p(x) = Σ_{i=1}^∞ p(x_i) = 1
3. P[a ≤ X ≤ b] = Σ_{a ≤ x ≤ b} p(x)

[Graph: p(x) drawn as spikes at the points where p(x) > 0;
P[a ≤ X ≤ b] is the sum of the spike heights between a and b]
A Discrete Random Variable: Graph
Recall
p(x) = P[X = x] = the probability function of X.
This can be defined for any random variable X.

For a continuous random variable
p(x) = 0 for all values of X.

Let S_X = {x | p(x) > 0}. This set is countable --i.e., it can be put
into a 1-1 correspondence with the integers.
S_X = {x | p(x) > 0} = {x_1, x_2, x_3, x_4, ...}

Thus, we can write
Σ_x p(x) = Σ_{i=1}^∞ p(x_i)
Discrete Random Variables: Details
Proof (that the set S_X = {x | p(x) > 0} is countable --i.e., it can be
put into a 1-1 correspondence with the integers):
S_X = S_1 ∪ S_2 ∪ S_3 ∪ ...
where
S_i = {x | 1/(i + 1) < p(x) ≤ 1/i}
That is,
S_1 = {x | 1/2 < p(x) ≤ 1}      (Note: n(S_1) < 2)
S_2 = {x | 1/3 < p(x) ≤ 1/2}    (Note: n(S_2) < 3)
S_3 = {x | 1/4 < p(x) ≤ 1/3}    (Note: n(S_3) < 4)
Thus the number of elements of S_i, n(S_i) < i + 1 (is finite)
Discrete Random Variables: Details
Thus the elements of S_X = S_1 ∪ S_2 ∪ S_3 ∪ ...
can be arranged {x_1, x_2, x_3, x_4, ...}
by choosing the first elements to be the elements of S_1,
the next elements to be the elements of S_2,
the next elements to be the elements of S_3,
the next elements to be the elements of S_4,
etc.
This allows us to write
Σ_x p(x) = Σ_{i=1}^∞ p(x_i)
Discrete Random Variables: Details
Continuous Random Variables: PDF
For a continuous random variable X, the probability distribution is
described by the probability density function f(x), which has the
following properties:
1. f(x) ≥ 0
2. ∫ f(x) dx = 1 (integrating over the whole real line)
3. P[a ≤ X ≤ b] = ∫_a^b f(x) dx

[Graph: the probability density function f(x); the total area under the
curve is 1 and P[a ≤ X ≤ b] is the area under f between a and b]
Continuous Random Variables: PDF Graph


A probability distribution is similar to a distribution of mass.
A discrete distribution is similar to a point distribution of mass.
=> Positive amounts of mass are put at discrete points.

[Figure: point masses p(x_1), p(x_2), p(x_3), p(x_4) placed at the
points x_1, x_2, x_3, x_4]
Discrete & Continuous Random Variables
A continuous distribution is similar to a continuous distribution of
mass.
The total mass of 1 is spread over a continuum. The mass assigned to
any point is zero, but it has a non-zero density f(x).

[Figure: a continuous density curve f(x) spreading the unit mass over
an interval]
Discrete & Continuous Random Variables
Distribution function F(x): Properties
This is defined for any random variable, X:
F(x) = P[X ≤ x]
Properties
1. F(−∞) = 0 and F(∞) = 1.
Since {X ≤ −∞} = ∅ and {X ≤ ∞} = S => F(−∞) = 0 and F(∞) = 1.

2. F(x) is non-decreasing (i.e., if x_1 < x_2 then F(x_1) ≤ F(x_2)).
If x_1 < x_2 then {X ≤ x_2} = {X ≤ x_1} ∪ {x_1 < X ≤ x_2}.
Thus P[X ≤ x_2] = P[X ≤ x_1] + P[x_1 < X ≤ x_2]
or F(x_2) = F(x_1) + P[x_1 < X ≤ x_2].
Since P[x_1 < X ≤ x_2] ≥ 0, then F(x_2) ≥ F(x_1).

3. F(b) − F(a) = P[a < X ≤ b].
If a < b, then using the argument above,
F(b) = F(a) + P[a < X ≤ b]
=> F(b) − F(a) = P[a < X ≤ b].

4. p(x) = P[X = x] = F(x) − F(x−)
Here
F(x−) = lim_{u→x⁻} F(u)

5. If p(x) = 0 for all x (i.e., X is continuous) then F(x) is
continuous. A function F is continuous if
lim_{u→x⁻} F(u) = F(x−) = F(x+) = lim_{u→x⁺} F(u) = F(x)
One can show that p(x) = 0 implies F(x−) = F(x+) = F(x).
Distribution function F(x): Properties
For a discrete RV, F(x) is a non-decreasing step function with
F(x) = P[X ≤ x] = Σ_{u ≤ x} p(u)
p(x) = F(x) − F(x−) = jump in F(x) at x
F(−∞) = 0 and F(∞) = 1

[Graph: a step-function CDF F(x); each jump has height p(x)]
Distribution function F(x): Discrete RV

For a continuous RV, F(x) is a non-decreasing continuous function with
F(x) = P[X ≤ x] = ∫_{−∞}^x f(u) du
f(x) = F′(x)
F(−∞) = 0 and F(∞) = 1

[Graph: a smooth CDF F(x) rising from 0 to 1; the density f(x) is its
slope]
Distribution function F(x): Continuous RV
Some Important Discrete
Distributions
The Binomial distribution
Jacob Bernoulli (1654-1705)
Suppose that we have a Bernoulli trial (an experiment) that has 2 results:
1. Success (S)
2. Failure (F)

Suppose that p is the probability of success (S) and q = 1 − p is the
probability of failure (F). Then, the probability distribution with
probability function
p(x) = P[X = x] = q if x = 0; p if x = 1
is called the Bernoulli distribution.

Now assume that the Bernoulli trial is repeated independently n times.
Let X be the number of successes occurring in the n trials. (The
possible values of X are {0, 1, 2, ..., n}.)
The Bernoulli Distribution
Suppose we have n = 5. The outcomes, together with the values of X and
the probabilities of each outcome, are given in the table below:

x = 0: FFFFF   (probability q⁵)
x = 1: SFFFF, FSFFF, FFSFF, FFFSF, FFFFS   (each pq⁴)
x = 2: SSFFF, SFSFF, SFFSF, SFFFS, FSSFF, FSFSF, FSFFS, FFSSF, FFSFS,
       FFFSS   (each p²q³)
x = 3: SSSFF, SSFSF, SSFFS, SFSSF, SFSFS, SFFSS, FSSSF, FSSFS, FSFSS,
       FFSSS   (each p³q²)
x = 4: SSSSF, SSSFS, SSFSS, SFSSS, FSSSS   (each p⁴q)
x = 5: SSSSS   (probability p⁵)
The Binomial Distribution
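The table can be cross-checked by brute-force enumeration of all
2⁵ = 32 outcome strings (a quick sketch):

```python
from itertools import product
from collections import Counter

# Count, for each value x, how many length-5 S/F sequences have x successes.
counts = Counter("".join(t).count("S") for t in product("SF", repeat=5))
print(sorted(counts.items()))  # [(0, 1), (1, 5), (2, 10), (3, 10), (4, 5), (5, 1)]
```

The counts 1, 5, 10, 10, 5, 1 are exactly the binomial coefficients
C(5, x), which is the point of the next slide.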
For n = 5 the following table gives the different possible values of X,
x, and p(x) = P[X = x]:

x       0    1     2       3       4     5
p(x)    q⁵   5pq⁴  10p²q³  10p³q²  5p⁴q  p⁵

For general n, the outcome of the sequence of n Bernoulli trials
is a sequence of S's and F's of length n:
SSFSFFSFFFFSSSFFSFSFFS...
The value of X for such a sequence is k = the number of S's in
the sequence.
The probability of such a sequence is p^k q^(n−k) (a p for each S and
a q for each F).

There are C(n, k) = n!/(k!(n − k)!) such sequences containing exactly
k S's.
The Binomial Distribution
p(k) = P[X = k] = C(n, k) p^k q^(n−k)   k = 0, 1, 2, 3, ..., n − 1, n

These are the terms in the expansion of (p + q)^n using the
Binomial Theorem:
(p + q)^n = C(n, 0) p⁰qⁿ + C(n, 1) p¹qⁿ⁻¹ + C(n, 2) p²qⁿ⁻² + ...
            + C(n, n) pⁿq⁰

For this reason the probability function
p(x) = P[X = x] = C(n, x) p^x q^(n−x)   x = 0, 1, 2, ..., n
is called the probability function for the Binomial distribution.
C(n, k) is the number of ways of selecting the k positions for the
S's (the remaining n − k positions are for the F's).
The Binomial Distribution
Summary
We observe a Bernoulli trial (S,F) n times. Let X denote the
number of successes in the n trials.
Then, X has a binomial distribution:
p(x) = P[X = x] = C(n, x) p^x q^(n−x)   x = 0, 1, 2, ..., n
where
1. p = the probability of success (S), and
2. q = 1 − p = the probability of failure (F)
The Binomial Distribution
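The binomial probability function is easy to compute directly with
`math.comb`; a minimal sketch (`binom_pmf` is our own name):

```python
from math import comb

def binom_pmf(x, n, p):
    # p(x) = C(n, x) p^x q^(n-x) with q = 1 - p.
    return comb(n, x) * p**x * (1 - p)**(n - x)

# For n = 5 this reproduces the table above: q^5, 5pq^4, 10p^2q^3, ...
p, n = 0.4, 5
probs = [binom_pmf(x, n, p) for x in range(n + 1)]
print(round(probs[1], 4))     # 0.2592 (= 5 * 0.4 * 0.6^4)
print(round(sum(probs), 10))  # 1.0 -- the pmf sums to one
```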
Example
If a firm announces profits and they are surprising, the chance of a
stock price increase is 85%. Assume there are n = 20 (independent)
announcements.
Let X denote the number of increases in the stock price following
surprising announcements in the n = 20 trials.
Then, X has a binomial distribution, with p = 0.85 and n = 20:
p(x) = P[X = x] = C(n, x) p^x q^(n−x)   x = 0, 1, 2, ..., n
Thus
p(x) = C(20, x) (0.85)^x (0.15)^(20−x)   x = 0, 1, 2, ..., 20
The Binomial Distribution
[Graph: p(x) for x = 0, 1, ..., 20; the probabilities are concentrated
around x = 17]

x      0       1       2       3       4       5
p(x)   0.0000  0.0000  0.0000  0.0000  0.0000  0.0000
x      6       7       8       9       10      11
p(x)   0.0000  0.0000  0.0000  0.0000  0.0002  0.0011
x      12      13      14      15      16      17
p(x)   0.0046  0.0160  0.0454  0.1028  0.1821  0.2428
x      18      19      20
p(x)   0.2293  0.1368  0.0388
The Poisson distribution
Siméon Denis Poisson (1781-1840)
Suppose events are occurring randomly and uniformly in time. The
events occur with a known average.
Let X be the number of events occurring (arrivals) in a fixed period
of time (time-interval of given length).
Typical example: X = number of crime cases coming before a
criminal court per year (Poisson's original application, in 1838.)

Then, X will have a Poisson distribution with parameter λ:
p(x) = (λ^x / x!) e^(−λ)   x = 0, 1, 2, 3, 4, ...

The parameter λ represents the expected number of occurrences in a
fixed period of time. The parameter λ is a positive real number.
The Poisson distribution
Example:
On average, a trade occurs every 15 seconds. Suppose trades are
independent. We are interested in the probability of observing 10
trades in a minute (X = 10). A Poisson distribution can be used with
λ = 4 (4 trades per minute).
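The number for this trade example can be computed directly from the
Poisson probability function; a minimal sketch (`poisson_pmf` is our
own name):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # p(x) = (lam^x / x!) e^(-lam)
    return lam**x / factorial(x) * exp(-lam)

# Probability of exactly 10 trades in a minute when lam = 4 trades/minute.
print(round(poisson_pmf(10, 4), 5))  # 0.00529
```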

Poisson probability function - Properties:

1. Σ_{x=0}^∞ p(x) = Σ_{x=0}^∞ (λ^x / x!) e^(−λ) = 1
Proof:
Σ_{x=0}^∞ (λ^x / x!) e^(−λ) = e^(−λ) (1 + λ + λ²/2! + λ³/3! + λ⁴/4! + ...)
  = e^(−λ) e^λ = 1,
using e^u = 1 + u + u²/2! + u³/3! + u⁴/4! + ...

2. If
p_Bin(x; p, n) = C(n, x) p^x (1 − p)^(n−x)
is the probability function for the Binomial distribution with
parameters n and p, let n → ∞ and p → 0 such that np = a constant
(= λ, say). Then
lim_{n→∞, p→0} p_Bin(x; p, n) = (λ^x / x!) e^(−λ) = p_Poisson(x; λ)

Proof:
p_Bin(x; p, n) = C(n, x) p^x (1 − p)^(n−x). Suppose p = λ/n, or np = λ.
Then
p_Bin(x; λ/n, n) = [n!/(x!(n − x)!)] (λ/n)^x (1 − λ/n)^(n−x)
  = (λ^x / x!) [n(n − 1)...(n − x + 1)/n^x] (1 − λ/n)^n (1 − λ/n)^(−x)
  = (λ^x / x!) (1)(1 − 1/n)(1 − 2/n)...(1 − (x − 1)/n)
      × (1 − λ/n)^n (1 − λ/n)^(−x)

Now
lim_{n→∞} p_Bin(x; λ/n, n)
  = (λ^x / x!) lim_{n→∞} {(1)(1 − 1/n)...(1 − (x − 1)/n)
      (1 − λ/n)^n (1 − λ/n)^(−x)}
  = (λ^x / x!) lim_{n→∞} (1 − λ/n)^n

Using the classic limit lim_{n→∞} (1 + u/n)^n = e^u,
lim_{n→∞} p_Bin(x; λ/n, n) = (λ^x / x!) e^(−λ) = p_Poisson(x; λ)
Note: In many applications, n is large and p is very small --and the
expectation np is not big. Then, the binomial distribution may be
approximated by the easier Poisson distribution. This is called the law
of rare events, since each of the n individual Bernoulli events rarely
occurs.
Suppose a time interval is divided into n equal parts and that one
event may or may not occur in each subinterval.

[Figure: a time interval split into n subintervals; each subinterval
either contains an event or does not]

As n → ∞, events can occur over the continuous time interval:
X = # of events is Bin(n, p)  →  X = # of events is Poisson(λ)
The Poisson distribution: Graphical Illustration
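The approximation can be seen numerically by fixing λ = np and letting
n grow; a sketch under these assumptions:

```python
from math import comb, exp, factorial

lam, x = 4.0, 3

def binom_pmf(x, n, p):
    # C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

poisson = lam**x / factorial(x) * exp(-lam)  # ~ 0.195367

# With p = lam/n held so that np = lam, the binomial pmf at x
# approaches the Poisson pmf as n grows.
for n in (10, 100, 10000):
    print(n, round(binom_pmf(x, n, lam / n), 6))
print("poisson", round(poisson, 6))
```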
The Poisson distribution: Comments
The Poisson distribution arises in connection with Poisson processes
- a stochastic process in which events occur continuously and
independently of one another.
It occurs most easily for time-events, such as the number of calls
passing through a call center per minute, or the number of visitors
passing through a turnstile per hour. However, it can apply to any
process in which the mean can be shown to be constant.
It is used in finance (number of jumps in an asset price in a given
interval); market microstructure (number of trades per unit of time in a
stock market); sports economics (number of goals in sports involving two
competing teams); insurance (number of a given disaster -volcano
eruptions/hurricanes/floods- per year); etc.

Poisson Distribution - Example: Hurricanes
The number of hurricanes over a period of a year in the Caribbean
is known to have a Poisson distribution with λ = 13.1.

Determine the probability function of X.
Compute the probability that X is at most 8.
Compute the probability that X is at least 10.
Given that at least 10 hurricanes occur, what is the probability that
X is at most 15?

Solution:
p(x) = (λ^x / x!) e^(−λ) = (13.1^x / x!) e^(−13.1)   x = 0, 1, 2, 3, 4, ...

Table of p(x):
x   p(x)       x    p(x)
0   0.000002   10   0.083887
1   0.000027   11   0.099901
2   0.000175   12   0.109059
3   0.000766   13   0.109898
4   0.002510   14   0.102833
5   0.006575   15   0.089807
6   0.014356   16   0.073530
7   0.026866   17   0.056661
8   0.043994   18   0.041237
9   0.064036   19   0.028432
Poisson Distribution - Example: Hurricanes
P[at most 8] = P[X ≤ 8] = p(0) + p(1) + ... + p(8) = .09527

P[at least 10] = P[X ≥ 10] = 1 − P[X ≤ 9]
  = 1 − (p(0) + p(1) + ... + p(9)) = .8407

P[at most 15 | at least 10] = P[X ≤ 15 | X ≥ 10]
  = P[{X ≤ 15} ∩ {X ≥ 10}] / P[X ≥ 10]
  = P[10 ≤ X ≤ 15] / P[X ≥ 10]
  = (p(10) + p(11) + ... + p(15)) / .8407 = 0.708
Poisson Distribution - Example: Hurricanes
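These hurricane probabilities can be verified directly (a sketch; the
helper `p` mirrors the probability function above):

```python
from math import exp, factorial

lam = 13.1

def p(x):
    # Poisson probability function with lam = 13.1.
    return lam**x / factorial(x) * exp(-lam)

at_most_8 = sum(p(k) for k in range(9))         # P[X <= 8]
at_least_10 = 1 - sum(p(k) for k in range(10))  # P[X >= 10]
cond = sum(p(k) for k in range(10, 16)) / at_least_10

print(round(at_most_8, 5))    # 0.09527
print(round(at_least_10, 4))  # 0.8407
print(round(cond, 3))         # 0.708
```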
