
37161: Probability and

Random Variables

Lecture 7

UTS CRICOS PROVIDER CODE: 00099F UTS:SCIENCE


science.uts.edu.au
LECTURE 6 RECAP
• Poisson processes
  - Merging of independent Poisson processes
  - Splitting of Poisson processes
  - Relative arrival times of independent Poisson processes
• Exponential variables
  - The no-memory (memoryless) property

SUMS OF RANDOM VARIABLES
• Consider now two independent random variables which arise from flipping two fair coins a fixed number of times.

• If Coin A is flipped 5 times, the number of times it lands Tails is N_A ~ Bin(5, 0.5).

• If Coin B is flipped 10 times, the number of times it lands Tails is N_B ~ Bin(10, 0.5).

• What is the distribution of the total number of times either of the two coins lands Tails, N_A + N_B?

• It is easy to see that N_A + N_B ~ Bin(15, 0.5), but proving this is less trivial.
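As a sanity check (not in the original slides), the convolution of the two pmfs can be compared numerically with Bin(15, 0.5); a minimal Python sketch:

```python
from math import comb

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(N_A + N_B = k) = sum_j P(N_A = j) * P(N_B = k - j)
conv = [sum(binom_pmf(5, 0.5, j) * binom_pmf(10, 0.5, k - j)
            for j in range(max(0, k - 10), min(5, k) + 1))
        for k in range(16)]

direct = [binom_pmf(15, 0.5, k) for k in range(16)]
print(all(abs(a - b) < 1e-12 for a, b in zip(conv, direct)))  # True
```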

SUMS OF RANDOM VARIABLES
• We have previously seen that the distribution of a sum of random variables can be found via convolutions.

• For example,
  P(N_A + N_B = k) = Σ_{j=0}^{k} P(N_A = j) P(N_B = k-j)
                   = Σ_{j=0}^{k} [5! / ((5-j)! j!)] 0.5^j 0.5^{5-j} × [10! / ((10-(k-j))! (k-j)!)] 0.5^{k-j} 0.5^{10-(k-j)}

• This is, however, not easy to work with, and justifying that the above expression simplifies to
  [15! / (k! (15-k)!)] 0.5^{15}
  requires knowing several identities regarding the summation of binomial coefficients.

• Adding more than two random variables is even messier. For example, for independent variables X_1, X_2, X_3, X_4, the probability that X_1 + X_2 + X_3 + X_4 = k requires summing over all the ways this could happen:
  P(X_1 + X_2 + X_3 + X_4 = k) = Σ_{n=0}^{k} Σ_{m=0}^{k-n} Σ_{j=0}^{k-m-n} P(X_1 = j) P(X_2 = m) P(X_3 = n) P(X_4 = k-j-m-n)

GENERATING FUNCTIONS
• Quite clearly, the convolution approach is not practical for any large sum of random variables.

• Instead, many such calculations can be done more easily with generating functions.

• These are transformations of the probability mass function or probability density function of the variable, such that some key properties of the variable can still be recovered.

• The generating function of a random variable X is defined as g_X(z) = E(z^X).

• If X is discrete, g_X(z) = E(z^X) = Σ_k P(X = k) z^k. If X is continuous, g_X(z) = E(z^X) = ∫ z^x f(x) dx.

GENERATING FUNCTIONS: EXAMPLE
• Consider rolling one regular fair six-sided die.

• The probability mass function for this is
  P(X = k) = 1/6 for k = 1, 2, 3, 4, 5, 6, and 0 otherwise.

• The generating function is therefore
  g_X(z) = E(z^X) = Σ_k P(X = k) z^k = (1/6)z + (1/6)z^2 + (1/6)z^3 + (1/6)z^4 + (1/6)z^5 + (1/6)z^6
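This polynomial can be evaluated directly; a minimal Python sketch using exact fractions (the evaluation points are arbitrary choices):

```python
from fractions import Fraction

# pmf of a fair six-sided die
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def g(z):
    """g_X(z) = E(z^X) = sum_k P(X = k) * z^k."""
    return sum(p * z**k for k, p in pmf.items())

# g_X(1) must equal 1, since the pmf sums to 1
print(g(1))                # 1
print(g(Fraction(1, 2)))   # 21/128
```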

EXPECTATION AND VARIANCE
• If given the generating function of a variable, how do we obtain the expectation or variance of the underlying variable?

• We know that g_X(z) = E(z^X) = Σ_k P(X = k) z^k. Differentiating once gives
  g_X′(z) = Σ_k k P(X = k) z^{k-1}.

• E(X) = Σ_k k P(X = k), so this is obtained by setting z = 1:
  g_X′(1) = Σ_k k P(X = k) (1)^{k-1} = E(X).

• Similarly, differentiating a second time gives
  g_X″(z) = Σ_{k≥0} k(k-1) P(X = k) z^{k-2} = Σ_{k≥0} k^2 P(X = k) z^{k-2} - Σ_{k≥0} k P(X = k) z^{k-2}

• Again, setting z = 1 gives
  g_X″(1) = Σ_k k^2 P(X = k) - Σ_k k P(X = k) = E(X^2) - E(X)

• Together, we have E(X) = g_X′(1) and Var(X) = E(X^2) - E(X)^2 = g_X″(1) + g_X′(1) - g_X′(1)^2
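Applied to the fair-die example from earlier, these formulas give E(X) = 7/2 and Var(X) = 35/12; a short Python check using exact arithmetic:

```python
from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}  # fair die

# g'(1) = sum_k k P(X = k),  g''(1) = sum_k k(k-1) P(X = k)
g1 = sum(k * p for k, p in pmf.items())
g2 = sum(k * (k - 1) * p for k, p in pmf.items())

E = g1                     # E(X) = g'(1)
Var = g2 + g1 - g1**2      # Var(X) = g''(1) + g'(1) - g'(1)^2
print(E, Var)              # 7/2 35/12
```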

GENERATING FUNCTIONS: SUMS OF VARIABLES
• Returning now to the problem of summing random variables, let X_1, X_2 be independent random variables both taking non-negative integer values.

• Let Y = X_1 + X_2; then P(Y = k) = Σ_{j=0}^{k} P(X_1 = j) P(X_2 = k-j).

• Now, consider the generating function of Y:
  g_Y(z) = Σ_{k=0}^{∞} P(Y = k) z^k = Σ_{k=0}^{∞} [ Σ_{j=0}^{k} P(X_1 = j) P(X_2 = k-j) ] z^k

• Since X_1 and X_2 are independent, we can split the powers of z and swap the order of summation:
  g_Y(z) = Σ_{k=0}^{∞} Σ_{j=0}^{k} P(X_1 = j) z^j P(X_2 = k-j) z^{k-j}
         = Σ_{j=0}^{∞} P(X_1 = j) z^j Σ_{k=j}^{∞} P(X_2 = k-j) z^{k-j}
         = g_{X_1}(z) g_{X_2}(z)

• That is, for Y = X_1 + X_2, g_Y(z) = g_{X_1}(z) g_{X_2}(z).
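Multiplying generating functions multiplies polynomials, and multiplying polynomials convolves their coefficient lists; a sketch using two fair dice (an example not in the slides):

```python
def convolve(p, q):
    """Coefficients of the product of two polynomials (pmfs as coefficient lists)."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

die = [0] + [1 / 6] * 6         # index k holds P(X = k), k = 0..6
two_dice = convolve(die, die)   # pmf of X1 + X2

print(round(two_dice[7], 4))    # P(sum = 7) = 6/36, prints 0.1667
```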

GENERATING FUNCTIONS: SUMS OF VARIABLES
• The fact that we can compute the distribution of a sum of independent random variables through multiplication of their generating functions is perhaps unsurprising, since g_Y(z) = E(z^{X_1 + X_2}) = E(z^{X_1}) E(z^{X_2}) by independence.

• This also gives us simple and quick methods of verifying relationships we have already seen.

• Let X_1, ..., X_n ~ Poi(λ) be n independent Poisson variables.

• Each of these has generating function
  g_{X_i}(z) = E(z^{X_i}) = Σ_{k=0}^{∞} z^k e^{-λ} λ^k / k! = Σ_{k=0}^{∞} e^{-λ} (zλ)^k / k!

• We know Σ_{k=0}^{∞} e^{-λ} λ^k / k! = 1 (since it is a probability mass function), hence Σ_{k=0}^{∞} λ^k / k! = e^{λ}, so Σ_{k=0}^{∞} e^{-λ} (zλ)^k / k! = e^{-λ} e^{zλ} = e^{λ(z-1)}

GENERATING FUNCTIONS: SUMS OF VARIABLES
λ ( z 1)
 We now know that if X i ~ Poi ( λ) , its generating function is g X i ( z )  e .

 So, for Y  X1  ...  X n , we have gY ( z )  E ( zY )  E ( z X1... X n )  E ( z X1 )E ( z X 2 )...E ( z X n ) .

λ( z 1)
 Each of E ( z X1 ), E ( z X 2 ),..., E ( z X n ) is the same function, equal to g Xi (z)  e .

n n
The generating function of Y is therefore gY ( z )  g X i ( z )  e   e nλ( z 1)
λ ( z 1)

 We therefore have that, if X i ~ Poi ( λ) and each variable is independent, then Y  X1  ...  X n ~ Poi (nλ)

 This is equivalent to (but simpler in its derivation) what we saw in Lecture 6 regarding merging
independent Poisson processes.
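This conclusion can be checked numerically by convolving n truncated Poi(λ) pmfs and comparing with Poi(nλ); the truncation point below is an arbitrary choice large enough that the ignored tail mass is negligible:

```python
from math import exp, factorial

def poi_pmf(mu, k):
    """P(X = k) for X ~ Poi(mu)."""
    return exp(-mu) * mu**k / factorial(k)

lam, n = 1.2, 3
size = 60   # truncation point; P(X >= size) is negligible here

p = [poi_pmf(lam, k) for k in range(size)]
total = p
for _ in range(n - 1):   # repeated convolution gives the pmf of X1 + ... + Xn
    total = [sum(total[j] * p[k - j] for j in range(k + 1)) for k in range(size)]

# The sum of n independent Poi(lam) variables should be Poi(n * lam)
assert all(abs(total[k] - poi_pmf(n * lam, k)) < 1e-10 for k in range(30))
```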

GENERATING FUNCTIONS: SUMS OF VARIABLES
• Let T_1, ..., T_n ~ Bern(p) be n independent Bernoulli variables.

• Each of these has generating function g_{T_i}(z) = E(z^{T_i}) = z^0 P(T_i = 0) + z^1 P(T_i = 1) = (1-p) + pz.

• S = T_1 + ... + T_n is therefore a binomial variable, S ~ Bin(n, p).

• What is the generating function of S, g_S(z)?

• As S is the sum of n independent Bernoulli variables, its generating function is the product of the n generating functions of these variables, g_S(z) = Π_{i=1}^{n} g_{T_i}(z).

• We therefore have g_S(z) = ((1-p) + pz)^n.
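Expanding ((1-p) + pz)^n and reading off the coefficient of z^k should recover the Bin(n, p) pmf; a sketch with arbitrarily chosen n and p:

```python
from math import comb

n, p = 6, 0.3

# Coefficients of ((1-p) + p z)^n via repeated polynomial multiplication
coeffs = [1.0]
for _ in range(n):
    nxt = [0.0] * (len(coeffs) + 1)
    for k, c in enumerate(coeffs):
        nxt[k] += c * (1 - p)    # constant term of (1-p) + p z
        nxt[k + 1] += c * p      # z term
    coeffs = nxt

# Coefficient of z^k should be the Bin(n, p) pmf at k
for k in range(n + 1):
    assert abs(coeffs[k] - comb(n, k) * p**k * (1 - p)**(n - k)) < 1e-12
```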

GENERATING FUNCTIONS: GEOMETRIC VARIABLES
• Let X ~ Geo(p) be a geometric random variable.

• The probability mass function is therefore
  P(X = k) = (1-p)^{k-1} p for k = 1, 2, 3, ..., and 0 otherwise.

• This gives the generating function
  g_X(z) = E(z^X) = Σ_{k=1}^{∞} z^k P(X = k) = Σ_{k=1}^{∞} z^k (1-p)^{k-1} p
         = zp + z^2 p(1-p) + z^3 p(1-p)^2 + z^4 p(1-p)^3 + ...

• This is a geometric series with first term zp and common ratio (1-p)z.

• Provided z is chosen such that |(1-p)z| < 1, this sums to give g_X(z) = zp / (1 - (1-p)z).

GENERATING FUNCTIONS: EXPONENTIAL VARIABLES
• Let Y ~ Exp(λ) be an exponential random variable.

• The probability density function is therefore
  f(y) = λe^{-λy} for y ∈ [0, ∞), and 0 otherwise.

• This gives the generating function
  g_Y(z) = E(z^Y) = ∫_0^∞ z^y f(y) dy = ∫_0^∞ z^y λe^{-λy} dy

• In order to evaluate this, we need to combine z^y e^{-λy} into a single exponential.

• As, for any positive x, x = e^{ln(x)}, we can write z^y = e^{y ln(z)}.

• This gives
  g_Y(z) = E(z^Y) = ∫_0^∞ e^{y ln(z)} λe^{-λy} dy = λ ∫_0^∞ e^{-y(λ - ln(z))} dy = λ / (λ - ln(z))

• The generating function of Y ~ Exp(λ) is therefore g_Y(z) = λ / (λ - ln(z)), valid whenever ln(z) < λ.
SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
• For many applications, we are interested in the sum of a number of random variables where the number of variables to be summed is itself random.

• For example, the total annual payout for an insurance company varies according to two quantities: the size of each individual claim and the total number of claims.

• The total payout is the sum of the individual claims, summed over an uncertain number of claims.

SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
• Consider now the problem where N is a random variable taking non-negative integer values and X_1, X_2, ..., X_N are independent, identically distributed random variables.

• Let S_N be the sum of these variables, i.e. S_N = Σ_{k=1}^{N} X_k.

• The generating function of S_N is therefore g_{S_N}(z) = E(z^{S_N}).

• If we knew the value of N, say N = k, this would be easy to evaluate. Since X_1, X_2, ..., X_N all have the same distribution, we would simply be multiplying k identical generating functions: g_{S_N}(z) = (g_{X_i}(z))^k
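For a fixed N = k this is an ordinary product; e.g. for k fair dice (a hypothetical choice of the X_i), g_{S_k}(z) computed from the convolved pmf matches g_X(z)^k:

```python
die = [0] + [1 / 6] * 6   # pmf of a fair die as polynomial coefficients

def convolve(a, b):
    """Coefficients of the product of two polynomials."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

k, z = 3, 0.9
pmf = [1.0]                   # sum of zero dice: point mass at 0
for _ in range(k):
    pmf = convolve(pmf, die)  # pmf of the sum of k dice

lhs = sum(p * z**s for s, p in enumerate(pmf))        # g_{S_k}(z) directly
rhs = sum(p * z**s for s, p in enumerate(die)) ** k   # g_X(z)^k
print(abs(lhs - rhs) < 1e-12)  # True
```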

SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
• We know that for variables A and B_1, ..., B_n, if the B_i variables form a partition (i.e. exactly one of them must happen), then P(A) = Σ_{i=1}^{n} P(A ∩ B_i) = Σ_{i=1}^{n} P(A | B_i) P(B_i).

• Applying this result to g_{S_N}(z), we get
  g_{S_N}(z) = E(z^{S_N}) = Σ_{k=0}^{∞} E(z^{S_N} | N = k) P(N = k) = Σ_{k=0}^{∞} (g_{X_i}(z))^k P(N = k)

• Now, the last expression is itself a generating function applied to a generating function. In general, for any non-negative discrete variable Q, g_Q(z) = E(z^Q) = Σ_{k=0}^{∞} z^k P(Q = k), so we therefore have g_{S_N}(z) = g_N(g_{X_i}(z)).

• In other words, if we are adding N independent identically distributed variables X_1, X_2, ..., X_N, then the generating function of the sum is equal to the generating function of N evaluated where z equals the generating function of each of the X_i variables.
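A sketch verifying g_{S_N}(z) = g_N(g_X(z)) for one concrete, assumed choice: N ~ Bin(3, 0.5) and each X_i a fair die. The pmf of S_N is built directly by conditioning on N, then its generating function is compared with the composition:

```python
from math import comb

m, q = 3, 0.5             # N ~ Bin(m, q)
die = [0] + [1 / 6] * 6   # pmf of each X_i

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

# pmf of S_N by conditioning on N = k (k-fold convolution of the die pmf)
pmf_S = {}
power = [1.0]             # 0-fold convolution: point mass at 0
for k in range(m + 1):
    pN = comb(m, k) * q**k * (1 - q)**(m - k)
    for s, prob in enumerate(power):
        pmf_S[s] = pmf_S.get(s, 0.0) + pN * prob
    power = convolve(power, die)

z = 0.8
gX = sum(p * z**k for k, p in enumerate(die))
lhs = sum(p * z**s for s, p in pmf_S.items())   # g_{S_N}(z) directly
rhs = ((1 - q) + q * gX)**m                     # g_N(g_X(z))
print(abs(lhs - rhs) < 1e-12)  # True
```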

EXAMPLE: POISSON HEN
• Consider a hen which lays N eggs, where N ~ Poi(λ), and each egg hatches to produce one chicken with probability p, independently of all other eggs.

• The number of chickens from each single egg is therefore X_i ~ Bern(p).

• What is the distribution of the number of chickens hatching from all eggs, S_N?

• We already know that g_{X_i}(z) = (1-p) + pz and g_N(z) = e^{λ(z-1)}.

• These therefore give
  g_{S_N}(z) = g_N(g_{X_i}(z)) = e^{λ(g_{X_i}(z) - 1)} = e^{λ((1-p) + pz - 1)} = e^{pλ(z-1)}

• Since the generating function of S_N is e^{pλ(z-1)}, we can recognise this as a Poisson variable: S_N ~ Poi(pλ)
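This thinning result can be checked directly, without simulation, by conditioning on the number of eggs N and truncating the series; λ and p below are arbitrary choices:

```python
from math import comb, exp, factorial

lam, p = 3.0, 0.4

def poi(mu, k):
    """P(X = k) for X ~ Poi(mu)."""
    return exp(-mu) * mu**k / factorial(k)

# P(S_N = k) = sum_n P(N = n) * P(Bin(n, p) = k), truncating the sum over n
def chickens_pmf(k):
    return sum(poi(lam, n) * comb(n, k) * p**k * (1 - p)**(n - k)
               for n in range(k, 120))

# Thinning a Poisson: S_N ~ Poi(p * lam)
assert all(abs(chickens_pmf(k) - poi(p * lam, k)) < 1e-12 for k in range(20))
```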
