Random Variables
Lecture 7
Exponential variables
No memory property
science.uts.edu.au
SUMS OF RANDOM VARIABLES
Consider now two independent random variables which arise from flipping two fair coins a fixed number of times: N_A counts the Tails in 5 flips of one coin, and N_B counts the Tails in 10 flips of the other.
What is the distribution of the total number of times either of the two coins lands Tails, N_A + N_B?
SUMS OF RANDOM VARIABLES
We have previously seen that the distribution of a sum of random variables can be found via convolution.

For example,

P(N_A + N_B = k) = Σ_{j=0}^{k} P(N_A = j) P(N_B = k - j)

= Σ_{j=0}^{k} [5! / ((5 - j)! j!)] 0.5^j 0.5^{5-j} · [10! / ((10 - (k - j))! (k - j)!)] 0.5^{k-j} 0.5^{10-(k-j)}

This is, however, not easy to work with, and justifying that the above expression simplifies to

[15! / (k! (15 - k)!)] 0.5^{15}

requires knowing several identities regarding sums of binomial coefficients.
Adding more than two random variables is even messier. For example, for independent variables X_1, X_2, X_3, X_4, the probability that X_1 + X_2 + X_3 + X_4 = k requires summing over all the ways this could happen:

P(X_1 + X_2 + X_3 + X_4 = k) = Σ_{n=0}^{k} Σ_{m=0}^{k-n} Σ_{j=0}^{k-m-n} P(X_1 = j) P(X_2 = m) P(X_3 = n) P(X_4 = k - j - m - n)
GENERATING FUNCTIONS
Quite clearly, the convolution approach is not practical for any large sum of random variables. Generating functions give us a far more convenient tool.
These are transformations of the probability mass function or probability density function of the variable, such that some key properties of the variable can still be recovered.
GENERATING FUNCTIONS: EXAMPLE
Consider rolling one regular fair six-sided die.

The probability mass function for this is

P(X = k) = 1/6 for k = 1, 2, 3, 4, 5, 6
         = 0   otherwise

The generating function is therefore

g_X(z) = E(z^X) = Σ_k P(X = k) z^k = (1/6)z + (1/6)z^2 + (1/6)z^3 + (1/6)z^4 + (1/6)z^5 + (1/6)z^6
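As a quick sanity check (not part of the original slides), the die's generating function can be evaluated numerically by storing the PMF as polynomial coefficients; in particular every PGF satisfies g(1) = 1:

```python
from fractions import Fraction

# PGF of a fair six-sided die, stored as polynomial coefficients:
# coeffs[k] = P(X = k), so g_X(z) = sum_k coeffs[k] * z^k.
coeffs = [Fraction(0)] + [Fraction(1, 6)] * 6  # k = 0 has probability 0

def g(z, coeffs):
    """Evaluate the generating function g(z) = sum_k P(X = k) z^k."""
    return sum(c * z**k for k, c in enumerate(coeffs))

# Any PGF satisfies g(1) = 1, since the probabilities sum to 1.
print(g(1, coeffs))               # -> 1
print(g(Fraction(1, 2), coeffs))  # -> 21/128
```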
EXPECTATION AND VARIANCE
If given the generating function of a variable, how do we obtain the expectation or variance of the
underlying variable?
We know that g_X(z) = E(z^X) = Σ_k P(X = k) z^k. Differentiating once gives

dg_X(z)/dz = Σ_k k P(X = k) z^{k-1}

so setting z = 1 gives g_X'(1) = Σ_k k P(X = k) = E(X).

Similarly, differentiating a second time gives

d²g_X(z)/dz² = Σ_{k≥0} k(k - 1) P(X = k) z^{k-2} = Σ_{k≥0} k² P(X = k) z^{k-2} - Σ_{k≥0} k P(X = k) z^{k-2}

so g_X''(1) = E(X²) - E(X), and hence Var(X) = g_X''(1) + g_X'(1) - (g_X'(1))².
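These derivative formulas can be checked directly for the die example. The sketch below (an illustration, not from the slides) evaluates g'(1) and g''(1) from the PMF coefficients and recovers the familiar mean 7/2 and variance 35/12 of a fair die:

```python
from fractions import Fraction

# PMF of a fair six-sided die: coeffs[k] = P(X = k).
coeffs = [Fraction(0)] + [Fraction(1, 6)] * 6

# g'(1)  = sum_k k P(X = k)          = E(X)
# g''(1) = sum_k k (k - 1) P(X = k)  = E(X^2) - E(X)
g1 = sum(k * c for k, c in enumerate(coeffs))
g2 = sum(k * (k - 1) * c for k, c in enumerate(coeffs))

mean = g1                   # E(X) = 7/2
var = g2 + g1 - g1**2       # Var(X) = g''(1) + g'(1) - g'(1)^2 = 35/12
print(mean, var)            # -> 7/2 35/12
```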
GENERATING FUNCTIONS: SUMS OF VARIABLES
Returning now to the problem of summing random variables, let X1, X 2 be independent random variables
both taking non-negative integer values.
Let Y = X_1 + X_2; then

P(Y = k) = Σ_{j=0}^{k} P(X_1 = j) P(X_2 = k - j).

Now, consider the generating function of Y:

g_Y(z) = Σ_{k=0}^{∞} P(Y = k) z^k = Σ_{k=0}^{∞} ( Σ_{j=0}^{k} P(X_1 = j) P(X_2 = k - j) ) z^k = g_{X_1}(z) g_{X_2}(z)

since the coefficient of z^k in the product g_{X_1}(z) g_{X_2}(z) is exactly the inner convolution sum.
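This "convolution equals polynomial multiplication" fact can be verified numerically. The sketch below (illustrative, using the two coins from the earlier slide) convolves the PMFs of Bin(5, 0.5) and Bin(10, 0.5) and checks the result against the claimed Bin(15, 0.5) distribution:

```python
from math import comb

def convolve(p, q):
    """PMF of X1 + X2 for independent X1, X2 with PMFs p, q
    (p[j] = P(X1 = j)); this is exactly polynomial multiplication."""
    out = [0.0] * (len(p) + len(q) - 1)
    for j, pj in enumerate(p):
        for m, qm in enumerate(q):
            out[j + m] += pj * qm
    return out

# The two coins from earlier: N_A ~ Bin(5, 0.5), N_B ~ Bin(10, 0.5).
pA = [comb(5, k) * 0.5**5 for k in range(6)]
pB = [comb(10, k) * 0.5**10 for k in range(11)]
pSum = convolve(pA, pB)

# Direct check against Bin(15, 0.5), as claimed on the earlier slide.
p15 = [comb(15, k) * 0.5**15 for k in range(16)]
print(max(abs(a - b) for a, b in zip(pSum, p15)))  # tiny (float rounding)
```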
GENERATING FUNCTIONS: SUMS OF VARIABLES
The fact that we can calculate sums of independent random variables through multiplication of their generating functions is perhaps unsurprising, since g_Y(z) = E(z^{X_1 + X_2}) = E(z^{X_1}) E(z^{X_2}).
This also gives us simple and quick methods of verifying relationships we have already seen.
For example, let X_1, …, X_n ~ Poi(λ) be independent Poisson variables and let Y = X_1 + … + X_n.

Each of these has generating function

g_{X_i}(z) = E(z^{X_i}) = Σ_{k=0}^{∞} (e^{-λ} λ^k / k!) z^k = Σ_{k=0}^{∞} e^{-λ} (zλ)^k / k!

We know Σ_{k=0}^{∞} e^{-λ} λ^k / k! = 1 (since it is a probability mass function), hence Σ_{k=0}^{∞} λ^k / k! = e^{λ}, so Σ_{k=0}^{∞} e^{-λ} (zλ)^k / k! = e^{-λ} e^{zλ} = e^{λ(z-1)}.
GENERATING FUNCTIONS: SUMS OF VARIABLES
We now know that if X_i ~ Poi(λ), its generating function is g_{X_i}(z) = e^{λ(z-1)}.
Each of E(z^{X_1}), E(z^{X_2}), …, E(z^{X_n}) is the same function, equal to g_{X_i}(z) = e^{λ(z-1)}.

The generating function of Y is therefore g_Y(z) = Π_{i=1}^{n} g_{X_i}(z) = (e^{λ(z-1)})^n = e^{nλ(z-1)}

We therefore have that, if X_i ~ Poi(λ) and each variable is independent, then Y = X_1 + … + X_n ~ Poi(nλ).
This is equivalent to (but simpler in its derivation) what we saw in Lecture 6 regarding merging
independent Poisson processes.
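This closure property can also be checked by brute force. The sketch below (illustrative parameter values, not from the slides) convolves n truncated Poi(λ) PMFs and compares the result with the Poi(nλ) PMF:

```python
from math import exp, factorial

def poi_pmf(lam, K):
    """Truncated Poisson(lam) PMF on {0, ..., K}."""
    return [exp(-lam) * lam**k / factorial(k) for k in range(K + 1)]

def convolve(p, q):
    """PMF of a sum of two independent variables (polynomial product)."""
    out = [0.0] * (len(p) + len(q) - 1)
    for j, pj in enumerate(p):
        for m, qm in enumerate(q):
            out[j + m] += pj * qm
    return out

lam, n, K = 1.5, 4, 60   # illustrative values, not from the slides
s = poi_pmf(lam, K)
for _ in range(n - 1):
    s = convolve(s, poi_pmf(lam, K))

# Compare with the claimed Poi(n * lam) distribution on {0, ..., K}.
target = poi_pmf(n * lam, K)
err = max(abs(s[k] - target[k]) for k in range(K + 1))
print(err)  # tiny: only float rounding remains
```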
GENERATING FUNCTIONS: SUMS OF VARIABLES
Let T_1, …, T_n ~ Bern(p) be n independent Bernoulli variables, and let S = T_1 + … + T_n.
As S is the sum of n independent Bernoulli variables, its generating function is the product of the n generating functions of these variables, g_S(z) = Π_{i=1}^{n} g_{T_i}(z).
Since each g_{T_i}(z) = (1 - p) + pz, this gives g_S(z) = ((1 - p) + pz)^n, the generating function of a Bin(n, p) variable.
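Using the standard Bernoulli generating function g_T(z) = (1 - p) + pz, the binomial connection can be confirmed by expanding the n-th power of this polynomial and reading off its coefficients (a sketch with illustrative values):

```python
from math import comb

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

n, p = 6, 0.3              # illustrative values, not from the slides
gT = [1 - p, p]            # coefficients of (1 - p) + p z
gS = [1.0]
for _ in range(n):
    gS = poly_mul(gS, gT)  # g_S(z) = g_T(z)^n

# Coefficient of z^k in g_S should be the Bin(n, p) PMF at k.
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(max(abs(a - b) for a, b in zip(gS, binom)))  # tiny (float rounding)
```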
GENERATING FUNCTIONS: GEOMETRIC VARIABLES
Let X ~ Geo(p) be a Geometric random variable.

The probability mass function is therefore

P(X = k) = (1 - p)^{k-1} p for k = 1, 2, 3, …
         = 0 otherwise

This gives the generating function g_X(z) = E(z^X) = Σ_{k=1}^{∞} z^k P(X = k) = Σ_{k=1}^{∞} z^k (1 - p)^{k-1} p.

Provided z is chosen such that (1 - p)z < 1, this geometric series sums to give g_X(z) = zp / (1 - (1 - p)z).
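The closed form can be sanity-checked against a partial sum of the series (illustrative values, chosen so that (1 - p)z < 1):

```python
p, z = 0.4, 0.9          # illustrative values; note (1 - p) * z = 0.54 < 1

# Partial sum of E(z^X) = sum_{k>=1} z^k (1 - p)^(k-1) p;
# the tail decays geometrically, so 200 terms is far more than enough.
partial = sum(z**k * (1 - p)**(k - 1) * p for k in range(1, 200))
closed = z * p / (1 - (1 - p) * z)
print(abs(partial - closed))  # negligible
```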
GENERATING FUNCTIONS: EXPONENTIAL VARIABLES
Let Y ~ Exp(λ) be an exponential random variable.

The probability density function is therefore

f(y) = λ e^{-λy} for y ∈ [0, ∞)
     = 0 otherwise

This gives the generating function

g_Y(z) = E(z^Y) = ∫_0^∞ z^y f(y) dy = ∫_0^∞ z^y λ e^{-λy} dy = ∫_0^∞ e^{y ln(z)} λ e^{-λy} dy = λ ∫_0^∞ e^{-y(λ - ln(z))} dy = λ / (λ - ln(z))

The generating function of Y ~ Exp(λ) is therefore g_Y(z) = λ / (λ - ln(z)), provided ln(z) < λ so that the integral converges.
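The integral above can be checked numerically; the sketch below (illustrative values, stdlib only) approximates E(z^Y) with a midpoint rule on a truncated range and compares it with λ / (λ - ln z):

```python
from math import exp, log

lam, z = 2.0, 1.5        # illustrative; requires ln(z) < lam

# Numerically integrate E(z^Y) = ∫_0^∞ z^y · lam e^(-lam·y) dy,
# truncated at y = 40 where the integrand is negligible.
N, top = 200_000, 40.0
h = top / N
integral = h * sum(
    exp(y * log(z)) * lam * exp(-lam * y)
    for y in (i * h + h / 2 for i in range(N))   # midpoint rule
)
closed = lam / (lam - log(z))
print(abs(integral - closed))  # small discretisation error
```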
SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
For many applications, we are interested in the sum of a number of random variables where the number
of variables to be summed is itself random.
For example, the total annual payouts for an insurance company varies according to two variables – the
average size of a claim and the total number of individual claims.
The total payout is the sum of each individual claim, summed over an uncertain number of claims.
SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
Consider now the problem where N is a random variable taking non-negative integer values and
X1, X 2,..., X N are independent, identically distributed random variables.
Let S_N be the sum of these variables, i.e. S_N = Σ_{k=1}^{N} X_k.

If we knew the value of N, say N = k, this would be easy to evaluate. Since X_1, X_2, …, X_N all have the same distribution, we would simply be multiplying k identical generating functions: g_{S_N}(z) = (g_{X_i}(z))^k.
SUM OF A RANDOM NUMBER OF RANDOM VARIABLES
We know that for variables A and B_1, …, B_n, if the B_i events form a partition (i.e. exactly one of them must happen), then

P(A) = Σ_{i=1}^{n} P(A ∩ B_i) = Σ_{i=1}^{n} P(A | B_i) P(B_i).

Applying this result to g_{S_N}(z), conditioning on the value of N, we get

g_{S_N}(z) = E(z^{S_N}) = Σ_{k=0}^{∞} E(z^{S_N} | N = k) P(N = k) = Σ_{k=0}^{∞} (g_{X_i}(z))^k P(N = k)

Now, the last term is itself a generating function applied to a generating function. In general, for any non-negative discrete variable Q, g_Q(z) = E(z^Q) = Σ_{k=0}^{∞} z^k P(Q = k), so we therefore have that g_{S_N}(z) = g_N(g_{X_i}(z)).
In other words, if we are adding N independent identically distributed variables X1, X 2,..., X N then the
generating function of the sum is equal to the generating function of N evaluated when z equals the
generating function of each of the X i variables.
EXAMPLE: POISSON HEN
Consider a hen which lays N eggs, where N ~ Poi ( λ) and each egg hatches
to produce one chicken with probability p, independent of all other eggs.
What is the distribution of the number of chickens hatching from all eggs, SN ?
Here S_N = T_1 + … + T_N with T_i ~ Bern(p), so g_{S_N}(z) = g_N(g_{T_i}(z)) = e^{λ((1 - p + pz) - 1)} = e^{pλ(z-1)}. Since the generating function of S_N is e^{pλ(z-1)}, we can recognise this as a Poisson variable, S_N ~ Poi(pλ).
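The hen's thinning result can be confirmed directly from the two PMFs, without generating functions. The sketch below (illustrative parameter values) computes P(S_N = s) = Σ_k P(N = k) P(Bin(k, p) = s) with the k-sum truncated, and compares it with the Poi(pλ) PMF:

```python
from math import exp, factorial, comb

lam, p = 3.0, 0.4        # illustrative values, not from the slides

def poi(mu, k):
    """Poisson(mu) PMF at k."""
    return exp(-mu) * mu**k / factorial(k)

def chicks(s, K=100):
    """P(S_N = s): condition on the number of eggs N = k, truncated at K."""
    return sum(
        poi(lam, k) * comb(k, s) * p**s * (1 - p)**(k - s)
        for k in range(s, K + 1)
    )

# Compare with the claimed Poi(p * lam) distribution.
err = max(abs(chicks(s) - poi(p * lam, s)) for s in range(20))
print(err)  # tiny: truncation plus float rounding
```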