Cumulative distribution function of a joint distribution: F(x, y) = P[(X ≤ x) ∩ (Y ≤ y)]. In the continuous case, F(x, y) = ∫_{s=−∞}^{x} ∫_{t=−∞}^{y} f(s, t) dt ds, and ∂²/∂x∂y F(x, y) = f(x, y).
Marginal distribution of X: the probability function of the marginal distribution of X is f_X(x) = Σ_y f(x, y) in the discrete case, and the density function is f_X(x) = ∫_{−∞}^{∞} f(x, y) dy in the continuous case. The density function for the marginal distribution of Y is found in a similar way; f_Y(y) is equal to either f_Y(y) = Σ_x f(x, y) or f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.
If the cumulative distribution function of the joint distribution of X and Y is F(x, y), then F_X(x) = lim_{y→∞} F(x, y) and F_Y(y) = lim_{x→∞} F(x, y).
This can be extended to define the marginal distribution of any one variable (or subcollection of variables) in a multivariate distribution.
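The discrete marginal sums above can be sketched in a few lines. The joint probability table below is hypothetical (chosen for illustration only, not taken from the text):

```python
# Marginal probability functions from a discrete joint distribution.
# The joint probability table is a hypothetical example.
f = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

x_values = sorted({x for (x, _) in f})
y_values = sorted({y for (_, y) in f})

# f_X(x) = sum over y of f(x, y);  f_Y(y) = sum over x of f(x, y)
f_X = {x: sum(f[(x, y)] for y in y_values) for x in x_values}
f_Y = {y: sum(f[(x, y)] for x in x_values) for y in y_values}
```

Each marginal probability function sums to 1, as it must.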
Independence of random variables X and Y: Random variables X and Y with cumulative distribution functions F_X(x) and F_Y(y) are said to be independent (or stochastically independent) if the cumulative distribution function of the joint distribution F(x, y) can be factored in the form F(x, y) = F_X(x) · F_Y(y) for all (x, y). This definition can be extended to a multivariate distribution of more than 2 variables.
If X and Y are independent, then f(x, y) = f_X(x) · f_Y(y). Conversely, if the joint probability or density function can be factored in the form f(x, y) = g(x) · h(y), then X and Y are usually, but not always, independent (care is needed when the region of non-zero density is not rectangular).
Conditional distribution of Y given X = x: Suppose that the random variables X and Y have joint density/probability function f(x, y), and the density/probability function of the marginal distribution of X is f_X(x). The density/probability function of the conditional distribution of Y given X = x is f_{Y|X}(y | X = x) = f(x, y) / f_X(x), if f_X(x) > 0.
The conditional expectation of Y given X = x is E[Y | X = x] = ∫_{−∞}^{∞} y · f_{Y|X}(y | X = x) dy in the continuous case, and E[Y | X = x] = Σ_y y · f_{Y|X}(y | X = x) in the discrete case.
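For a discrete distribution, the conditional probability function and conditional expectation reduce to a ratio and a weighted sum. A minimal sketch, reusing a hypothetical joint probability table (the values are for illustration only):

```python
# Conditional probability function and conditional expectation for a
# hypothetical discrete joint distribution.
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
y_values = [0, 1]

def f_X(x):
    # marginal probability function of X
    return sum(f[(x, y)] for y in y_values)

def f_cond(y, x):
    # f_{Y|X}(y | X = x) = f(x, y) / f_X(x), defined when f_X(x) > 0
    return f[(x, y)] / f_X(x)

def E_Y_given_X(x):
    # E[Y | X = x] = sum over y of y * f_{Y|X}(y | X = x)
    return sum(y * f_cond(y, x) for y in y_values)
```

For this table, E[Y | X = 0] = 0.2/0.3 = 2/3, and the conditional probabilities given X = x sum to 1 for each x.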
Covariance of X and Y: Cov[X, Y] = E[(X − μ_X)(Y − μ_Y)] = E[X·Y] − E[X]·E[Y], where μ_X and μ_Y are the means of X and Y respectively.
Moment generating function of a joint distribution: Given jointly distributed random variables X and Y, the moment generating function of the joint distribution is M_{X,Y}(t₁, t₂) = E[e^{t₁X + t₂Y}]. This definition can be extended to the joint distribution of any number of random variables.
For example, the toss of a fair die results in one of k = 6 outcomes, with probabilities p_i = 1/6 for i = 1, 2, …, 6. If the die is tossed n times, then the numbers of tosses resulting in each of the six faces have a multinomial distribution with parameters n and p₁ = p₂ = ⋯ = p₆ = 1/6.
∂/∂t₁ M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = E[X],  ∂/∂t₂ M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = E[Y],  and
∂²/(∂t₁∂t₂) M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = E[X·Y].
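The derivative identities for the joint moment generating function can be checked numerically: build M(t₁, t₂) = Σ f(x,y)·e^{t₁x + t₂y} for a small discrete distribution (a hypothetical table, for illustration) and compare a finite-difference estimate of the mixed partial at (0, 0) with the directly computed E[X·Y]:

```python
import math

# Numeric check of the m.g.f. identities for a hypothetical
# discrete joint distribution.
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def M(t1, t2):
    # M_{X,Y}(t1, t2) = E[exp(t1*X + t2*Y)]
    return sum(p * math.exp(t1 * x + t2 * y) for (x, y), p in f.items())

E_XY = sum(p * x * y for (x, y), p in f.items())   # direct computation

# central-difference estimate of the mixed partial of M at (0, 0)
h = 1e-4
mixed = (M(h, h) - M(h, -h) - M(-h, h) + M(-h, -h)) / (4 * h * h)
```

The finite-difference estimate agrees with E[X·Y] to within the discretization error, and M(0, 0) = 1 since the probabilities sum to 1.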
(xv) A random variable X can be defined as a combination of two (or more) random variables X₁ and X₂, defined in terms of whether or not a particular event A occurs:
X = X₁ if A occurs (prob. p), and X = X₂ if A doesn't occur (prob. 1 − p).
Example 116: If f(x, y) = K(x + y) is the density function for the joint distribution of the continuous random variables X and Y defined over the unit square bounded by the points (0,0), (1,0), (1,1) and (0,1), find K.
Solution: The (double) integral of the density function over the region of density must be 1, so that
1 = ∫₀¹ ∫₀¹ K(x + y) dy dx = K · 1 ⇒ K = 1. □
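The normalizing-constant computation in Example 116 can be confirmed with a midpoint Riemann sum over the unit square, which should return 1 when the density x + y is used (i.e. K = 1):

```python
# Numeric check of Example 116: f(x, y) = x + y should integrate
# to 1 over the unit square, so K = 1.
n = 200
dx = 1.0 / n
total = 0.0
for i in range(n):
    for j in range(n):
        x = (i + 0.5) * dx          # midpoint of cell in x
        y = (j + 0.5) * dx          # midpoint of cell in y
        total += (x + y) * dx * dx
```

The midpoint rule is exact for a linear integrand on each cell, so the sum equals 1 up to rounding.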
Example 117: The cumulative distribution function for the joint distribution of the continuous random variables X and Y is F(x, y) = (1/2)(x³y + x²y²), for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1. Find f(x, y).
Solution: f(x, y) = ∂²/∂x∂y F(x, y) = (3/2)x² + 2xy. □
Example 118: X and Y are discrete random variables which are jointly distributed with the following probability function f(x, y), where each of X and Y takes the values −1, 0 and 1 (the 3 × 3 table of joint probabilities is omitted). Find E[X·Y].
Solution: E[X·Y] = Σ_{x,y} x·y·f(x, y); the sum has one term for each of the nine (x, y) pairs in the table, and the terms with x = 0 or y = 0 vanish. □
Example 119: Continuous random variables X and Y have a joint distribution with density function f(x, y) = 2 − x − y in the region bounded by y = 0, x = 0 and y = 2 − 2x. Find the density function for the marginal distribution of X for 0 < x < 1.
Solution: The region of joint density is the triangle with vertices (0,0), (1,0) and (0,2). Note that X must be in the interval (0,1) and Y must be in the interval (0,2). Since f_X(x) = ∫_{−∞}^{∞} f(x, y) dy, we note that given a value of x in (0,1), the possible values of y (with non-zero density for f(x, y)) must satisfy 0 < y < 2 − 2x, so that
f_X(x) = ∫₀^{2−2x} (2 − x − y) dy = (2 − x)(2 − 2x) − (2 − 2x)²/2 = 2 − 2x. □
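The marginal-density integral in Example 119 can be checked numerically. Assuming the joint density f(x, y) = 2 − x − y on 0 < y < 2 − 2x as in the statement, the marginal of X should come out to 2 − 2x:

```python
# Numeric check of the marginal density in Example 119:
# f(x, y) = 2 - x - y for 0 < y < 2 - 2x  =>  f_X(x) = 2 - 2x.
def marginal_X(x, n=4000):
    upper = 2 - 2 * x
    dy = upper / n
    # midpoint Riemann sum of f(x, y) dy over (0, 2 - 2x)
    return sum((2 - x - (j + 0.5) * dy) * dy for j in range(n))
```

For instance, marginal_X(0.25) should be 1.5 and marginal_X(0.5) should be 1.0.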
Example 120: Suppose that X and Y are independent continuous random variables with the following density functions: f_X(x) = 1 for 0 < x < 1, and f_Y(y) = 2y for 0 < y < 1. Find P[Y < X].
Solution: Since X and Y are independent, the density function of the joint distribution of X and Y is f(x, y) = f_X(x) · f_Y(y) = 2y, and is defined on the unit square. The region for the probability in question is the part of the unit square below the line y = x.
P[Y < X] = ∫₀¹ ∫₀ˣ 2y dy dx = ∫₀¹ x² dx = 1/3. □
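As a check of Example 120, integrate the joint density 2y over the region y < x of the unit square; the inner integral ∫₀ˣ 2y dy = x² is used in closed form and the outer integral is approximated by a midpoint sum:

```python
# Numeric check of Example 120: with f(x, y) = 2y on the unit square,
# P[Y < X] = ∫₀¹ ∫₀ˣ 2y dy dx should equal 1/3.
n = 500
dx = 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    # inner integral ∫₀ˣ 2y dy = x², in closed form
    prob += (x ** 2) * dx
```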
Example 121: Continuous random variables X and Y have a joint distribution with density function f(x, y) = x + (3/2)y² for 0 < x < 1 and 0 < y < 1. Find P[X < 1/2 | Y < 1/2].
Solution: P[X < 1/2 | Y < 1/2] = P[(X < 1/2) ∩ (Y < 1/2)] / P[Y < 1/2].
P[(X < 1/2) ∩ (Y < 1/2)] = ∫₀^{1/2} ∫₀^{1/2} (x + (3/2)y²) dy dx = ∫₀^{1/2} (x/2 + 1/16) dx = 3/32.
P[Y < 1/2] = ∫₀^{1/2} f_Y(y) dy, where f_Y(y) = ∫₀¹ f(x, y) dx = 1/2 + (3/2)y², so that
P[Y < 1/2] = ∫₀^{1/2} (1/2 + (3/2)y²) dy = 5/16
⇒ P[X < 1/2 | Y < 1/2] = (3/32)/(5/16) = 3/10. □
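The conditional-probability computation in Example 121 can be verified numerically. Assuming the density f(x, y) = x + (3/2)y² on the unit square as stated, the ratio of the two integrals should be (3/32)/(5/16) = 3/10:

```python
# Numeric check of Example 121 under the density f(x, y) = x + 1.5*y²
# on the unit square: P[X < 1/2 | Y < 1/2] should be 3/10.
def f(x, y):
    return x + 1.5 * y * y

def box_prob(x_hi, y_hi, n=400):
    # midpoint Riemann sum of f over (0, x_hi) × (0, y_hi)
    dx, dy = x_hi / n, y_hi / n
    return sum(
        f((i + 0.5) * dx, (j + 0.5) * dy) * dx * dy
        for i in range(n) for j in range(n)
    )

joint = box_prob(0.5, 0.5)      # P[(X < 1/2) ∩ (Y < 1/2)] ≈ 3/32
margin = box_prob(1.0, 0.5)     # P[Y < 1/2] ≈ 5/16
cond = joint / margin           # conditional probability ≈ 3/10
```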
Example 122: Continuous random variables X and Y have a joint distribution with density function f(x, y) = 2y·e^{−x} for 0 < x < ∞ and 0 < y < 1. Find P[X < 1 | Y = 1/2].
Solution: P[X < 1 | Y = 1/2] = ∫₀¹ f_{X|Y}(x | Y = 1/2) dx, where f_{X|Y}(x | Y = 1/2) = f(x, 1/2)/f_Y(1/2).
∫₀¹ f(x, 1/2) dx = ∫₀¹ e^{−x} dx = 1 − e^{−1}, and
f_Y(1/2) = ∫₀^∞ f(x, 1/2) dx = ∫₀^∞ e^{−x} dx = 1
⇒ P[X < 1 | Y = 1/2] = 1 − e^{−1}. □
Example 123: Continuous random variables X and Y have a joint distribution with density function f(x, y) = x + y for 0 < x < 1 and 0 < y < 1. Find the density function of the conditional distribution of X given Y = y.
Solution: f_Y(y) = ∫₀¹ (x + y) dx = y + 1/2. Then, f_{X|Y}(x | Y = y) = f(x, y)/f_Y(y) = (x + y)/(y + 1/2). □
Example 124: Find Cov[X, Y] for the jointly distributed discrete random variables in Example 118 above.
Solution: Cov[X, Y] = E[X·Y] − E[X]·E[Y]. E[X·Y] was found in Example 118. The marginal probability function for X is P[X = x] = Σ_y f(x, y) for each of x = −1, 0, 1, and the mean of X is E[X] = (1)·P[X = 1] + (0)·P[X = 0] + (−1)·P[X = −1]. In a similar way, the probability function of Y is found by summing f(x, y) over x for each of y = −1, 0, 1, with mean E[Y] = (1)·P[Y = 1] + (0)·P[Y = 0] + (−1)·P[Y = −1]. Then, Cov[X, Y] = E[X·Y] − E[X]·E[Y]. □
Example 126: Suppose that X and Y are random variables whose joint distribution has moment generating function M_{X,Y}(t₁, t₂) = (p₁e^{t₁} + p₂e^{t₂} + p₃)ⁿ, for all real t₁ and t₂ (where p₁ + p₂ + p₃ = 1). Find the covariance between X and Y.
Solution: Cov[X, Y] = E[X·Y] − E[X]·E[Y].
E[X·Y] = ∂²/(∂t₁∂t₂) M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = n(n − 1)p₁p₂,
E[X] = ∂/∂t₁ M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = np₁, and E[Y] = ∂/∂t₂ M_{X,Y}(t₁, t₂) |_{t₁=t₂=0} = np₂
⇒ Cov[X, Y] = n(n − 1)p₁p₂ − n²p₁p₂ = −np₁p₂. □
Example 127: Suppose that X is uniformly distributed on (0, 1), and that given X = x, Y is uniformly distributed on (0, x). Verify that Var[Y] = E[Var[Y | X]] + Var[E[Y | X]].
Solution: The density of the marginal distribution of Y is f_Y(y) = ∫_y¹ (1/x) dx = −ln y for 0 < y < 1 (and 0 elsewhere). Then E[Y] = ∫₀¹ y·(−ln y) dy = 1/4, E[Y²] = ∫₀¹ y²·(−ln y) dy = 1/9,
and Var[Y] = E[Y²] − (E[Y])² = 1/9 − 1/16 = 7/144.
Since Y | X = x is uniform on (0, x), E[Y | X = x] = x/2, so that
E[E[Y | X]] = E[X/2] = ∫₀¹ (x/2)·1 dx = 1/4 = E[Y].
In a similar way, Var[Y | X = x] = E[Y² | X = x] − (E[Y | X = x])²,
where E[Y² | X = x] = ∫₀ˣ y²·(1/x) dy = x²/3, so that E[Y² | X] = X²/3, and since
E[Y | X] = X/2, we have Var[Y | X] = X²/3 − X²/4 = X²/12.
Then, E[Var[Y | X]] = E[X²/12] = ∫₀¹ (x²/12)·1 dx = 1/36, and
Var[E[Y | X]] = Var[X/2] = (1/4)·Var[X] = (1/4)·(E[X²] − (E[X])²) = (1/4)·(1/3 − 1/4) = 1/48,
so that E[Var[Y | X]] + Var[E[Y | X]] = 1/36 + 1/48 = 7/144 = Var[Y]. □
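The direct side of Example 127's check can be confirmed numerically. Assuming the marginal density f_Y(y) = −ln y on (0, 1) derived above, a midpoint sum reproduces E[Y] = 1/4 and Var[Y] = 7/144 = 1/36 + 1/48:

```python
import math

# Numeric check of Example 127: with f_Y(y) = -ln(y) on (0, 1),
# E[Y] = 1/4, E[Y²] = 1/9, and Var[Y] = 7/144.
n = 200000
dy = 1.0 / n
E_Y = 0.0
E_Y2 = 0.0
for j in range(n):
    y = (j + 0.5) * dy
    w = -math.log(y) * dy      # f_Y(y) dy
    E_Y += y * w
    E_Y2 += y * y * w
var_Y = E_Y2 - E_Y ** 2
```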
Distribution of a function of a continuous random variable X: Suppose that u(x) is a one-to-one function of x (either strictly increasing, or strictly decreasing, such as u(x) = −x). As a one-to-one function, u has an inverse function v, so that v(u(x)) = x. Then the random variable Y = u(X) (Y is referred to as a transformation of X) has p.d.f. f_Y(y) found as follows: f_Y(y) = f_X(v(y)) · |v′(y)|. If u is a strictly increasing function, then
F_Y(y) = P[Y ≤ y] = P[u(X) ≤ y] = P[X ≤ v(y)] = F_X(v(y)).
Distribution of a function of a discrete random variable X: Suppose that X is a discrete random variable with probability function f(x). If u(x) is a function of x, and Y is a random variable defined by the equation Y = u(X), then Y is a discrete random variable with probability function g(y) = Σ_{y=u(x)} f(x) - given a value of y, find all values of x for which y = u(x) (say u(x₁) = u(x₂) = ⋯ = u(x_t) = y), and then g(y) is the sum of those f(x_i) probabilities.
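The "collect the probabilities of all x mapping to y" recipe can be sketched directly. The distribution below is hypothetical (X uniform on {−2, −1, 0, 1, 2} with the non-one-to-one transformation u(x) = x²):

```python
# Probability function of Y = u(X) for a discrete X: sum f(x) over all x
# with u(x) = y.  Hypothetical example: X uniform on {-2,-1,0,1,2}, u(x)=x².
f_X = {x: 0.2 for x in (-2, -1, 0, 1, 2)}

def u(x):
    return x * x

f_Y = {}
for x, p in f_X.items():
    y = u(x)
    f_Y[y] = f_Y.get(y, 0.0) + p   # accumulate probabilities mapping to y
```

Here g(4) = f(−2) + f(2) = 0.4, g(1) = 0.4 and g(0) = 0.2.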
If X and Y are independent random variables, and u and v are functions, then the random variables u(X) and v(Y) are independent.
(i) If X₁ and X₂ are discrete integer-valued random variables with joint probability function f(x₁, x₂), then the probability function of Y = X₁ + X₂ is f_Y(y) = Σ_x f(x, y − x).
(ii) If X₁ and X₂ are also independent, with probability functions f₁ and f₂, then f_Y(y) = Σ_x f₁(x) · f₂(y − x).
(iii) If X₁ and X₂ are continuous random variables with joint density function f(x₁, x₂), then the density function of Y = X₁ + X₂ is f_Y(y) = ∫_{−∞}^{∞} f(x, y − x) dx.
(iv) If X₁, X₂, …, X_n are random variables and Y = X₁ + X₂ + ⋯ + X_n, then E[Y] = Σ_{i=1}^{n} E[X_i] and Var[Y] = Σ_{i=1}^{n} Var[X_i] + 2·Σ_{i<j} Cov[X_i, X_j]; if the X_i are mutually independent, then Var[Y] = Σ_{i=1}^{n} Var[X_i].
(v) Cov[aX + b, cY + d] = ac·Cov[X, Y].
(vi) The Central Limit Theorem: Suppose that X is a random variable with mean μ and standard deviation σ, and suppose that X₁, X₂, …, X_n are independent random variables with the same distribution as X. Let Y_n = X₁ + X₂ + ⋯ + X_n. Then E[Y_n] = nμ and Var[Y_n] = nσ², and as n increases, the distribution of Y_n approaches a normal distribution N(nμ, nσ²). This is a justification for using the normal distribution as an approximation to the distribution of a sum of random variables.
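The mean and variance claims E[Y_n] = nμ and Var[Y_n] = nσ² can be verified exactly for a small case by enumerating the sum of n = 3 independent fair dice (μ = 3.5, σ² = 35/12 per die):

```python
from itertools import product

# Exact distribution of the sum of n = 3 fair six-sided dice, by
# exhaustive enumeration; check E[Y] = n·3.5 and Var[Y] = n·35/12.
n = 3
faces = range(1, 7)
dist = {}
for outcome in product(faces, repeat=n):
    s = sum(outcome)
    dist[s] = dist.get(s, 0) + (1 / 6) ** n

mean = sum(s * p for s, p in dist.items())
var = sum(s * s * p for s, p in dist.items()) - mean ** 2
```

The enumeration gives mean 3·3.5 = 10.5 and variance 3·35/12 = 8.75.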
(vii) Sums of certain distributions: Suppose that X₁, X₂, …, X_k are independent random variables, each with the distribution in the left column below. Then Y = X₁ + X₂ + ⋯ + X_k has the distribution in the right column.

distribution of X_i → distribution of Y
binomial B(n_i, p) → binomial B(n₁ + ⋯ + n_k, p)
geometric with parameter p → negative binomial with parameters k and p
negative binomial with parameters r_i and p → negative binomial with parameters r₁ + ⋯ + r_k and p
Poisson with parameter λ_i → Poisson with parameter λ₁ + ⋯ + λ_k
normal N(μ_i, σ_i²) → normal N(μ₁ + ⋯ + μ_k, σ₁² + ⋯ + σ_k²)
Example 128: The random variable X has an exponential distribution with a mean of 1. The random variable Y is defined to be Y = √X. Find f_Y(y), the p.d.f. of Y.
Solution: F_Y(y) = P[Y ≤ y] = P[√X ≤ y] = P[X ≤ y²] = 1 − e^{−y²}
⇒ f_Y(y) = F′_Y(y) = d/dy (1 − e^{−y²}) = 2y·e^{−y²}. □
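The transformation result of Example 128 can be checked by differentiating F_Y numerically: a central difference of F_Y(y) = 1 − e^{−y²} at a sample point should match the closed form 2y·e^{−y²}:

```python
import math

# Check of Example 128: Y = √X with X exponential (mean 1).
# F_Y(y) = 1 - exp(-y²), so f_Y(y) should equal 2y·exp(-y²).
def F_Y(y):
    return 1 - math.exp(-y * y)

def f_Y(y):
    return 2 * y * math.exp(-y * y)

h = 1e-6
y0 = 1.3
numeric = (F_Y(y0 + h) - F_Y(y0 - h)) / (2 * h)   # central difference F_Y'(y0)
```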
Example 129: Suppose that X and Y are independent discrete integer-valued random variables, with X uniformly distributed on the integers 1, 2, …, n, and Y having a probability function concentrated on the values 0, 1 and 2. Let Z = X + Y. Find P[Z = k].
Solution: Using the fact that f_X(x) = 1/n for x = 1, 2, …, n, and the convolution method for independent discrete random variables, we have
f_Z(k) = Σ_x f_X(x) · f_Y(k − x) = (1/n)·[f_Y(k − 1) + f_Y(k − 2) + ⋯ + f_Y(k − n)],
where only the terms with k − x equal to 0, 1 or 2 are non-zero. □
Example 130: X₁ and X₂ are independent exponential random variables each with a mean of 1. Find P[X₁ + X₂ < 1].
Solution: Using the convolution method, the density function of Y = X₁ + X₂ is
f_Y(y) = ∫₀^y f_{X₁}(t) · f_{X₂}(y − t) dt = ∫₀^y e^{−t} · e^{−(y−t)} dt = y·e^{−y}, so that
P[X₁ + X₂ < 1] = P[Y < 1] = ∫₀¹ y·e^{−y} dy = (−y·e^{−y} − e^{−y}) |_{y=0}^{y=1} = 1 − 2e^{−1}. □
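Both steps of Example 130 can be verified numerically: the convolution integral should return y·e^{−y} at any sample point, and integrating that density over (0, 1) should give 1 − 2e^{−1} ≈ 0.2642:

```python
import math

# Check of Example 130: for independent exponentials with mean 1,
# the convolution density is f_Y(y) = y·exp(-y), and P[Y < 1] = 1 - 2/e.
def f_Y(y, n=2000):
    # f_Y(y) = ∫₀^y exp(-t)·exp(-(y - t)) dt, midpoint Riemann sum
    dt = y / n
    return sum(
        math.exp(-t) * math.exp(-(y - t)) * dt
        for t in ((k + 0.5) * dt for k in range(n))
    )

point = f_Y(1.0)                 # should equal 1·exp(-1)

dy = 1.0 / 4000
prob = sum(((k + 0.5) * dy) * math.exp(-(k + 0.5) * dy)
           for k in range(4000)) * dy   # ∫₀¹ y·exp(-y) dy
```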
Example 131: Given independent random variables X₁, X₂, …, X_n each having the same variance σ², and defining U = X₁ + X₂ + ⋯ + X_{n−1} and V = X₂ + X₃ + ⋯ + X_n, find the coefficient of correlation between U and V.
Solution: Var[U] = σ² + σ² + ⋯ + σ² = (n − 1)σ² = Var[V].
Since the X_i's are independent, if i ≠ j then Cov[X_i, X_j] = 0. Then, noting that Cov[W, W] = Var[W], we have
Cov[U, V] = Cov[X₂, X₂] + Cov[X₃, X₃] + ⋯ + Cov[X_{n−1}, X_{n−1}] = (n − 2)σ², so that
ρ_{U,V} = Cov[U, V]/(σ_U · σ_V) = (n − 2)σ²/((n − 1)σ²) = (n − 2)/(n − 1). □
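The correlation result of Example 131 follows from bilinearity of covariance, which can be checked mechanically: represent U and V by their 0/1 coefficient vectors over X₁, …, X_n; for independent X_i with common variance σ², Cov[U, V] = σ²·(u · v) and Var[U] = σ²·(u · u):

```python
# Check of Example 131: with U = X₁ + ⋯ + X_{n-1}, V = X₂ + ⋯ + X_n,
# and independent X_i with common variance σ², corr(U, V) = (n-2)/(n-1).
n = 5
sigma2 = 2.0                     # any common variance works
u = [1 if 1 <= i <= n - 1 else 0 for i in range(1, n + 1)]
v = [1 if 2 <= i <= n else 0 for i in range(1, n + 1)]

cov_uv = sigma2 * sum(a * b for a, b in zip(u, v))   # shared terms: n - 2
var_u = sigma2 * sum(a * a for a in u)               # (n - 1)·σ²
var_v = sigma2 * sum(b * b for b in v)
rho = cov_uv / (var_u ** 0.5 * var_v ** 0.5)
```

For n = 5 this gives ρ = 3/4, matching (n − 2)/(n − 1).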
Example 132: Independent random variables X, Y and Z are identically distributed. Let W = X + Y. The moment generating function of W is M_W(t) = (0.7 + 0.3eᵗ)⁶. Find the moment generating function of V = X + Y + Z.
Solution: For independent random variables,
M_{X+Y}(t) = M_X(t) · M_Y(t) = (0.7 + 0.3eᵗ)⁶. Since X and Y have identical distributions, they have the same moment generating function. Thus,
M_X(t) = (0.7 + 0.3eᵗ)³, and then M_V(t) = M_X(t) · M_Y(t) · M_Z(t) = (0.7 + 0.3eᵗ)⁹.
Alternatively, note that the moment generating function of the binomial B(n, p) is (1 − p + p·eᵗ)ⁿ. Thus, X + Y has a B(6, 0.3) distribution, and each of X, Y and Z has a B(3, 0.3) distribution, so that the sum of these independent binomial random variables is B(9, 0.3), with m.g.f. (0.7 + 0.3eᵗ)⁹. □
Example 133: The birth weight of males is normally distributed with mean 6 pounds, 10 ounces, standard deviation 1 pound. For females, the mean weight is 7 pounds, 2 ounces with standard deviation 12 ounces. Given two independent male/female births, find the probability that the baby boy outweighs the baby girl.
Solution: Let random variables X and Y denote the boy's weight and girl's weight, respectively. Then, W = X − Y has a normal distribution with mean 6.625 − 7.125 = −0.5 lb. and variance σ_X² + σ_Y² = 1² + 0.75² = 1.5625 (so σ_W = 1.25).
Then, P[X > Y] = P[X − Y > 0] = P[W > 0] = P[(W − (−0.5))/1.25 > (0 − (−0.5))/1.25] = P[Z > 0.4],
where Z has a standard normal distribution (W was standardized). Referring to the standard normal table, this probability is 1 − Φ(0.4) = 0.3446. □
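The standardization in Example 133 can be reproduced with the standard normal CDF written in terms of the error function (6 lb 10 oz = 6.625 lb, 7 lb 2 oz = 7.125 lb, 12 oz = 0.75 lb):

```python
import math

# Check of Example 133: W = X - Y is normal with mean -0.5 lb and
# standard deviation 1.25 lb, so P[W > 0] = 1 - Φ(0.4).
def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu = 6.625 - 7.125                    # mean of W, in pounds
sd = (1.0 ** 2 + 0.75 ** 2) ** 0.5    # sqrt of combined variance = 1.25
z = (0 - mu) / sd                     # standardized value = 0.4
prob = 1 - phi(z)                     # ≈ 0.3446
```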
Example 134: If the number of typographical errors per page typed by a certain typist follows a Poisson distribution with a mean of λ, find the probability that the total number of errors in 10 randomly selected pages is 10.
Solution: The 10 randomly selected pages have independent distributions of errors per page. The sum of independent Poisson random variables has a Poisson distribution with parameter equal to the sum of the individual parameters. Thus, the total number of errors in the 10 randomly selected pages has a Poisson distribution with parameter 10λ. The probability of 10 errors in the 10 pages is e^{−10λ}·(10λ)¹⁰/10!. □