Suppose an event E can happen in h ways out of a total of n possible equally likely ways. Then the probability of occurrence of the event (called its success) is

p = Pr{E} = h/n

and the probability of non-occurrence (called its failure) is

q = Pr{not E} = (n - h)/n = 1 - h/n = 1 - p

Thus p + q = 1, or Pr{E} + Pr{not E} = 1. The event "not E" is sometimes denoted by ~E or E'.
Note that the probability of an event is a number between 0 and 1. If the event
cannot occur, its probability is 0. If it must occur, i.e. its occurrence is certain, its
probability is 1.
If p is the probability that an event will occur, the odds in favour of its happening are p : q = p : (1 - p), and the odds against it are q : p. For example, if p = 1/3 the odds against the event are

q : p = (2/3) : (1/3) = 2 : 1

i.e. 2 to 1.
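The relations p + q = 1 and the odds calculation can be checked with exact rational arithmetic. A minimal Python sketch; the die event (rolling a 5 or 6) is an illustrative example, not from the text:

```python
from fractions import Fraction

# Hypothetical event E: rolling a 5 or 6 with a fair die,
# so h = 2 successful ways out of n = 6 equally likely ways.
h, n = 2, 6
p = Fraction(h, n)        # Pr{E} = h/n = 1/3
q = 1 - p                 # Pr{not E} = 1 - p = 2/3
assert p + q == 1         # p + q = 1

# Odds against E are q : p, here (2/3) : (1/3) = 2 : 1.
odds_against = q / p
print(p, q, odds_against)
```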
CONDITIONAL PROBABILITY; INDEPENDENT AND DEPENDENT EVENTS

If E1 and E2 are two events, the probability that E2 occurs given that E1 has occurred is denoted by Pr{E2|E1}, or Pr{E2 given E1}, and is called the conditional probability of E2 given that E1 has occurred. If the occurrence or non-occurrence of E1 does not affect the probability of occurrence of E2, then Pr{E2|E1} = Pr{E2} and we say that E1 and E2 are independent events; otherwise they are dependent events.

If we denote by E1E2 the event that "both E1 and E2 occur", sometimes called a compound event, then

Pr{E1E2} = Pr{E1} Pr{E2|E1}     (1)

In particular,

Pr{E1E2} = Pr{E1} Pr{E2}     (2)

for independent events. For three events E1, E2, E3 we have

Pr{E1E2E3} = Pr{E1} Pr{E2|E1} Pr{E3|E1E2}     (3)

That is, the probability of occurrence of E1, E2, and E3 equals the probability of E1, times the probability of E2 given that E1 has occurred, times the probability of E3 given that both E1 and E2 have occurred. In general, if E1, E2, E3, ..., En are n independent events having respective probabilities p1, p2, p3, ..., pn, then the probability of occurrence of E1 and E2 and E3 and ... and En is

p1 p2 p3 ... pn     (4)
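Rules (1) and (4) can be illustrated numerically. A minimal Python sketch; the card and dice examples are illustrative choices, not taken from the text:

```python
from fractions import Fraction

# Rule (1): drawing two aces in succession from a 52-card deck
# without replacement (hypothetical example).
p_e1 = Fraction(4, 52)            # Pr{E1}: first card is an ace
p_e2_given_e1 = Fraction(3, 51)   # Pr{E2|E1}: second is an ace given the first was
p_both = p_e1 * p_e2_given_e1     # Pr{E1E2} = Pr{E1} Pr{E2|E1}

# Rule (4) for independent events: two fair dice both showing a six.
p_independent = Fraction(1, 6) * Fraction(1, 6)

print(p_both, p_independent)
```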
MUTUALLY EXCLUSIVE EVENTS

Two or more events are called mutually exclusive if the occurrence of any one of them excludes the occurrence of the others. Thus if E1 and E2 are mutually exclusive events, Pr{E1E2} = 0.

If E1 + E2 denotes the event that "either E1 or E2 or both occur", then

Pr{E1 + E2} = Pr{E1} + Pr{E2} - Pr{E1E2}     (5)

In particular, Pr{E1 + E2} = Pr{E1} + Pr{E2} for mutually exclusive events.

As an extension of this, if E1, E2, ..., En are n mutually exclusive events having respective probabilities p1, p2, ..., pn, then the probability of occurrence of E1 or E2 or ... or En is

p1 + p2 + ... + pn     (6)
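The addition rules (5) and (6) can be checked the same way. A minimal Python sketch; the deck and die events are illustrative assumptions:

```python
from fractions import Fraction

# Rule (5): drawing an ace or a heart from a 52-card deck
# (the events overlap in the ace of hearts).
p_ace = Fraction(4, 52)
p_heart = Fraction(13, 52)
p_ace_and_heart = Fraction(1, 52)
p_ace_or_heart = p_ace + p_heart - p_ace_and_heart   # Pr{E1+E2}

# Mutually exclusive case: a die shows 1 or 2 (cannot both occur).
p_one_or_two = Fraction(1, 6) + Fraction(1, 6)

print(p_ace_or_heart, p_one_or_two)
```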
PROBABILITY DISTRIBUTIONS

If a variable X can assume a discrete set of values X1, X2, ..., XK with respective probabilities p1, p2, ..., pK, where p1 + p2 + ... + pK = 1, we say that a discrete probability distribution for X has been defined. The function p(X), which has the respective values p1, p2, ..., pK for X = X1, X2, ..., XK, is called the probability function or frequency function of X. Because X can assume certain values with given probabilities, it is often called a discrete random variable. A random variable is also known as a chance variable or stochastic variable.
Note that this is analogous to a relative frequency distribution, with probabilities replacing relative frequencies. Thus we can think of probability distributions as theoretical or ideal limiting forms of relative frequency distributions when the number of observations is made very large. For this reason we can think of probability distributions as distributions for populations, whereas relative frequency distributions are distributions of samples drawn from these populations.
By cumulating probabilities we obtain cumulative probability distributions, which are analogous to cumulative relative frequency distributions. The function associated with this distribution is sometimes called a distribution function.
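A discrete distribution and its distribution function can be sketched directly. A minimal Python example, using the sum of two fair dice as a hypothetical random variable:

```python
from fractions import Fraction
from itertools import product

# Discrete distribution of X = sum of two fair dice (hypothetical example):
# each of the 36 outcomes has probability 1/36.
p = {}
for a, b in product(range(1, 7), repeat=2):
    p[a + b] = p.get(a + b, 0) + Fraction(1, 36)

assert sum(p.values()) == 1          # p1 + p2 + ... + pK = 1

# The associated distribution function F(x) = Pr{X <= x},
# obtained by cumulating probabilities.
def F(x):
    return sum(prob for value, prob in p.items() if value <= x)

print(p[7], F(4))
```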
These ideas can be extended to the case where the variable X may assume a continuous set of values. In this case the total area under the curve Y = p(X) and above the X axis is equal to 1, and the area under the curve between the lines X = a and X = b gives the probability that X lies between a and b, which can be denoted by Pr{a < X < b}. We call p(X) a probability density function, or briefly a density function, and when such a function is given we say that a continuous probability distribution for X has been defined. The variable X is then called a continuous random variable.
As in the discrete case, we can define cumulative probability distributions and the
associated distribution functions.
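In the continuous case Pr{a < X < b} is an area under the density curve. A minimal Python sketch using a hypothetical uniform density p(X) = 1/2 on 0 <= X <= 2, where the areas are simple rectangles:

```python
# Hypothetical continuous density: p(X) = 1/2 for 0 <= X <= 2, else 0.
# Pr{a < X < b} is the area under the curve between X = a and X = b.
def pr_between(a, b, lo=0.0, hi=2.0, height=0.5):
    a, b = max(a, lo), min(b, hi)     # clip to the support of the density
    return max(b - a, 0.0) * height   # rectangular area

assert pr_between(0, 2) == 1.0        # total area under the curve is 1
print(pr_between(0.5, 1.5))
```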
MATHEMATICAL EXPECTATION
If p is the probability that a person will receive a sum of money S, the mathematical expectation (or simply the expectation) is defined as pS.

If X denotes a discrete random variable which can assume the values X1, X2, ..., XK with respective probabilities p1, p2, ..., pK, where p1 + p2 + ... + pK = 1, the mathematical expectation of X (or simply the expectation of X), denoted by E(X), is defined as

E(X) = p1 X1 + p2 X2 + ... + pK XK = sum from j=1 to K of pj Xj = sum of pX     (7)
If the probabilities pj are replaced with the relative frequencies fj/N, where N = sum of fj, the expectation reduces to (sum of fX)/N, which is the arithmetic mean of a sample of size N in which X1, X2, ..., XK appear with these relative frequencies. As N gets larger and larger, the relative frequencies fj/N approach the probabilities pj. Thus we are led to the interpretation that E(X) represents the mean of the population from which the sample is drawn. If we call m the sample mean, we can denote the population mean by the corresponding Greek letter μ (mu).
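Equation (7) and its relative-frequency counterpart can be computed side by side. A minimal Python sketch; the die distribution and the observed counts are hypothetical:

```python
from fractions import Fraction

# Equation (7): E(X) = sum of pj*Xj, for X = value shown by a fair die.
values = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6
E_X = sum(p * x for p, x in zip(probs, values))   # = 7/2

# Replacing pj with relative frequencies fj/N gives the sample mean.
freqs = [4, 5, 2, 4, 1, 4]                        # hypothetical observed counts
N = sum(freqs)
sample_mean = sum(Fraction(f, N) * x for f, x in zip(freqs, values))

print(E_X, sample_mean)
```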
Expectation can also be defined for continuous random variables, but the definition
requires use of the calculus.
If we select a sample of size N at random from a population (i.e. we assume that all such samples are equally probable), then it is possible to show that the expected value of the sample mean m is the population mean μ.
It does not follow, however, that the expected value of any quantity computed from a sample is the corresponding population quantity. For example, the expected value of the sample variance as we have defined it is not the population variance, but (N - 1)/N times this variance. This is why some statisticians choose to define the sample variance as our variance multiplied by N/(N - 1).
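The (N - 1)/N factor can be verified by brute force: enumerate every equally likely sample and average the sample variances. A minimal Python sketch with a tiny hypothetical population and samples of size N = 2 drawn with replacement:

```python
from fractions import Fraction
from itertools import product

# Hypothetical population and its variance.
population = [1, 2, 3]
mu = Fraction(sum(population), len(population))
sigma2 = sum((x - mu) ** 2 for x in population) / Fraction(len(population))

# All equally likely samples of size N = 2, drawn with replacement.
N = 2
samples = list(product(population, repeat=N))

def var(sample):                      # sample variance with divisor N
    m = Fraction(sum(sample), N)
    return sum((x - m) ** 2 for x in sample) / Fraction(N)

# Expected sample variance = (N-1)/N times the population variance.
expected_var = sum(var(s) for s in samples) / Fraction(len(samples))
assert expected_var == Fraction(N - 1, N) * sigma2
```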
COMBINATORIAL ANALYSIS
In obtaining probabilities of complex events, an enumeration of cases is often difficult, tedious, or both. To facilitate the labour involved, use is made of basic principles studied in a subject called combinatorial analysis.
FUNDAMENTAL PRINCIPLE
If an event can happen in any one of n1 ways, and if when this has occurred another event can happen in any one of n2 ways, then the number of ways in which both events can happen in the specified order is n1 n2.
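The principle can be checked by listing the outcomes explicitly. A minimal Python sketch with a hypothetical n1 = 3 and n2 = 5:

```python
from itertools import product

# Fundamental principle: if the first event can happen in 3 ways and the
# second in 5 ways, both can happen in 3 * 5 = 15 ways.
first, second = range(3), range(5)
pairs = list(product(first, second))   # every ordered pair of choices
assert len(pairs) == 3 * 5
```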
FACTORIAL
Factorial n, denoted by n!, is defined as

n! = n(n - 1)(n - 2) ... 1     (8)

Thus 5! = 5 · 4 · 3 · 2 · 1 = 120. It is convenient to define 0! = 1.
PERMUTATIONS
A permutation of n objects taken r at a time is an arrangement of r out of the n objects with attention given to the order of arrangement. The number of permutations of n objects taken r at a time is denoted by nPr or P(n, r) and is given by

nPr = n(n - 1)(n - 2) ... (n - r + 1) = n!/(n - r)!     (9)

The number of permutations of n objects of which n1 are alike, n2 are alike, ..., is

n!/(n1! n2! ...)     (10)

where n = n1 + n2 + ...
COMBINATIONS

A combination of n objects taken r at a time is a selection of r out of the n objects with no attention given to the order of arrangement. The number of combinations of n objects taken r at a time is denoted by nCr, C(n, r), or (n r) and is given by

nCr = n(n - 1) ... (n - r + 1) / r! = n!/(r!(n - r)!) = nPr/r!     (11)

We have nCr = nC(n-r). Thus

20C17 = 20C3 = (20 · 19 · 18)/3! = 1140

The number of combinations of n objects taken 1, 2, ..., n at a time is

nC1 + nC2 + ... + nCn = 2^n - 1
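Equations (9) and (11) and the worked example can be verified with the standard library (Python 3.8+ provides perm and comb directly):

```python
from math import comb, factorial, perm

n, r = 20, 3
assert perm(n, r) == factorial(n) // factorial(n - r)                 # (9)
assert comb(n, r) == factorial(n) // (factorial(r) * factorial(n - r))  # (11)

# nCr = nC(n-r): the worked example 20C17 = 20C3 = 1140.
assert comb(20, 17) == comb(20, 3) == 1140

# nC1 + nC2 + ... + nCn = 2^n - 1
assert sum(comb(n, k) for k in range(1, n + 1)) == 2 ** n - 1
```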
STIRLING'S APPROXIMATION TO n!

When n is large, a direct evaluation of n! is impractical. In such cases use is made of the approximate formula

n! ≈ sqrt(2πn) n^n e^(-n)     (12)

where e = 2.71828... is the base of natural logarithms.
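The quality of approximation (12) improves as n grows, which a short Python comparison makes concrete:

```python
from math import e, factorial, pi, sqrt

# Stirling's approximation (12): n! is approximately sqrt(2*pi*n) * n**n * e**(-n).
def stirling(n):
    return sqrt(2 * pi * n) * n ** n * e ** (-n)

# The ratio stirling(n)/n! approaches 1 as n increases.
for n in (5, 10, 20):
    print(n, stirling(n) / factorial(n))
```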
RELATION OF PROBABILITY TO POINT SET THEORY

As shown in the figure, the sample space S is represented by points inside a rectangle, and the events E1 and E2 by points inside circles. The set E1 + E2 consists of the points belonging to either E1 or E2 or both, while the set E1E2 consists of the points belonging to both E1 and E2. The probability of E1 + E2, denoted by Pr{E1 + E2}, is the sum of the probabilities associated with all points contained in the set E1 + E2; similarly, Pr{E1E2} is the sum of the probabilities associated with all points contained in the set E1E2. If E1 and E2 have no points in common, the events are mutually exclusive and Pr{E1 + E2} = Pr{E1} + Pr{E2}.

The set E1 + E2 is sometimes denoted by E1 ∪ E2 and is called the union of the two sets. The set E1E2 is sometimes denoted by E1 ∩ E2 and is called the intersection of the two sets. Extensions to more than two sets can be made: thus instead of E1 + E2 + E3 and E1E2E3 we could use the notations E1 ∪ E2 ∪ E3 and E1 ∩ E2 ∩ E3 respectively.
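The correspondence between events and point sets maps directly onto Python's set operations. A minimal sketch with a hypothetical sample space of one die roll, where every point is equally likely:

```python
from fractions import Fraction

# Sample space and two events as sets of sample points (hypothetical example).
S = {1, 2, 3, 4, 5, 6}
E1 = {2, 4, 6}                 # an even number
E2 = {4, 5, 6}                 # a number greater than 3

union = E1 | E2                # E1 + E2, i.e. E1 union E2
intersection = E1 & E2         # E1E2, i.e. E1 intersect E2

# With equally likely points, Pr{E} = (number of points in E)/(points in S).
def pr(E):
    return Fraction(len(E), len(S))

# The addition rule (5) in set form.
assert pr(union) == pr(E1) + pr(E2) - pr(intersection)
```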
A set containing no points is called the null set or empty set, and its probability is Pr{∅} = 0. If E1 and E2 have no points in common, we write E1E2 = ∅, which means that the corresponding events are mutually exclusive and Pr{E1E2} = 0.
With this modern approach, a random variable is a function defined at each point of the sample space. For example, in Problem 6.37 the random variable is the sum of the coordinates of each point.
In the case where S