1 Probability Space
We start by introducing the mathematical concept of a probability space.
• Union: For any two events E and F, the event E ∪ F consists of all outcomes that are either in E or in F, meaning that E ∪ F is realized if either E or F occurs
• Countable union: If (Ei)i≥1 is a countable sequence of events, the union of these events, denoted ⋃_{i=1}^∞ Ei, is defined to be the event consisting of all outcomes that are in Ei for at least one value of i ∈ N, i ≥ 1
• Intersection: For any two events E and F, the event E ∩ F consists of all outcomes that are both in E and in F, meaning that E ∩ F is realized if both E and F occur
• Associativity: E ∪ (F ∪ G) = (E ∪ F) ∪ G and E ∩ (F ∩ G) = (E ∩ F) ∩ G
• Distributivity: E ∪ (F ∩ G) = (E ∪ F) ∩ (E ∪ G) and E ∩ (F ∪ G) = (E ∩ F) ∪ (E ∩ G)
• Transitivity: If E ⊆ F and F ⊆ G, then E ⊆ G
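These set identities can be checked directly with Python's built-in `set` type; the events E, F, G below are arbitrary illustrative choices, not taken from the text:

```python
# Events as subsets of a sample space; E, F, G are arbitrary examples.
E = {1, 2, 3}
F = {3, 4}
G = {4, 5}

# Union and intersection
assert E | F == {1, 2, 3, 4}
assert E & F == {3}

# Associativity
assert E | (F | G) == (E | F) | G
assert E & (F & G) == (E & F) & G

# Distributivity
assert E | (F & G) == (E | F) & (E | G)
assert E & (F | G) == (E & F) | (E & G)

# Transitivity of inclusion: {1} ⊆ E and E ⊆ E ∪ F imply {1} ⊆ E ∪ F
assert {1} <= E and E <= E | F and {1} <= E | F
```

Python's `|`, `&`, and `<=` operators on sets correspond exactly to ∪, ∩, and ⊆.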
Here, the event space is modelled as a σ-algebra on Ω.
Definition 1.3 (σ-algebra). A σ-algebra A ⊆ P(Ω) is a family of subsets of Ω satisfying the following properties:
(1) ∅ ∈ A
(2) if E ∈ A, then E^c ∈ A (closure under complement)
(3) if (Ei)i≥1 is a countable sequence of sets in A, then ⋃_{i=1}^∞ Ei ∈ A (closure under countable union)
Property 1.1. For any probability measure P,

P(∅) = 0    (1)
Proof. Consider the countable sequence of events (Ei)i≥1 defined by
• E1 = Ω
• Ei = ∅ for all i > 1
Since (Ei)i≥1 is a family of mutually exclusive events (Ω ∩ ∅ = ∅ and ∅ ∩ ∅ = ∅), we have
1 = P(Ω) = P(⋃_{i=1}^∞ Ei) = ∑_{i=1}^∞ P(Ei) = P(Ω) + ∑_{i=2}^∞ P(∅)
Thus, ∑_{i=2}^∞ P(∅) = 0. Axiom (1) states that P(∅) ≥ 0. If P(∅) > 0, then ∑_{i=2}^∞ P(∅) = ∞. Therefore P(∅) = 0.
Property 1.2 (Probability of a finite union of mutually exclusive events). For any finite sequence of events E1, . . . , En ∈ A that are mutually exclusive (i.e. Ei ∩ Ej = ∅ if i ≠ j),

P(⋃_{i=1}^n Ei) = ∑_{i=1}^n P(Ei)    (2)
Proof. Let us consider (Ei)i≥1, a countably infinite sequence of mutually exclusive events such that Ei = ∅ for all i > n. We have that

P(⋃_{i=1}^n Ei) = P(⋃_{i=1}^∞ Ei) = ∑_{i=1}^∞ P(Ei) = ∑_{i=1}^n P(Ei) + ∑_{i=n+1}^∞ P(∅) = ∑_{i=1}^n P(Ei)
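Finite additivity (2) can be sanity-checked numerically. The sketch below assumes a uniform measure on a six-element sample space (a fair die); the events chosen are arbitrary:

```python
from fractions import Fraction

Omega = {1, 2, 3, 4, 5, 6}                     # sample space of a fair die
P = lambda E: Fraction(len(E), len(Omega))     # uniform probability measure

# Three mutually exclusive events (arbitrary illustrative choices)
E1, E2, E3 = {1}, {2, 3}, {5}
assert E1 & E2 == set() and E1 & E3 == set() and E2 & E3 == set()

# Finite additivity: P(E1 ∪ E2 ∪ E3) = P(E1) + P(E2) + P(E3)
assert P(E1 | E2 | E3) == P(E1) + P(E2) + P(E3) == Fraction(4, 6)
```

Using `Fraction` keeps the probabilities exact, so equality can be tested without floating-point tolerance.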
For any event E ∈ A,

P(E) ≤ 1    (4)
Proof.
F = F ∩ Ω = F ∩ (⋃_{i=1}^∞ Ei) = ⋃_{i=1}^∞ (F ∩ Ei)
P(E^c) = 1 − P(E)
Property 1.6 (Probability of the union of 2 arbitrary events). For any two events E, F ∈ A,

P(E ∪ F) = P(E) + P(F) − P(E ∩ F)
Proof. Using a Venn diagram representation to build intuition, we can write E ∪ F as the union of the mutually exclusive events F and E ∩ F^c. Therefore, P(E ∪ F) = P(F) + P(E ∩ F^c). By the law of total probability, P(E) = P(E ∩ F) + P(E ∩ F^c). Hence, the result follows.
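Because the sample space is arbitrary here, the identity can be verified exhaustively over every pair of events on a small Ω. The sketch below assumes a uniform measure purely for illustration:

```python
from fractions import Fraction
from itertools import chain, combinations

Omega = {1, 2, 3, 4}
P = lambda E: Fraction(len(E), len(Omega))    # uniform measure, for illustration

def events(omega):
    """All subsets of omega, i.e. the sigma-algebra P(omega)."""
    s = list(omega)
    return [set(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

# Check P(E ∪ F) = P(E) + P(F) − P(E ∩ F) for all 16 × 16 pairs of events
for E in events(Omega):
    for F in events(Omega):
        assert P(E | F) == P(E) + P(F) - P(E & F)
```

Exhaustive checking is feasible because |P(Ω)| = 2^4 = 16 here; it is not a proof, only a finite verification.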
Property 1.7 (Inclusion-Exclusion Identity / Poincaré's formula). For any finite sequence of events E1, . . . , En ∈ A,

P(⋃_{i=1}^n Ei) = ∑_{k=1}^n (−1)^{k−1} ∑_{1≤i1<...<ik≤n} P(Ei1 ∩ . . . ∩ Eik)    (7)

where ∑_{1≤i1<...<ik≤n} means the sum over all subsets of {1, . . . , n} of size k.
Proof. By induction.
Example: For n = 3, the inclusion-exclusion identity reads:

P(E1 ∪ E2 ∪ E3) = P(E1) + P(E2) + P(E3) − P(E1 ∩ E2) − P(E1 ∩ E3) − P(E2 ∩ E3) + P(E1 ∩ E2 ∩ E3)
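Formula (7) can also be checked numerically. The sketch below implements the right-hand side of Poincaré's formula for a uniform measure and compares it with the probability of the union; the events E1, E2, E3 are arbitrary illustrative choices:

```python
from fractions import Fraction
from functools import reduce
from itertools import combinations

Omega = set(range(10))
P = lambda A: Fraction(len(A), len(Omega))    # uniform measure, for illustration

def inclusion_exclusion(events):
    """Right-hand side of Poincaré's formula (7)."""
    n = len(events)
    total = Fraction(0)
    for k in range(1, n + 1):                      # size of the index subset
        for subset in combinations(events, k):     # all subsets of size k
            inter = reduce(set.intersection, subset)
            total += (-1) ** (k - 1) * P(inter)
    return total

E1, E2, E3 = {0, 1, 2, 3}, {2, 3, 4, 5}, {3, 5, 6}
assert P(E1 | E2 | E3) == inclusion_exclusion([E1, E2, E3])
```

For these events both sides equal 7/10: the three singles sum to 11/10, the three pairwise intersections subtract 5/10, and the triple intersection adds back 1/10.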
2 Uniform Probability Measure on Finite Sample Spaces
Let Ω be a finite sample space: Ω = {ω1, . . . , ωn} with |Ω| = n ∈ N, n ≥ 1. In that case, any subset of Ω is an event, meaning that we will always consider the σ-algebra A = P(Ω) for a finite sample space Ω. A probability measure P on the measurable space (Ω, P(Ω)) is fully characterized by the values P takes on the outcomes ωi. Indeed, any event E can be written as
E = ⋃_{i∈J} {ωi}
where J ⊆ {1, . . . , n} is the set of indices of all the outcomes ωi that compose the event E. Then, for any event E,
P(E) = P(⋃_{i∈J} {ωi}) = ∑_{i∈J} P({ωi})
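In the uniform case each outcome has mass 1/n, so the sum above reduces to P(E) = |E|/n. A minimal sketch, with hypothetical outcome labels:

```python
from fractions import Fraction

n = 6
Omega = [f"omega{i}" for i in range(1, n + 1)]   # hypothetical labels ω1, ..., ω6
p = {w: Fraction(1, n) for w in Omega}           # uniform point masses P({ωi}) = 1/n

def P(E):
    """P(E) = sum of the point masses P({ωi}) over the outcomes in E."""
    return sum(p[w] for w in E)

E = {"omega1", "omega3", "omega5"}               # an arbitrary event with |E| = 3
assert P(E) == Fraction(len(E), n) == Fraction(1, 2)
assert P(Omega) == 1                             # the whole sample space has probability 1
```

The dictionary `p` plays the role of the values P({ωi}) that fully characterize the measure; the same function `P` works for any (not necessarily uniform) assignment of point masses summing to 1.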