
Bayesian Statistical Inference
Recitation 12
EEE 25: Probability and Statistics for Electrical and Electronics Engineers

Probability and Statistics

• Probability
- Works within a fully specified probabilistic model
- e.g., uniform distribution, Gaussian distribution, etc.
- Basically, everything we’ve discussed so far.
• Statistical Inference
- Extracting information about an unknown variable or an unknown model from available data.
• Schools of Thought
- Bayesian – unknowns are treated as random variables (RVs)
- Classical (frequentist) – unknowns are treated as deterministic constants

Bayesian Statistical Inference

• The main objective in an estimation problem is to arrive at an optimal estimate of θ.
• Some terms
- Θ = parameter of interest (an RV)
- X = collection of observations or measurements
- f_Θ(θ) or p_Θ(θ) = prior distribution
- f_X|Θ(x|θ) or p_X|Θ(x|θ) = conditional distribution
- f_Θ|X(θ|x) or p_Θ|X(θ|x) = posterior distribution

Problem 01 – Bayesian Inference

• Romeo and Juliet start dating, but Juliet will be late on any date by a random amount X, uniformly distributed over the interval [0, θ]. The parameter θ is unknown and is modeled as the value of a random variable Θ, uniformly distributed between zero and one hour.
• Assuming that Juliet was late by an amount x on their first date, how should Romeo use this information to update the distribution of Θ?

• Taken from Introduction to Probability by Bertsekas and Tsitsiklis, 2nd edition, 2008.
Problem 01 – Bayesian Inference

• Get the prior PDF

f_\Theta(\theta) = \begin{cases} 1, & 0 \le \theta \le 1 \\ 0, & \text{otherwise} \end{cases}

• Get the conditional PDF of the observation

f_{X|\Theta}(x \mid \theta) = \begin{cases} 1/\theta, & 0 \le x \le \theta \\ 0, & \text{otherwise} \end{cases}
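A minimal sketch of this model in code, assuming NumPy is available; the helper names (sample_model, prior_pdf, likelihood) and the seed are my own choices, not part of the recitation:

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen arbitrarily for reproducibility

def sample_model(n_pairs=5):
    """Draw (theta, x) pairs: Theta ~ Uniform(0, 1), then X | Theta = theta ~ Uniform(0, theta)."""
    theta = rng.uniform(0.0, 1.0, size=n_pairs)   # prior draws
    x = rng.uniform(0.0, theta)                    # conditional draws, one per theta
    return theta, x

def prior_pdf(theta):
    """f_Theta(theta) = 1 on [0, 1] and 0 elsewhere."""
    theta = np.asarray(theta, dtype=float)
    return np.where((theta >= 0.0) & (theta <= 1.0), 1.0, 0.0)

def likelihood(x, theta):
    """f_{X|Theta}(x | theta) = 1/theta for 0 <= x <= theta and 0 elsewhere."""
    theta = np.asarray(theta, dtype=float)
    out = np.zeros_like(theta)
    mask = (theta > 0.0) & (x >= 0.0) & (x <= theta)
    out[mask] = 1.0 / theta[mask]
    return out
```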

Problem 01 – Bayesian Inference

• Get the posterior PDF (Exercise)
- Recall Bayes’ rule:

f_{\Theta|X}(\theta \mid x) = \frac{f_\Theta(\theta)\, f_{X|\Theta}(x \mid \theta)}{\int f_\Theta(\theta')\, f_{X|\Theta}(x \mid \theta')\, d\theta'}

- The numerator is the joint PDF of X and Θ.
- The denominator is the marginal PDF of X.
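Bayes’ rule can also be applied numerically: evaluate prior × likelihood on a grid of θ values and normalize by a Riemann-sum approximation of the denominator. A minimal sketch, again assuming NumPy; the function name and grid size are my own choices:

```python
import numpy as np

def posterior_on_grid(x, n_grid=10_000):
    """Numerical Bayes' rule: posterior(theta) is proportional to prior(theta) * likelihood(x | theta),
    normalized so the grid approximation integrates to 1."""
    theta = np.linspace(1e-6, 1.0, n_grid)          # grid over the prior's support (0, 1]
    prior = np.ones_like(theta)                      # Uniform(0, 1) prior
    lik = np.where(theta >= x, 1.0 / theta, 0.0)     # Uniform(0, theta) likelihood evaluated at x
    unnormalized = prior * lik                       # joint PDF of X and Theta (the numerator)
    d_theta = theta[1] - theta[0]
    evidence = unnormalized.sum() * d_theta          # marginal PDF of X (the denominator)
    return theta, unnormalized / evidence

theta, post = posterior_on_grid(0.25)
print(post[theta >= 0.5][0])   # close to 1/(0.5 * |ln 0.25|) ≈ 1.44
```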

Problem 01 – Bayesian Inference

• Get the posterior PDF
- The numerator is nonzero only if 0 ≤ x ≤ θ ≤ 1:

f_{\Theta|X}(\theta \mid x) = \frac{f_\Theta(\theta)\, f_{X|\Theta}(x \mid \theta)}{\int f_\Theta(\theta')\, f_{X|\Theta}(x \mid \theta')\, d\theta'}
= \frac{1/\theta}{\int_x^1 \frac{1}{\theta'}\, d\theta'}
= \frac{1}{\theta\, |\ln x|}
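For completeness, the normalizing integral in the denominator evaluates as follows (note that ln x < 0 for 0 < x < 1, which is where the absolute value comes from):

\int_x^1 \frac{1}{\theta'}\, d\theta' = \ln 1 - \ln x = -\ln x = |\ln x|,
\qquad\text{so}\qquad
f_{\Theta|X}(\theta \mid x) = \frac{1/\theta}{|\ln x|} = \frac{1}{\theta\, |\ln x|}, \quad x \le \theta \le 1.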

Problem 01 – Bayesian Inference

• Get the posterior PDF
- The numerator is nonzero only if 0 ≤ x ≤ θ ≤ 1:

f_{\Theta|X}(\theta \mid x) = \begin{cases} \dfrac{1}{\theta\, |\ln x|}, & x \le \theta \le 1 \\ 0, & \text{otherwise} \end{cases}

Problem 01 – Bayesian Inference

• How should Romeo update the distribution of Θ if he observes that Juliet is late by x1, …, xn on the first n dates? Assume that the amounts X1, …, Xn by which Juliet is late are uniformly distributed between zero and θ and are conditionally independent given Θ = θ.

Problem 01 – Bayesian Inference

• Get the conditional PDF

f_{X|\Theta}(x \mid \theta) = \begin{cases} 1/\theta^n, & \bar{x} \le \theta \le 1 \\ 0, & \text{otherwise} \end{cases}
\qquad \text{where } \bar{x} = \max\{x_1, \dots, x_n\}
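The 1/θ^n form follows from the conditional independence of the observations given Θ = θ:

f_{X|\Theta}(x \mid \theta) = \prod_{i=1}^{n} f_{X_i|\Theta}(x_i \mid \theta) = \prod_{i=1}^{n} \frac{1}{\theta} = \frac{1}{\theta^n},
\qquad \text{provided } 0 \le x_i \le \theta \text{ for every } i, \text{ i.e. } \bar{x} \le \theta.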

Problem 01 – Bayesian Inference

• Get the posterior PDF

f_{\Theta|X}(\theta \mid x) = \begin{cases} \dfrac{c(\bar{x})}{\theta^n}, & \bar{x} \le \theta \le 1 \\ 0, & \text{otherwise} \end{cases}

• where c(\bar{x}) is a normalizing constant that depends only on \bar{x}:

c(\bar{x}) = \frac{1}{\int_{\bar{x}}^{1} \frac{1}{(\theta')^n}\, d\theta'}
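If a closed form is wanted, the normalizing integral can be evaluated explicitly (this step is not written out on the slide):

\int_{\bar{x}}^{1} \frac{d\theta'}{(\theta')^{n}} = \frac{\bar{x}^{\,1-n} - 1}{n - 1} \quad (n \ge 2),
\qquad\text{so}\qquad
c(\bar{x}) = \frac{n - 1}{\bar{x}^{\,1-n} - 1},
\qquad\text{and } c(\bar{x}) = \frac{1}{|\ln \bar{x}|} \text{ when } n = 1.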

Problem 02 – MAP Estimator

• Find the MAP estimate of Θ based on the observation X = x.

Problem 02 – MAP Estimator

• Maximum a Posteriori Probability (MAP) Estimator

\hat{\theta}_{MAP} = \arg\max_{\theta} f_{\Theta|X}(\theta \mid x) = \arg\max_{x \le \theta \le 1} \frac{1}{\theta\, |\ln x|}
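A brute-force numerical check of this maximization, restricted to the posterior’s support x ≤ θ ≤ 1 (a small sketch; the grid search and its name are my own construction):

```python
import numpy as np

def map_estimate_numeric(x, n_grid=100_000):
    """Grid search for the maximizer of the posterior 1/(theta * |ln x|) over x <= theta <= 1."""
    theta = np.linspace(x, 1.0, n_grid)           # grid size is an arbitrary choice
    posterior = 1.0 / (theta * abs(np.log(x)))
    return theta[np.argmax(posterior)]

print(map_estimate_numeric(0.25))   # prints ~0.25, i.e. theta_MAP = x
```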

Problem 02 – MAP Estimator

• Maximum a Posteriori Probability (MAP) Estimator


- Consider the plot of the posterior PDF for various values of x.
[Figure: posterior PDFs f_{\Theta|X}(\theta \mid x); the posterior is decreasing in θ on [x, 1], so the maximum of the PDF is attained at θ = x, i.e. \hat{\theta}_{MAP} = x.]

Problem 03 – LMS Estimator

• Find the LMS estimate of Θ based on the observation X = x.

Problem 03 – LMS Estimator

• Least Mean Squares (Conditional Expectation) Estimator
- The goal is to minimize E[(\Theta - \hat{\theta})^2].
- Recall from DC 08 that, with no observations, the estimate minimizing the mean squared error is E[Θ].
- For any given observed value x of X, the estimator becomes the conditional expectation:

\hat{\theta}_{LMS} = E[\Theta \mid X = x] = \int \theta\, f_{\Theta|X}(\theta \mid x)\, d\theta

Problem 03 – LMS Estimator

• Least Mean Squares (Conditional Expectation) Estimator

\hat{\theta}_{LMS} = E[\Theta \mid X = x]
= \int \theta\, f_{\Theta|X}(\theta \mid x)\, d\theta
= \int_x^1 \theta \cdot \frac{1}{\theta\, |\ln x|}\, d\theta
= \frac{1 - x}{|\ln x|}
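A quick numerical cross-check of this closed form (a sketch; the helper name and grid size are my own choices):

```python
import numpy as np

def lms_estimate(x, n_grid=200_000):
    """Compare a numerical evaluation of E[Theta | X = x] with the closed form (1 - x)/|ln x|."""
    theta = np.linspace(x, 1.0, n_grid)
    posterior = 1.0 / (theta * abs(np.log(x)))
    numeric = np.sum(theta * posterior) * (theta[1] - theta[0])   # Riemann sum of theta * posterior
    closed_form = (1.0 - x) / abs(np.log(x))
    return numeric, closed_form

print(lms_estimate(0.25))   # both values ≈ 0.541
```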

Problem 04 – MAP vs LMS

• Calculate the conditional mean squared error for the MAP and the LMS estimates. Compare your results.

Problem 04 – MAP vs LMS

• Calculate the conditional mean squared error.

E[(\Theta - \hat{\theta})^2 \mid X = x]
= \int_x^1 (\theta - \hat{\theta})^2\, \frac{1}{\theta\, |\ln x|}\, d\theta
= \int_x^1 (\theta^2 - 2\theta\hat{\theta} + \hat{\theta}^2)\, \frac{1}{\theta\, |\ln x|}\, d\theta
= \hat{\theta}^2 - 2\hat{\theta}\, \frac{1 - x}{|\ln x|} + \frac{1 - x^2}{2\, |\ln x|}
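The last expression is easy to package as a helper and verify by numerical integration (a sketch; conditional_mse is a name introduced here, not from the recitation):

```python
import numpy as np

def conditional_mse(theta_hat, x):
    """Closed-form E[(Theta - theta_hat)^2 | X = x] under the posterior 1/(theta * |ln x|) on [x, 1]."""
    L = abs(np.log(x))
    return theta_hat**2 - 2.0 * theta_hat * (1.0 - x) / L + (1.0 - x**2) / (2.0 * L)

def conditional_mse_numeric(theta_hat, x, n_grid=200_000):
    """Brute-force check of the same quantity by a Riemann sum over the posterior."""
    theta = np.linspace(x, 1.0, n_grid)
    posterior = 1.0 / (theta * abs(np.log(x)))
    return np.sum((theta - theta_hat) ** 2 * posterior) * (theta[1] - theta[0])

print(conditional_mse(0.5, 0.25), conditional_mse_numeric(0.5, 0.25))   # both ≈ 0.047
```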

Problem 04 – MAP vs LMS

• Calculate the conditional mean squared error for the MAP and the LMS estimates.
• For the MAP estimate, \hat{\theta}_{MAP} = x:

E[(\Theta - \hat{\theta}_{MAP})^2 \mid X = x] = x^2 + \frac{3x^2 - 4x + 1}{2\, |\ln x|}

• For the LMS estimate, \hat{\theta}_{LMS} = \dfrac{1 - x}{|\ln x|}:

E[(\Theta - \hat{\theta}_{LMS})^2 \mid X = x] = \frac{1 - x^2}{2\, |\ln x|} - \left( \frac{1 - x}{|\ln x|} \right)^2
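Evaluating both expressions at a few values of x shows that the LMS estimate is never worse, as expected from its definition. A small comparison sketch (my own illustration, not part of the recitation):

```python
import numpy as np

def mse_map(x):
    """Conditional MSE of the MAP estimate theta_hat = x."""
    return x**2 + (3 * x**2 - 4 * x + 1) / (2 * abs(np.log(x)))

def mse_lms(x):
    """Conditional MSE of the LMS estimate theta_hat = (1 - x)/|ln x|."""
    L = abs(np.log(x))
    return (1 - x**2) / (2 * L) - ((1 - x) / L) ** 2

for x in (0.1, 0.25, 0.5, 0.9):
    print(f"x = {x}: MAP MSE = {mse_map(x):.4f}, LMS MSE = {mse_lms(x):.4f}")
    # LMS MSE <= MAP MSE for every x, as expected
```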

Problem 04 – MAP vs LMS

• Compare your results.
- The LMS estimator minimizes the conditional mean squared error by construction, so its error is no larger than that of the MAP estimate for every value of x (see the numerical comparison above).

Problem 05 – Linear LMS

• Derive the linear LMS estimator of Θ based on X.
• Calculate the conditional mean squared error for the linear LMS estimate. Compare your answer to the results of problem 04.

Problem 05 – Linear LMS

• Linear Least Mean Squares Estimator
- A linear approximation of the LMS estimator
- May yield simpler calculations
- Only involves the means, variances, and covariance of Θ and X.

\hat{\Theta} = E[\Theta] + \rho\, \frac{\sigma_\Theta}{\sigma_X}\, (X - E[X]),
\qquad
\rho = \frac{\mathrm{cov}(\Theta, X)}{\sigma_\Theta \sigma_X}
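For the model of Problem 01, the moments appearing in these formulas can be computed exactly. The sketch below keeps exact arithmetic with Python's fractions module (the variable names are mine); it reproduces the coefficients quoted on the next slide.

```python
from fractions import Fraction as F

# Moments for Theta ~ Uniform(0, 1) and X | Theta = theta ~ Uniform(0, theta):
E_Theta   = F(1, 2)                    # E[Theta]
E_X       = F(1, 4)                    # E[X] = E[Theta]/2
E_X2      = F(1, 9)                    # E[X^2] = E[Theta^2]/3 = (1/3)(1/3)
var_X     = E_X2 - E_X**2              # = 7/144
E_XTheta  = F(1, 6)                    # E[X*Theta] = E[Theta^2]/2 = (1/2)(1/3)
cov_XT    = E_XTheta - E_X * E_Theta   # = 1/24

slope     = cov_XT / var_X             # rho * sigma_Theta / sigma_X = cov / var(X) = 6/7
intercept = E_Theta - slope * E_X      # = 2/7
print(slope, intercept)                # prints: 6/7 2/7
```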

Problem 05 – Linear LMS

• Derive the linear LMS estimator of Θ based on X.

\hat{\Theta} = \frac{6}{7} X + \frac{2}{7}

• Calculate the conditional mean squared error for the linear LMS estimate.

E[(\Theta - \hat{\theta})^2 \mid X = x] = \hat{\theta}^2 - 2\hat{\theta}\, \frac{1 - x}{|\ln x|} + \frac{1 - x^2}{2\, |\ln x|},
\qquad \text{where } \hat{\theta} = \frac{6}{7} x + \frac{2}{7}

Problem 05 – Linear LMS

• Compare your answer to the results in problem 04.
- For every value of x, the LMS estimate E[Θ | X = x] attains the smallest possible conditional mean squared error, so the linear LMS estimate cannot do better; it trades some accuracy for the simplicity of a linear form.

