
Our data consists of:

Heights of n = 15 men

It is expensive to collect data explaining the variability of these heights.

1. Thus, we assume the men's heights follow a Normal Distribution, i.e.

Yi = µ + εi ,  εi ~iid~ N(0, σ2)  [epsilon follows a normal dist]

2. We also assume these εi are iid (independent & identically distributed, from this normal dist).

3. This holds for i = 1, …, n.

Equivalently we can write:

Yi ~iid~ N(µ, σ2)

Knowing the values of µ & σ specifies the probability dist (& the model for the data).

This way we can generate more (fake) data [using the probability dist].
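The point above can be sketched in code: once µ and σ are fixed, the model generates data. The values µ = 175 cm and σ = 7 cm below are illustrative assumptions, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
mu, sigma, n = 175.0, 7.0, 15      # assumed parameter values; n = 15 men

# draw "fake" heights from the specified distribution N(mu, sigma^2)
fake_heights = rng.normal(loc=mu, scale=sigma, size=n)
print(fake_heights.round(1))
```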

The frequentist approach:

1. Consider µ & σ FIXED & unknown, then we estimate them!

2. To quantify the uncertainty:

(by saying) the estimates of µ & σ might CHANGE if we repeat the sampling process

& obtain another 15 men for our sample
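The frequentist idea of repeated sampling can be sketched as follows: the true µ and σ are fixed (the values below are assumptions for illustration), and the estimates vary from one sample of 15 men to the next.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
true_mu, true_sigma, n = 175.0, 7.0, 15   # fixed but "unknown" truth (assumed)

mu_hats = []
for _ in range(1000):                      # repeat the sampling process
    sample = rng.normal(true_mu, true_sigma, size=n)
    mu_hats.append(sample.mean())          # estimate of mu from this sample

mu_hats = np.array(mu_hats)
print("average estimate of mu:", mu_hats.mean().round(2))
print("spread of the estimates (sampling uncertainty):", mu_hats.std().round(2))
```

The spread of the estimates is close to σ/√n ≈ 7/√15, which is what the frequentist reports as uncertainty.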

The Bayesian Approach:

2. To quantify the uncertainty:

(by saying) µ & σ ARE RANDOM VARIABLES

- With (their own) probability distributions [priors]


- And they complete the Bayesian Model!

The Bayesian Model:

1. The Likelihood.
2. The Prior.
3. The Posterior.
Likelihood P(y |θ)

[Arabic proverb: "from one likeness, forty are created" — a mnemonic: likelihood is about likeness]

Probabilistic Model (for data)

Given unknown factors (parameters):

Data is Generated!

It describes how!

It can be written as a probability:

P(y|θ) [this describes a probability dist.]

Y -> discrete: P(y) is a probability

Y -> Normal (i.e. continuous): P(y) is the density (of the distribution)
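For the continuous case, evaluating the normal density at the observed data gives the likelihood. A minimal sketch, with hypothetical heights and candidate parameter values (all assumed for illustration):

```python
import numpy as np

def normal_pdf(y, mu, sigma):
    # density of N(mu, sigma^2) evaluated at y
    return np.exp(-(y - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

y = np.array([172.0, 181.0, 169.0])   # hypothetical observed heights
mu, sigma = 175.0, 7.0                 # candidate parameter values

# likelihood P(y|theta): product of iid densities over the data
likelihood = normal_pdf(y, mu, sigma).prod()
print(likelihood)
```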

Prior [Arabic: "what happened previously"]

Probability (Distribution)

Characterizes our uncertainty

about a parameter

Prior + Likelihood = Joint Probability for:

a. The Data. [The Known]


b. The (Unknown) Parameters.

How Come? (Proof):

By using the Chain Rule (of Probability).

If we want the joint Dist. of (BOTH) the Data and Parameters P(y,θ)

P(y,θ) = P(θ) * P(y|θ)

However, we're going to infer the parameters (not the data)

+ we already know y

= so we don't need the Joint Distribution here

 We need the Posterior Distribution.


Posterior Distribution P( θ|y)

This is Obtained using the laws of Conditional Probability [Bayes Theorem]

P(θ|y) = P(θ, y) / P(y)

Whereas P(y) = ∫ P(θ, y) dθ [the marginal distribution of y]

P(θ|y) = P(θ, y) / ∫ P(θ, y) dθ

P(θ|y) = P(y|θ) P(θ) / ∫ P(y|θ) P(θ) dθ

If θ is a discrete random variable, the integral is replaced by a summation.
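The chain of formulas above can be sketched numerically with a grid approximation: discretize θ (here just µ, with σ assumed known), multiply prior by likelihood, and normalize by the sum, which plays the role of ∫ P(y|θ) P(θ) dθ. The data, flat prior, and σ = 7 are illustrative assumptions.

```python
import numpy as np

y = np.array([172.0, 181.0, 169.0, 176.0, 170.0])  # hypothetical heights
sigma = 7.0                                         # assumed known

mu_grid = np.linspace(150, 200, 1001)               # discretized parameter
prior = np.ones_like(mu_grid)                       # flat prior on the grid

# log-likelihood of the data at each candidate mu (normal iid model)
log_lik = np.array([-0.5 * np.sum((y - m) ** 2) / sigma**2 for m in mu_grid])

# prior * likelihood, shifted for numerical stability, then normalized
unnorm = prior * np.exp(log_lik - log_lik.max())
posterior = unnorm / unnorm.sum()                   # Bayes' theorem on a grid

print("posterior mean of mu:", (mu_grid * posterior).sum().round(2))
```

With a flat prior, the posterior mean lands at the sample mean, which connects the Bayesian machinery back to the familiar estimate.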
