
Reference Formulas for EE5302

1. Binomial Random Variable:
$S_X = \{0, 1, 2, \ldots, n\}$; $p_k = \binom{n}{k} p^k (1-p)^{n-k}$, $k = 0, 1, 2, \ldots, n$; $E[X] = np$, $\mathrm{VAR}[X] = np(1-p)$.
2. Geometric Random Variable (First Version):
$S_X = \{0, 1, 2, \ldots\}$; $p_k = p(1-p)^k$, $k = 0, 1, 2, \ldots$; $E[X] = \frac{1-p}{p}$, $\mathrm{VAR}[X] = \frac{1-p}{p^2}$.
3. Poisson Random Variable:
$S_X = \{0, 1, 2, \ldots\}$; $p_k = \frac{\alpha^k}{k!} e^{-\alpha}$, $k = 0, 1, \ldots$; $E[X] = \alpha$, $\mathrm{VAR}[X] = \alpha$.
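The moment formulas in items 1–3 can be spot-checked numerically. A minimal sketch using scipy.stats (the parameter values are arbitrary); note that scipy's geom lives on $\{1, 2, \ldots\}$, so loc=-1 shifts it onto the first-version support $\{0, 1, 2, \ldots\}$:

```python
import numpy as np
from scipy import stats

n, p, alpha = 10, 0.3, 2.5

# Binomial: E[X] = np, VAR[X] = np(1-p)
X = stats.binom(n, p)
print(X.mean(), X.var())            # 3.0, 2.1

# Geometric (first version, counting failures before the first success):
# scipy's geom counts trials, so shift its support with loc=-1
G = stats.geom(p, loc=-1)
print(G.mean(), (1 - p) / p)        # both ~2.333
print(G.var(), (1 - p) / p**2)      # both ~7.778

# Poisson: E[X] = VAR[X] = alpha
N = stats.poisson(alpha)
print(N.mean(), N.var())            # 2.5, 2.5
```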
4. Uniform Random Variable:
$f_X(x) = \frac{1}{b-a}$, $a \le x \le b$; $E[X] = \frac{a+b}{2}$, $\mathrm{VAR}[X] = \frac{(b-a)^2}{12}$.
5. Exponential Random Variable:
$f_T(t) = \lambda e^{-\lambda t}$, $t \ge 0$; $E[T] = \frac{1}{\lambda}$, $\mathrm{VAR}[T] = \frac{1}{\lambda^2}$.
6. m-Erlang Random Variable:
$f_{S_m}(y) = \frac{\lambda (\lambda y)^{m-1}}{(m-1)!} e^{-\lambda y}$, $y \ge 0$; $E[S_m] = \frac{m}{\lambda}$, $\mathrm{VAR}[S_m] = \frac{m}{\lambda^2}$.
7. Conditional Probability:
$P[A|B] = \frac{P[A \cap B]}{P[B]}$, for $P[B] > 0$.
8. The Theorem on Total Probability:
Let $B_1, B_2, \ldots, B_n$ be mutually exclusive events whose union equals the sample space $S$. Then for any event $A$, we have $P[A] = P[B_1]P[A|B_1] + P[B_2]P[A|B_2] + \cdots + P[B_n]P[A|B_n]$.
9. Bayes Rule:
Let $B_1, B_2, \ldots, B_n$ be mutually exclusive events whose union equals the sample space $S$. Then for any event $A$, we have
$$P[B_j|A] = \frac{P[A|B_j]\,P[B_j]}{\sum_{i=1}^{n} P[B_i]\,P[A|B_i]}.$$
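Items 8 and 9 fit together: the denominator of Bayes' rule is exactly the total-probability expansion of $P[A]$. A minimal numeric sketch with made-up priors and likelihoods:

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])        # P[B_i]: mutually exclusive, sums to 1
likelihood = np.array([0.9, 0.5, 0.1])   # P[A | B_i]

p_A = np.dot(prior, likelihood)          # total probability: sum_i P[B_i] P[A|B_i]
posterior = likelihood * prior / p_A     # Bayes' rule: P[B_j | A]

print(p_A)                # 0.62
print(posterior)          # [0.7258..., 0.2419..., 0.0322...]
print(posterior.sum())    # 1.0
```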
10. Function of a random variable:
Let $Y = g(X)$ and let the pdf of the random variable $X$ be $f_X(x)$. Then the pdf of the random variable $Y$ is
$$f_Y(y) = \sum_k \left.\frac{f_X(x)}{|dy/dx|}\right|_{x=x_k} = \sum_k f_X(x) \left|\frac{dx}{dy}\right|_{x=x_k},$$
where the $x_k$ are the roots of $y = g(x)$.
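As a quick Monte Carlo check of this formula (a sketch, not from the original sheet): for $Y = X^2$ with $X$ standard normal, the roots of $y = g(x)$ are $x = \pm\sqrt{y}$ and $|dy/dx| = 2\sqrt{y}$ at both, so the formula yields the chi-square density with one degree of freedom:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.standard_normal(200_000) ** 2     # Y = g(X) = X^2, X ~ N(0, 1)

# f_Y(y) = 2 * f_X(sqrt(y)) / (2 * sqrt(y)): two roots, same |dy/dx|
grid = np.linspace(0.2, 4.0, 20)
formula = 2 * stats.norm.pdf(np.sqrt(grid)) / (2 * np.sqrt(grid))

hist, edges = np.histogram(y, bins=200, range=(0, 5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
empirical = np.interp(grid, centers, hist)
print(np.max(np.abs(empirical - formula)))   # small, limited by sampling error
```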
11. Variance of a random variable:
$\mathrm{VAR}[X] = E[(X - E[X])^2] = E[X^2] - E[X]^2$.
12. Two Jointly Continuous Random Variables:
$$f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}, \qquad f_X(x) = \int_{-\infty}^{+\infty} f_{X,Y}(x, y)\, dy,$$
$$F_Y(y|x) = \frac{\int_{-\infty}^{y} f_{X,Y}(x, y')\, dy'}{f_X(x)}, \qquad f_Y(y|x) = \frac{d}{dy} F_Y(y|x) = \frac{f_{X,Y}(x, y)}{f_X(x)}, \qquad E[Y] = E[E[Y|X]].$$
13. Transformation of Two Random Variables:
Let $V = g_1(X, Y)$ and $W = g_2(X, Y)$, and let the inverse transform be $X = h_1(V, W)$ and $Y = h_2(V, W)$. Assume the joint pdf of $X$ and $Y$ is $f_{X,Y}(x, y)$. Then the joint pdf of $V$ and $W$ is
$$f_{V,W}(v, w) = \sum_{i,j} \left.\frac{f_{X,Y}(h_1(v, w), h_2(v, w))}{|J_{X,Y}(x, y)|}\right|_{v=v_i,\, w=w_j} = \sum_{i,j} f_{X,Y}(h_1(v, w), h_2(v, w))\, |J_{V,W}(v, w)| \Big|_{v=v_i,\, w=w_j},$$
where
$$J_{X,Y}(x, y) = \det\begin{pmatrix} \dfrac{\partial v}{\partial x} & \dfrac{\partial v}{\partial y} \\[4pt] \dfrac{\partial w}{\partial x} & \dfrac{\partial w}{\partial y} \end{pmatrix} \quad\text{and}\quad J_{V,W}(v, w) = \det\begin{pmatrix} \dfrac{\partial x}{\partial v} & \dfrac{\partial x}{\partial w} \\[4pt] \dfrac{\partial y}{\partial v} & \dfrac{\partial y}{\partial w} \end{pmatrix}.$$
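As a worked instance (a standard textbook example, not part of the original sheet): with $V = X + Y$ and $W = X - Y$, the inverse transform is unique, $X = h_1(V, W) = \frac{V+W}{2}$ and $Y = h_2(V, W) = \frac{V-W}{2}$, and
$$J_{X,Y}(x, y) = \det\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = -2,$$
so the sum reduces to a single term:
$$f_{V,W}(v, w) = \frac{1}{2}\, f_{X,Y}\!\left(\frac{v+w}{2}, \frac{v-w}{2}\right).$$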
14. Correlation and Covariance of Two Random Variables:
$\mathrm{COV}(X, Y) = E[XY] - E[X]E[Y]$, $\quad \rho_{X,Y} = \frac{\mathrm{COV}(X, Y)}{\sigma_X \sigma_Y} = \frac{E[XY] - E[X]E[Y]}{\sigma_X \sigma_Y}$.
15. Jointly Gaussian Random Variables:
$$f_{X,Y}(x, y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho_{X,Y}^2}}\, e^{\displaystyle -\frac{1}{2(1 - \rho_{X,Y}^2)} \left[ \left(\frac{x - m_X}{\sigma_X}\right)^2 - 2\rho_{X,Y} \left(\frac{x - m_X}{\sigma_X}\right)\left(\frac{y - m_Y}{\sigma_Y}\right) + \left(\frac{y - m_Y}{\sigma_Y}\right)^2 \right]}$$
16. Linear Estimation Using Mean Square Criterion:
$$\hat{Y} = \rho_{X,Y}\, \sigma_Y \left( \frac{X - E[X]}{\sigma_X} \right) + E[Y]$$
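Item 16 in code (a sketch on synthetic data; the data-generating model is made up). The printed mean square error matches the known value $(1 - \rho_{X,Y}^2)\,\mathrm{VAR}[Y]$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.5, 50_000)
y = 0.8 * x + rng.normal(0.0, 1.0, 50_000)   # correlated with x by construction

# Linear MMSE estimate built from sample moments
rho = np.corrcoef(x, y)[0, 1]
y_hat = rho * y.std() * (x - x.mean()) / x.std() + y.mean()

print(np.mean((y - y_hat) ** 2))   # empirical mean square error
print((1 - rho**2) * y.var())      # matches the line above
```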
17. Central Limit Theorem:
Let $S_n$ be the sum of $n$ iid random variables with finite mean $E[X] = \mu$ and finite variance $\sigma^2$, and let $Z_n$ be the zero-mean, unit-variance random variable defined by $Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}$. Then
$$\lim_{n \to \infty} P[Z_n \le z] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-x^2/2}\, dx.$$
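A numerical illustration (a sketch; Uniform(0,1) summands are an arbitrary choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 50
mu, sigma = 0.5, np.sqrt(1 / 12)       # mean and std of a Uniform(0,1) summand

S_n = rng.random((100_000, n)).sum(axis=1)
Z_n = (S_n - n * mu) / (sigma * np.sqrt(n))

# P[Z_n <= z] should be close to the standard normal CDF
for z in (-1.0, 0.0, 1.5):
    print(np.mean(Z_n <= z), stats.norm.cdf(z))
```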
18. Autocorrelation and Autocovariance functions of a random process:
$R_X(t_1, t_2) = E[X(t_1)X(t_2)]$, $\quad C_X(t_1, t_2) = E[(X(t_1) - m_X(t_1))(X(t_2) - m_X(t_2))]$.
19. Poisson Random Process:
$P[N(t) = k] = \frac{(\lambda t)^k}{k!} e^{-\lambda t}$ for $k = 0, 1, \ldots$; $\quad C_N(t_1, t_2) = \lambda \min(t_1, t_2)$.
20. Power Spectral Density for a Random Processes:
$S_X(f) = \mathcal{F}\{R_X(\tau)\}$, and $R_X(\tau) = \mathcal{F}^{-1}\{S_X(f)\}$.
21. Response of Linear Systems to Random Signals:
$S_Y(f) = |H(f)|^2 S_X(f)$ and $S_{Y,X}(f) = H(f) S_X(f)$.
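Items 20 and 21 together can be checked by passing white noise through an FIR filter and comparing PSD estimates (a sketch using scipy.signal; the filter taps are arbitrary):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)      # white noise input
h = np.array([0.5, 0.3, 0.2])           # arbitrary FIR taps
y = signal.lfilter(h, [1.0], x)

f, S_X = signal.welch(x, nperseg=1024)  # input PSD estimate
_, S_Y = signal.welch(y, nperseg=1024)  # output PSD estimate
_, H = signal.freqz(h, worN=f, fs=1.0)  # H(f) on the same frequency grid

# S_Y(f) = |H(f)|^2 S_X(f), up to estimation error
print(np.max(np.abs(S_Y - np.abs(H) ** 2 * S_X)))
```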
22. Optimum Filter:
Let $X_t$ and $Z_t$ be discrete-time, zero-mean, jointly wide-sense stationary processes, and let $Y_t$ be the minimum mean square estimate for $Z_t$ of the form
$$Y_t = \sum_{\beta = t-a}^{t+b} h_{t-\beta}\, X_\beta = \sum_{\beta = -b}^{a} h_\beta\, X_{t-\beta} \qquad (a > -b).$$
Then
$$R_{Z,X}(m) = \sum_{\beta = -b}^{a} h_\beta\, R_X(m - \beta) \quad \text{for } -b \le m \le a,$$
from which we can obtain the coefficients $h_i$ ($i = -b, -b+1, \ldots, a$).
The mean square error for this estimate is
$$E[e_t^2] = R_Z(0) - \sum_{\beta = -b}^{a} h_\beta\, R_{Z,X}(\beta).$$
In particular, when $X_\beta = Z_\beta + N_\beta$, where $N_\beta$ and $Z_\beta$ are independent random processes, this becomes
$$R_Z(m) = \sum_{\beta = -b}^{a} h_\beta \left( R_Z(m - \beta) + R_N(m - \beta) \right) \quad \text{for } -b \le m \le a.$$
When $b = -1$, it is a predictor; when $b = 0$, it is a filter; when $b = +\infty$ and $a = +\infty$, it is a smoother. For a smoother, we have
$$H(f) = \frac{S_Z(f)}{S_Z(f) + S_N(f)}.$$
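The equations in item 22 are linear in the coefficients $h_\beta$; for the filter case ($b = 0$) they form a Toeplitz system. A sketch for the signal-plus-noise model $X = Z + N$ (the autocorrelation sequences are made up):

```python
import numpy as np
from scipy.linalg import solve_toeplitz

a = 3                                    # filter order: coefficients h_0, ..., h_a
R_Z = 0.9 ** np.arange(a + 1)            # made-up signal autocorrelation R_Z(m) = 0.9^|m|
R_N = np.zeros(a + 1); R_N[0] = 0.5      # white noise, variance 0.5

# Solve sum_beta h_beta (R_Z(m - beta) + R_N(m - beta)) = R_Z(m), 0 <= m <= a
R_X = R_Z + R_N                          # R_X = R_Z + R_N since Z and N are independent
h = solve_toeplitz(R_X, R_Z)             # symmetric Toeplitz system in h

# Mean square error: E[e^2] = R_Z(0) - sum_beta h_beta R_{Z,X}(beta), with R_{Z,X} = R_Z
mse = R_Z[0] - h @ R_Z
print(h, mse)
```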