Agni College of Technology, Chennai – 130
B.E./B.Tech. DEGREE EXAMINATIONS, NOV/DEC 2011
Fourth Semester
Common to ECE/BIOMEDICAL
MA6453 – PROBABILITY AND RANDOM PROCESSES
(Regulations 2008)
Time: Three hours Maximum: 100 marks
Answer ALL questions
1. The CDF of a continuous random variable is given by
F(x) = 0, x < 0
= 1 − e^{−x/5}, 0 ≤ x < ∞.
Find the PDF and the mean of X.
The PDF of X is f(x) = F′(x)
= (1/5) e^{−x/5}, x ≥ 0
= 0, otherwise.
Mean of X = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_0^{∞} (x/5) e^{−x/5} dx
= (1/5)[−5x e^{−x/5} − 25 e^{−x/5}]_0^{∞}
= −[e^{−x/5}(x + 5)]_0^{∞}
= −[0 − 5]
= 5.
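As a quick numerical sanity check (not part of the original solution), the density f(x) = (1/5)e^{−x/5} can be integrated by a simple midpoint rule to confirm it is a valid pdf with mean 5:

```python
# Sketch: numerically check that f(x) = (1/5) e^{-x/5} integrates to 1
# and has mean 5 (the exponential tail beyond x = 200 is negligible).
import math

def f(x):
    return math.exp(-x / 5) / 5

dx = 0.001
xs = [(k + 0.5) * dx for k in range(200_000)]  # midpoints on [0, 200]
total = sum(f(x) for x in xs) * dx
mean = sum(x * f(x) for x in xs) * dx

assert abs(total - 1) < 1e-6
assert abs(mean - 5) < 1e-3
print(round(mean, 3))
```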
2. If X is a normal random variable with mean zero and variance σ², find the pdf of Y = e^X.
Solution: For y > 0, F_Y(y) = P(e^X ≤ y) = P(X ≤ log y) = F_X(log y), so
f_Y(y) = f_X(log y)·(1/y) = (1/(yσ√(2π))) e^{−(log y)²/(2σ²)}, y > 0
= 0, otherwise.

Given the joint density
f(x, y) = e^{−(x+y)}, x > 0, y > 0
= 0, otherwise,
the marginal density of X is
f(x) = ∫_0^{∞} e^{−(x+y)} dy = e^{−x} ∫_0^{∞} e^{−y} dy = e^{−x}[−e^{−y}]_0^{∞} = e^{−x}[0 + 1] = e^{−x}.
Similarly, the marginal density of Y is
f(y) = ∫_0^{∞} e^{−(x+y)} dx = e^{−y}[−e^{−x}]_0^{∞} = e^{−y}.
Now f(x)·f(y) = e^{−x}·e^{−y} = e^{−(x+y)} = f(x, y), so X and Y are independent.
If the two lines of regression are 3x + 2y = 26 and 6x + y = 31, find the correlation coefficient.
Solution:
Given 3x + 2y = 26 ...(1)
6x + y = 31 ...(2)
(1) ⇒ y = 13 − (3/2)x, so the regression coefficient of Y on X is
b_yx = −3/2 ...(3)
(2) ⇒ x = 31/6 − (1/6)y, so the regression coefficient of X on Y is
b_xy = −1/6 ...(4)
From (3) & (4), b_xy·b_yx = (−1/6)(−3/2) = 1/4 < 1, so the assumption is valid.
r² = b_xy·b_yx = 1/4 ⇒ r = −√(1/4) = −1/2
(negative, since both regression coefficients are negative).
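A small arithmetic check of this result (assuming, as in the worked problem, that the two regression lines are 3x + 2y = 26 and 6x + y = 31):

```python
# Sketch: the slopes of the two regression lines give b_yx and b_xy;
# r = -sqrt(b_xy * b_yx) because both coefficients are negative.
import math
from fractions import Fraction

b_yx = Fraction(-3, 2)   # from y = 13 - (3/2) x
b_xy = Fraction(-1, 6)   # from x = 31/6 - (1/6) y

product = b_xy * b_yx
assert product == Fraction(1, 4) and product < 1  # consistency check

r = -math.sqrt(product)  # negative root, since both coefficients < 0
assert r == -0.5
print(r)
```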
6. If {X(t)} is a normal process with μ(t) = 10 and C(t₁, t₂) = 16e^{−|t₁−t₂|}, find the variance of X(10) − X(6).
Var[X(10) − X(6)] = C(10,10) + C(6,6) − 2C(10,6) = 16 + 16 − 32e^{−4} = 31.4139.
7. Given that the autocorrelation function for a stationary ergodic process with no periodic components is R_XX(τ) = 25 + 4/(1 + 6τ²), find the mean and the variance of the process {X(t)}.
μ_X² = lim_{τ→∞} R_XX(τ) = 25
∴ μ_X = 5.
E{X²(t)} = R_XX(0) = 25 + 4 = 29
∴ Var{X(t)} = E{X²(t)} − (E{X(t)})² = 29 − 25 = 4.
Property 1: R_XX(τ) is an even function of τ, i.e., R_XX(τ) = R_XX(−τ).
Property 2: If the input X(t) and its output Y(t) are related by Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then the system is a linear time-invariant system.
If P(X = x) = 1/2^x, x = 1, 2, …, ∞, find (1) the mean of X, (2) P(X is even), (3) P(X is divisible by 3).
Solution:
M_X(t) = E[e^{tX}] = Σ_{x=1}^{∞} e^{tx}(1/2^x) = Σ_{x=1}^{∞} (e^t/2)^x
= (e^t/2) + (e^t/2)² + …
= (e^t/2)[1 + (e^t/2) + (e^t/2)² + …]
= (e^t/2)(1 − e^t/2)^{−1} = e^t/(2 − e^t) ...(1)
M′_X(t) = e^t(2 − e^t)^{−1} + e^{2t}(2 − e^t)^{−2} ...(2)
Mean = μ′₁ = M′_X(0) = 1 + 1 = 2.
P(X is even) = P(X = 2) + P(X = 4) + …
= 1/2² + 1/2⁴ + … = (1/4)/(1 − 1/4) = 1/3.
P(X is divisible by 3) = P(X = 3) + P(X = 6) + … ∞
= 1/2³ + 1/2⁶ + … ∞ = (1/8)/(1 − 1/8) = (1/8)(8/7) = 1/7.
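The three answers can be confirmed (this check is not part of the original paper) by summing the series for P(X = x) = 1/2^x directly:

```python
# Sketch: partial sums of the geometric series (terms beyond x = 200
# are far below machine precision).
mean = sum(x / 2**x for x in range(1, 200))
p_even = sum(1 / 2**x for x in range(2, 200, 2))
p_div3 = sum(1 / 2**x for x in range(3, 200, 3))

assert abs(mean - 2) < 1e-12
assert abs(p_even - 1 / 3) < 1e-12
assert abs(p_div3 - 1 / 7) < 1e-12
print(mean, p_even, p_div3)
```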
11(a)(ii) A continuous R.V. X has the pdf
f(x) = k/(1 + x²), −∞ < x < ∞
= 0, otherwise.
Find k, the distribution function F(x), and P(X > 0).
Since ∫_{−∞}^{∞} f(x) dx = 1, k[tan⁻¹x]_{−∞}^{∞} = 1 ⇒ k(π/2 + π/2) = 1 ⇒ kπ = 1 ⇒ k = 1/π.
∴ f(x) = (1/π)·1/(1 + x²), −∞ < x < ∞
= 0, otherwise.
To find F(x):
F(x) = ∫_{−∞}^{x} f(t) dt = (1/π)[tan⁻¹x + π/2] ...(A)
To find P(X > 0):
P(X > 0) = 1 − F(0) = 1 − (1/π)(0 + π/2) = 1 − 1/2 = 1/2. [In (A) put x = 0]
11(b)(i) Let X and Y be independent normal variates with means 45 and 44 and standard deviations 2 and 1.5 respectively. What is the probability that randomly chosen values of X and Y differ by 1.5 or more?
Solution:
X is N(45, 2) and Y is N(44, 1.5).
Hence, by the additive property, U = X − Y follows the normal distribution N(1, √(4 + 2.25)), i.e., N(1, 2.5).
P[X & Y differ by 1.5 or more] = P[|X − Y| ≥ 1.5] = P[|U| ≥ 1.5]
= 1 − P[|U| ≤ 1.5] = 1 − P[−1.5 ≤ U ≤ 1.5]
= 1 − P[(−1.5 − 1)/2.5 ≤ (U − 1)/2.5 ≤ (1.5 − 1)/2.5]
= 1 − P[−1 ≤ Z ≤ 0.2] = 1 − [P(0 ≤ z ≤ 1) + P(0 ≤ z ≤ 0.2)]
= 1 − [0.3413 + 0.0793] = 0.5794.
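The table lookup can be double-checked (not part of the original solution) with the standard normal CDF built from `math.erf`:

```python
# Sketch: P(|U| >= 1.5) for U ~ N(1, 2.5) via Phi built on math.erf.
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

p = 1 - (phi(0.2) - phi(-1.0))  # 1 - P(-1 <= Z <= 0.2)
assert abs(p - 0.5794) < 5e-4
print(round(p, 4))
```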
11(b)(ii) If X is a uniform random variable in the interval (−2, 2), find the probability density function of Y = |X| and E[Y].
f_X(x) = 1/(2 − (−2)) = 1/4, −2 < x < 2
= 0, otherwise ...(1)
For 0 < y < 2, the equation |x| = y has the two roots x₁ = y and x₂ = −y, so the pdf of Y is
f_Y(y) = f_X(x₁)|dx₁/dy| + f_X(x₂)|dx₂/dy| = (1/4)·1 + (1/4)·1 = 2/4 = 1/2, 0 < y < 2.
Thus Y is uniform on (0, 2), and
E[Y] = (a + b)/2 = (0 + 2)/2 = 1.
12(a)(i) The joint p.d.f of the two-dimensional random variable (X, Y) is given by
f(x, y) = 8xy/9, 1 ≤ x ≤ y ≤ 2
= 0, otherwise. Find
(i) the marginal densities of X and Y,
(ii) the conditional density functions f(x/y) and f(y/x).
Solution:
(i) The marginal density functions of X and Y are given by
f_X(x) = ∫_x^2 f(x, y) dy = ∫_x^2 (8xy/9) dy = (8/9)[xy²/2]_x^2 = (8x/9)·(4 − x²)/2
= (4x/9)(4 − x²), 1 ≤ x ≤ 2.
f_Y(y) = ∫_1^y f(x, y) dx = ∫_1^y (8xy/9) dx = (8/9)[yx²/2]_1^y = (8y/9)·(y² − 1)/2
= (4y/9)(y² − 1), 1 ≤ y ≤ 2.
(ii) The conditional densities are
f(x/y) = f(x, y)/f_Y(y) = (8xy/9)/((4y/9)(y² − 1)) = 2x/(y² − 1), 1 ≤ x ≤ y,
f(y/x) = f(x, y)/f_X(x) = (8xy/9)/((4x/9)(4 − x²)) = 2y/(4 − x²), x ≤ y ≤ 2.
Given the joint p.d.f f(x, y) = 2 − x − y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, find the correlation coefficient between X and Y.
Solution:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (2 − x − y) dy = [2y − xy − y²/2]_0^1
= (2 − x − 1/2) − 0 = 3/2 − x ...(1)
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (2 − x − y) dx = [2x − x²/2 − xy]_0^1
= (2 − 1/2 − y) − 0 = 3/2 − y ...(2)
E(X) = ∫_0^1 x(3/2 − x) dx = 3/4 − 1/3 = 5/12 ...(3); similarly E(Y) = 5/12 ...(4)
E(X²) = ∫_{−∞}^{∞} x² f_X(x) dx = ∫_0^1 x²(3/2 − x) dx = ∫_0^1 (3x²/2 − x³) dx
= [x³/2 − x⁴/4]_0^1 = 1/2 − 1/4 = 1/4 ...(5)
Similarly E(Y²) = 1/4 ...(6)
Var(X) = E[X²] − [E(X)]² = 1/4 − 25/144 = (36 − 25)/144 = 11/144 = σ_X²; likewise σ_Y² = 11/144.
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy = ∫_0^1 ∫_0^1 xy(2 − x − y) dx dy
= ∫_0^1 ∫_0^1 (2xy − x²y − xy²) dx dy
= ∫_0^1 [x²y − x³y/3 − x²y²/2]_0^1 dy
= ∫_0^1 (y − y/3 − y²/2) dy
= [y²/2 − y²/6 − y³/6]_0^1
= 1/2 − 1/6 − 1/6 − (0 − 0 − 0) = (3 − 2)/6 = 1/6.
Cov(X, Y) = E[XY] − E[X]E[Y] = 1/6 − (5/12)(5/12) = 1/6 − 25/144 = (24 − 25)/144 = −1/144.
∴ r(X, Y) = Cov(X, Y)/(σ_X σ_Y) = (−1/144)/(11/144) = −1/11.
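A brute-force grid integration (assuming f(x, y) = 2 − x − y on the unit square, as in the worked solution) confirms these moments:

```python
# Sketch: midpoint-rule double integration of the joint density.
n = 400
h = 1 / n
pts = [(i + 0.5) * h for i in range(n)]

def integrate2(g):
    return sum(g(x, y) for x in pts for y in pts) * h * h

f = lambda x, y: 2 - x - y
ex = integrate2(lambda x, y: x * f(x, y))
exy = integrate2(lambda x, y: x * y * f(x, y))
ex2 = integrate2(lambda x, y: x * x * f(x, y))

cov = exy - ex * ex        # E[Y] = E[X] by symmetry of f
var = ex2 - ex * ex
r = cov / var              # sigma_X = sigma_Y by symmetry

assert abs(ex - 5 / 12) < 1e-4
assert abs(exy - 1 / 6) < 1e-4
assert abs(cov + 1 / 144) < 1e-4
assert abs(r + 1 / 11) < 1e-3
print(round(r, 4))
```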
12(b)(i) If X₁, X₂, …, Xₙ are uniform variates with mean 2.5 and variance 3/4, use the central limit theorem to estimate
If X and Y are independent with
f(x) = e^{−x}, x ≥ 0 (= 0, x < 0) and f(y) = e^{−y}, y ≥ 0 (= 0, y < 0),
find the density of U = X − Y.
Solution:
The joint pdf of X and Y is f(x, y) = e^{−x} e^{−y} = e^{−(x+y)}; x, y ≥ 0.
Take u = x − y, v = y, so that x = u + v, y = v.
J = |∂(x, y)/∂(u, v)| = |1 1; 0 1| = 1
Hence the joint pdf of U and V is
f(u, v) = f(x, y)|J| = e^{−(x+y)}(1) = e^{−(u+v+v)} = e^{−(u+2v)}.
Range space:
Given y ≥ 0 ⇒ v ≥ 0, and x ≥ 0 ⇒ u + v ≥ 0 ⇒ v ≥ −u.
For the region u < 0, v varies from −u to ∞, i.e., −u < v < ∞.
For the region u > 0, v varies from 0 to ∞, i.e., 0 < v < ∞.
To find the density of U = X − Y, we have to find f_U(u) for the regions:
Prepared by Department of Mathematics, Agni College of Technology, Chennai - 130
(i) u < 0, −u < v < ∞; (ii) u > 0, 0 < v < ∞.
f_U(u) = ∫_{−u}^{∞} g(u, v) dv = ∫_{−u}^{∞} e^{−(u+2v)} dv for u < 0
= ∫_0^{∞} e^{−(u+2v)} dv for u > 0.
For u < 0: f_U(u) = e^{−u}[e^{−2v}/(−2)]_{−u}^{∞} = e^{−u}·e^{2u}/2 = e^{u}/2.
For u > 0: f_U(u) = e^{−u}[e^{−2v}/(−2)]_0^{∞} = e^{−u}/2.
Hence the pdf is f_U(u) = e^{−|u|}/2 for −∞ < u < ∞.
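The Laplace density e^{−|u|}/2 can be verified numerically (a check not in the paper) through the convolution f_U(u) = ∫ f_X(u + v) f_Y(v) dv:

```python
# Sketch: numeric convolution for U = X - Y, X and Y unit exponentials.
import math

def f_exp(t):
    return math.exp(-t) if t >= 0 else 0.0

def f_U(u, vmax=40.0, n=100_000):
    dv = vmax / n
    return sum(f_exp(u + (k + 0.5) * dv) * f_exp((k + 0.5) * dv)
               for k in range(n)) * dv

for u in (-2.0, -0.5, 0.5, 2.0):
    assert abs(f_U(u) - math.exp(-abs(u)) / 2) < 1e-3
print("Laplace density confirmed")
```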
13(a)(i) Show that the random process X(t) = A cos(ωt + θ) is wide-sense stationary, where A and ω are constants and θ is uniformly distributed on the interval (0, 2π).
Solution:
Since θ is uniform on (0, 2π), f(θ) = 1/2π, 0 ≤ θ ≤ 2π.
We have to prove X(t) is a WSS process, i.e., E[X(t)] is a constant and R_XX(t, t + τ) depends only on τ.
E[X(t)] = A ∫_0^{2π} cos(ωt + θ) f(θ) dθ
= (A/2π) ∫_0^{2π} cos(ωt + θ) dθ
= (A/2π)[sin(ωt + θ)]_0^{2π} = (A/2π)[sin ωt − sin ωt]
= 0 = constant.
R_XX(t, t + τ) = E[X(t)X(t + τ)] = (A²/2) E[cos(2ωt + ωτ + 2θ)] + (A²/2) E[cos(ωτ)]
= (A²/2) ∫_0^{2π} cos(2ωt + ωτ + 2θ) f(θ) dθ + (A²/2) cos ωτ
= (A²/4π)[sin(2ωt + ωτ + 2θ)/2]_0^{2π} + (A²/2) cos ωτ
= (A²/8π)[sin(2ωt + ωτ + 4π) − sin(2ωt + ωτ)] + (A²/2) cos ωτ
= (A²/8π)(0) + (A²/2) cos ωτ
= (A²/2) cos ωτ, a function of τ only.
Since E[X(t)] = 0, a constant, and R_XX(t, t + τ) is a function of τ only, X(t) is a WSS process.
13(a)(ii) The process {X(t)} whose probability distribution under certain conditions is given by
P{X(t) = n} = (at)^{n−1}/(1 + at)^{n+1}, n = 1, 2, 3, …
= at/(1 + at), n = 0.
Find the mean and variance of the process. Is the process first-order stationary?
Solution:
X(t) = n:   0           1            2             3              …
p(x_n):     at/(1+at)   1/(1+at)²    at/(1+at)³    (at)²/(1+at)⁴  …
Mean E{X(t)} = Σ_{n=0}^{∞} n p(x_n)
= 0·at/(1+at) + 1·1/(1+at)² + 2·at/(1+at)³ + 3·(at)²/(1+at)⁴ + …
= (1/(1+at)²)[1 + 2(at/(1+at)) + 3(at/(1+at))² + …]
= (1/(1+at)²)[1 − at/(1+at)]^{−2}
= (1/(1+at)²)[(1 + at − at)/(1+at)]^{−2}
= (1/(1+at)²) × (1+at)²
= 1, which is a constant.
Now E{X²(t)} = Σ_{n=0}^{∞} n² p(x_n) = Σ_{n=0}^{∞} [n(n + 1) − n] p(x_n)
= Σ_{n=0}^{∞} n(n + 1) p(x_n) − 1
= (2/(1+at)²)[1 − at/(1+at)]^{−3} − 1 = 2(1 + at) − 1 = 2at + 1.
∴ Var{X(t)} = E{X²(t)} − 1 = 2at.
Since the probability distribution (and the variance) depends on t, the process is not first-order stationary.
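These series results can be confirmed numerically (a check not in the paper) at a sample value of at:

```python
# Sketch: sum the pmf directly for a*t = 1.7; terms beyond n = 400
# are negligible since the ratio at/(1+at) < 1.
at = 1.7
p = [at / (1 + at)] + [at**(n - 1) / (1 + at)**(n + 1) for n in range(1, 400)]

total = sum(p)
mean = sum(n * pn for n, pn in enumerate(p))
var = sum(n * n * pn for n, pn in enumerate(p)) - mean**2

assert abs(total - 1) < 1e-9
assert abs(mean - 1) < 1e-9
assert abs(var - 2 * at) < 1e-6
print(mean, var)
```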
13(b) State the postulates of the Poisson process and derive the probability distribution. Also prove that the sum of two independent Poisson processes is again a Poisson process.
Solution: If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called the Poisson process, provided the following postulates are satisfied.
1. P[1 occurrence in (t, t + Δt)] = λΔt + O(Δt)
2. P[0 occurrences in (t, t + Δt)] = 1 − λΔt + O(Δt)
3. P[2 or more occurrences in (t, t + Δt)] = O(Δt)
4. X(t) is independent of the number of occurrences of the event in any interval before and after the interval (0, t).
5. The probability that the event occurs a specified number of times in (t₀, t₀ + τ) depends only on τ and not on t₀.
Hence X₁(t) + X₂(t) is a Poisson process with parameter (λ₁ + λ₂)t.
A stationary process has autocorrelation function
R(τ) = λ² for |τ| > ε
= λ² + (λ/ε)(1 − |τ|/ε) for |τ| ≤ ε.
Find the power spectral density of the process.
Solution:
S(ω) = F[λ²] + (2λ/ε) ∫_0^{ε} (1 − τ/ε) cos ωτ dτ
= F[λ²] + (2λ/ε)[(1 − τ/ε)(sin ωτ/ω) − cos ωτ/(εω²)]_0^{ε}
= F[λ²] + (2λ/ε)·(1 − cos ωε)/(εω²)
= F[λ²] + (4λ/ε²)·sin²(ωε/2)/ω²
= 2πλ²δ(ω) + (4λ sin²(ωε/2))/(ε²ω²).
14(a)(ii) Given the power spectral density of a continuous process as S_XX(ω) = (ω² + 9)/(ω⁴ + 5ω² + 4), find the mean square value of the process.
Given S_XX(ω) = (ω² + 9)/(ω⁴ + 5ω² + 4) = (ω² + 9)/((ω² + 1)(ω² + 4))
R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{iωτ} dω
The mean square value is given by
E[X²(t)] = R_XX(0) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω
= (2/2π) ∫_0^{∞} S_XX(ω) dω [∵ S_XX(ω) is even]
= (1/π) ∫_0^{∞} (ω² + 9)/((ω² + 1)(ω² + 4)) dω
Putting u = ω², resolve into partial fractions:
(u + 9)/((u + 1)(u + 4)) = A/(u + 1) + B/(u + 4) ⇒ u + 9 = A(u + 4) + B(u + 1)
Solving for A and B, we get A = 8/3, B = −5/3.
∴ (ω² + 9)/((ω² + 1)(ω² + 4)) = (8/3)/(ω² + 1) − (5/3)/(ω² + 4)
E[X²(t)] = (1/π)[(8/3) ∫_0^{∞} dω/(ω² + 1) − (5/3) ∫_0^{∞} dω/(ω² + 4)]
= (1/π)[(8/3)[tan⁻¹ω]_0^{∞} − (5/3)·(1/2)[tan⁻¹(ω/2)]_0^{∞}]
= (1/π)[(8/3)(tan⁻¹(∞) − tan⁻¹(0)) − (5/6)(tan⁻¹(∞) − tan⁻¹(0))]
= (1/π)[(8/3)(π/2) − (5/6)(π/2)] = (1/π)(π/2)(8/3 − 5/6)
= (1/2)(11/6) = 11/12.
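A numeric integration of S(ω)/(2π) over a wide symmetric range (not part of the original answer) confirms the mean square value:

```python
# Sketch: midpoint rule on [-1000, 1000]; the O(1/w^2) tail beyond
# contributes well under the tolerance.
import math

def S(w):
    return (w * w + 9) / ((w * w + 1) * (w * w + 4))

W, n = 1000.0, 400_000
dw = 2 * W / n
total = sum(S(-W + (k + 0.5) * dw) for k in range(n)) * dw / (2 * math.pi)

assert abs(total - 11 / 12) < 1e-3
print(round(total, 4))
```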
Statement (Wiener–Khinchine theorem): Let X(t) be a real WSS process with power density spectrum S_XX(ω). Let X_T(t) be the portion of the process X(t) in the time interval −T to T, i.e., X_T(t) = X(t) for −T < t < T and 0 elsewhere, and let X_T(ω) be the Fourier transform of X_T(t). Then
S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}.
The proof is given in full under question 14(b) below.
14(b)(ii) The cross power spectrum of the real random processes X(t) and Y(t) is given by
S_XY(ω) = a + jbω, |ω| < 1
= 0, elsewhere.
Find the cross-correlation function.
Solution:
R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{iωτ} dω = (1/2π) ∫_{−1}^{1} (a + jbω) e^{iωτ} dω
= (1/2π)[a ∫_{−1}^{1} e^{iωτ} dω + jb ∫_{−1}^{1} ω e^{iωτ} dω]
= (1/2π)[a·(2 sin τ)/τ + jb·2j(sin τ/τ² − cos τ/τ)]
= (a/πτ) sin τ + (b/πτ) cos τ − (b/πτ²) sin τ.
15(a)(i) Show that if the input X(t) is a WSS process for a linear system then the output Y(t) is a WSS process.
Sol: Let X(t) be a WSS process input to a linear time-invariant stable system with output Y(t). Then
Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, where h(t) is the weighting function or unit impulse response.
∴ E[Y(t)] = ∫_{−∞}^{∞} E[h(u) X(t − u)] du = ∫_{−∞}^{∞} h(u) E[X(t − u)] du
Since X(t) is WSS, E[X(t − u)] = μ_X
∴ E[Y(t)] = ∫_{−∞}^{∞} h(u) μ_X du = μ_X ∫_{−∞}^{∞} h(u) du
Since the system is stable, ∫_{−∞}^{∞} h(u) du is finite
∴ E[Y(t)] is a constant.
Next,
R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
= E[∫_{−∞}^{∞} h(u₁) X(t − u₁) du₁ ∫_{−∞}^{∞} h(u₂) X(t + τ − u₂) du₂]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u₁)h(u₂) E[X(t − u₁) X(t + τ − u₂)] du₁ du₂
Since X(t) is a WSS process, the autocorrelation function is only a function of the time difference:
∴ R_YY(t, t + τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u₁)h(u₂) R_XX(τ + u₁ − u₂) du₁ du₂,
which is a function of τ only. Hence Y(t) is a WSS process.
15(a)(ii) Let X(t) be a wide-sense stationary process which is the input to a linear time-invariant system with unit impulse response h(t) and output Y(t). Prove that S_YY(ω) = |H(ω)|² S_XX(ω), where H(ω) is the Fourier transform of h(t).
Solution: Let Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, i.e., Y(t) = ∫_{−∞}^{∞} X(t − α) h(α) dα.
∴ X(t + τ)Y(t) = ∫_{−∞}^{∞} X(t + τ) X(t − α) h(α) dα
Hence E[X(t + τ)Y(t)] = ∫_{−∞}^{∞} E{X(t + τ) X(t − α)} h(α) dα
R_XY(τ) = ∫_{−∞}^{∞} R_XX(τ + α) h(α) dα = ∫_{−∞}^{∞} R_XX(τ − β) h(−β) dβ [putting β = −α]
Taking Fourier transforms,
S_XY(ω) = S_XX(ω) H*(ω) ...(3)
Similarly, R_YY(τ) = E{Y(t + τ)Y(t)} = ∫_{−∞}^{∞} R_XY(τ − α) h(α) dα, i.e., R_YY(τ) = R_XY(τ) * h(τ), so that
S_YY(ω) = S_XY(ω) H(ω) = S_XX(ω) H*(ω) H(ω) = |H(ω)|² S_XX(ω).
15(b)(i) For an input-output linear system (X(t), h(t), Y(t)), derive the cross-correlation functions R_XY(τ) and R_YX(τ).
Solution: The cross-correlation between the input X(t) and the output Y(t) satisfies
(i) R_XY(τ) = R_XX(τ) * h(τ)
(ii) R_YX(τ) = R_XX(τ) * h(−τ)
Proof of (i): R_XY(t, t + τ) = E{X(t)·Y(t + τ)} ...(1)
Y(t + τ) = h(t) * X(t + τ) = ∫_{−∞}^{∞} h(ε) X(t + τ − ε) dε ...(2)
Substituting (2) in (1),
R_XY(t, t + τ) = E[X(t) ∫_{−∞}^{∞} h(ε) X(t + τ − ε) dε]
= ∫_{−∞}^{∞} E{X(t) X(t + τ − ε)} h(ε) dε ...(3)
R_XY(τ) = ∫_{−∞}^{∞} R_XX(τ − ε) h(ε) dε, which is the convolution R_XY(τ) = R_XX(τ) * h(τ).
From the above it is clear that the cross-correlation function depends on τ and not on the absolute time t.
15(b)(ii) A white Gaussian noise X(t) with zero mean and spectral density N₀/2 is applied to a low-pass filter. Determine the autocorrelation of the output Y(t).
Solution: Out of syllabus.
(Regulations 2008)
Time: Three hours Maximum: 100 marks
Answer ALL Questions
Part A
1. The moment generating function of a random variable X is given by M(t) = e^{3(e^t − 1)}. What is P[X = 0]?
Solution: M(t) = e^{3(e^t − 1)} = e^{3e^t} e^{−3} = e^{−3}[1 + 3e^t/1! + (3e^t)²/2! + …]
But E[e^{tX}] = M(t), i.e., Σ_{x=0}^{∞} e^{tx} p(x) = M(t):
p(0) + e^t p(1) + e^{2t} p(2) + … = e^{−3}[1 + 3e^t/1! + (3e^t)²/2! + …]
Equating the like terms, p(0) = e^{−3}. Hence P[X = 0] = e^{−3}.
2. An experiment succeeds twice as often as it fails. Find the chance that in the next 4 trials there shall be at least one success.
Solution: p = 2/3, q = 1/3, n = 4.
P(X ≥ 1) = 1 − P(X < 1) = 1 − P(X = 0)
= 1 − 4C₀(2/3)⁰(1/3)⁴ = 1 − 1/81 = 80/81.
3. Find the marginal density functions of X and Y if
f(x, y) = (6/5)(x + y²), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
= 0, otherwise.
The marginal density function of X is
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (6/5)(x + y²) dy = (6/5)[xy + y³/3]_{y=0}^{y=1} = (6/5)(x + 1/3), 0 ≤ x ≤ 1.
The marginal density function of Y is
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (6/5)(x + y²) dx = (6/5)[x²/2 + xy²]_{x=0}^{x=1} = (6/5)(1/2 + y²), 0 ≤ y ≤ 1.
4. Find the acute angle between the two lines of regression.
Solution:
If θ is the angle between the two regression lines, then
tan θ = ((1 − r²)/r)·(σ_x σ_y/(σ_x² + σ_y²)).
6. Prove that the sum of two independent Poisson processes is again a Poisson process.
Solution: Let X₁(t) and X₂(t) be independent Poisson processes with parameters λ₁ and λ₂.
P[X₁(t) + X₂(t) = n] = Σ_{r=0}^{n} P[X₁(t) = r] P[X₂(t) = n − r]
= e^{−λ₁t} e^{−λ₂t} Σ_{r=0}^{n} ((λ₁t)^r/r!)((λ₂t)^{n−r}/(n − r)!)
= e^{−t(λ₁+λ₂)} Σ_{r=0}^{n} nC_r (λ₁t)^r (λ₂t)^{n−r}/n!
= e^{−t(λ₁+λ₂)} ((λ₁ + λ₂)t)^n/n! [by the binomial theorem]
Hence X₁(t) + X₂(t) is a Poisson process with parameter (λ₁ + λ₂)t.
7. Find the variance of the stationary process {X(t)} whose autocorrelation function is given by R_XX(τ) = 2 + 4e^{−2|τ|}.
Solution: Given R_XX(τ) = 2 + 4e^{−2|τ|}
μ_x² = lim_{τ→∞} R(τ) = R(∞) = 2 + 4e^{−∞} = 2 + 0 = 2
∴ Mean = E[X(t)] = μ_x = √2
E[X²(t)] = lim_{τ→0} R(τ) = R(0) = 2 + 4e⁰ = 2 + 4 = 6
∴ Var{X(t)} = 6 − 2 = 4.
Find the system transfer function of a system whose impulse response is h(t) = 1/2c for |t| ≤ c and 0 for |t| > c.
H(ω) = ∫_{−∞}^{∞} h(t) e^{−iωt} dt
= ∫_{−c}^{c} (1/2c) e^{−iωt} dt = (1/2c)[e^{−iωt}/(−iω)]_{−c}^{c}
= (1/2c)·(e^{iωc} − e^{−iωc})/(iω)
= (e^{iωc} − e^{−iωc})/(2iωc) = sin ωc/(ωc).
The power spectral density of a band-limited white noise is
S_NN(ω) = N₀/2 for |ω| ≤ W_B
= 0, elsewhere.
Part B ( 5x16=80marks)
11(a)(i) If the probability density of X is given by
f(x) = 2(1 − x), 0 < x < 1
= 0, otherwise,
find its rth moment. Hence evaluate E[(2X + 1)²].
Solution: μ′_r = E[X^r] = ∫_0^1 x^r·2(1 − x) dx = 2[1/(r + 1) − 1/(r + 2)] = 2/((r + 1)(r + 2)).
Hence E[X] = μ′₁ = 1/3, E[X²] = μ′₂ = 1/6, and
E[(2X + 1)²] = 4E[X²] + 4E[X] + 1 = 4(1/6) + 4(1/3) + 1 = 3.
Given f(θ) = (1/2) e^{−θ/2}, θ > 0:
M(t) = E[e^{tθ}] = (1/2) ∫_0^{∞} e^{tθ} e^{−θ/2} dθ = (1/2) ∫_0^{∞} e^{−(1/2 − t)θ} dθ
= (1/2)[e^{−(1/2 − t)θ}/(−(1/2 − t))]_0^{∞}
= −(1/2)[0 − 1/(1/2 − t)] = (1/2)/(1/2 − t) = 1/(1 − 2t)
= (1 − 2t)^{−1} = 1 + 2t + (2t)² + …
= 1 + 2t + 4t² + …
= 1 + 2(t/1!) + 8(t²/2!) + …
Mean = E[θ] = coefficient of t/1! = 2
E[θ²] = coefficient of t²/2! = 8
Variance = E[θ²] − (E[θ])² = 8 − 4 = 4.
= 1/2 + 1/3 + 1/4 + …
= Σ_{n=1}^{∞} (1/n) − 1, which is a divergent series; hence the expectation does not exist.
Given the joint pdf
f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1,
compute P(Y < 1/2), P(X > 1 | Y < 1/2) and P(X + Y ≤ 1).
Solution: Given f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
The marginal density of Y is f(y) = ∫_0^2 (xy² + x²/8) dx = 2y² + 1/3.
(i) P(Y < 1/2) = ∫_0^{1/2} f(y) dy = ∫_0^{1/2} (2y² + 1/3) dy = [2y³/3 + y/3]_0^{1/2}
= (2/3)(1/8) + (1/2)(1/3) − [0 + 0] = 1/12 + 1/6 = 3/12 = 1/4.
(ii) P(X > 1 | Y < 1/2) = P(X > 1, Y < 1/2)/P(Y < 1/2)
P(X > 1, Y < 1/2) = ∫_1^2 ∫_0^{1/2} (xy² + x²/8) dy dx
= ∫_1^2 [xy³/3 + x²y/8]_0^{1/2} dx = ∫_1^2 (x/24 + x²/16) dx
= [x²/48 + x³/48]_1^2 = (1/48)[(4 + 8) − (1 + 1)] = 10/48 = 5/24
∴ P(X > 1 | Y < 1/2) = (5/24)/(1/4) = 5/6.
(iii) P(X + Y ≤ 1) = ∫_0^1 ∫_0^{1−y} (xy² + x²/8) dx dy = ∫_0^1 [x²y²/2 + x³/24]_{x=0}^{x=1−y} dy
= ∫_0^1 [(1 − y)²y²/2 + (1 − y)³/24] dy
= ∫_0^1 [(y − y²)²/2 + (1 − y)³/24] dy
= ∫_0^1 [(y² + y⁴ − 2y³)/2 + (1 − y)³/24] dy
= [(1/2)(y³/3 + y⁵/5 − y⁴/2) + (1/24)·(1 − y)⁴/(−4)]_0^1
= (1/2)(1/3 + 1/5 − 1/2) − (1/96)(0 − 1)
= (1/2)(1/30) + 1/96 = 1/60 + 1/96 = (8 + 5)/480 = 13/480.
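A brute-force check of the two non-trivial answers (not part of the original paper), using midpoint-rule integration of f(x, y) = xy² + x²/8 over [0, 2] × [0, 1]:

```python
# Sketch: grid-based probability estimates; the diagonal boundary
# x + y = 1 introduces only a tiny discretisation error.
n = 600
f = lambda x, y: x * y * y + x * x / 8

def prob(pred):
    dx, dy = 2 / n, 1 / n
    return sum(f((i + 0.5) * dx, (j + 0.5) * dy)
               for i in range(n) for j in range(n)
               if pred((i + 0.5) * dx, (j + 0.5) * dy)) * dx * dy

p_sum = prob(lambda x, y: x + y <= 1)
p_joint = prob(lambda x, y: x > 1 and y < 0.5)
p_y = prob(lambda x, y: y < 0.5)

assert abs(p_sum - 13 / 480) < 2e-3
assert abs(p_joint / p_y - 5 / 6) < 2e-3
print(round(p_sum, 4), round(p_joint / p_y, 4))
```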
12(a)(ii) If the independent random variables X and Y have variances 36 and 16 respectively, find the correlation coefficient between (X + Y) and (X − Y).
Solution: Since X and Y are independent, E[XY] = E[X]E[Y].
Let U = X + Y, V = X − Y.
Var(U) = Var(V) = Var(X) + Var(Y) = 36 + 16 = 52, so σ_U = σ_V = √52.
Cov(U, V) = E[UV] − E[U]E[V] = Var(X) − Var(Y) = 36 − 16 = 20.
∴ r = Cov(U, V)/(σ_U σ_V) = 20/(√52·√52) = 20/52 = 5/13.
12(b) If X and Y are independent random variables with probability density functions
f_X(x) = 4e^{−4x}, x ≥ 0; f_Y(y) = 2e^{−2y}, y ≥ 0:
(i) find the density function of U = X/(X + Y), V = X + Y;
(ii) are U and V independent?
(iii) what is P(U > 0.5)?
Solution:
(i) f_XY(x, y) = f_X(x) f_Y(y) = 8e^{−4x} e^{−2y}, x ≥ 0, y ≥ 0 [since X and Y are independent].
Solving the equations u = x/(x + y), v = x + y, we get x = uv, y = v − uv.
J = |∂(x, y)/∂(u, v)| = |v  u; −v  1 − u| = v(1 − u) + uv = v ⇒ |J| = v.
The joint pdf of (U, V) is given by
f_UV(u, v) = |J| f_XY(x, y) = 8v e^{−4uv} e^{−2v(1−u)} = 8v e^{−2v(1+u)}.
The range space: 0 ≤ u ≤ 1 and v ≥ 0.
The pdf of U is given by
f_U(u) = ∫_0^{∞} 8v e^{−2v(1+u)} dv = 8/(2(1 + u))² = 2/(1 + u)², 0 ≤ u ≤ 1.
The pdf of V is given by
f_V(v) = ∫_0^1 8v e^{−2v(1+u)} du = 8v e^{−2v}[e^{−2vu}/(−2v)]_0^1 = 4e^{−2v}(1 − e^{−2v}), v ≥ 0.
(ii) f_U(u) f_V(v) = 8e^{−2v}(1 − e^{−2v})/(1 + u)² ≠ f_UV(u, v) ⇒ U and V are not independent.
(iii) P(U > 0.5) = ∫_{0.5}^{1} 2/(1 + u)² du = 2[−1/(1 + u)]_{0.5}^{1} = 2(2/3 − 1/2) = 1/3.
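A Monte Carlo simulation (a check not in the paper) agrees with P(U > 0.5) = 1/3:

```python
# Sketch: sample X ~ Exp(rate 4), Y ~ Exp(rate 2) and count how often
# X/(X+Y) exceeds 0.5; random.expovariate takes the rate parameter.
import random

random.seed(42)
N = 200_000
hits = 0
for _ in range(N):
    x = random.expovariate(4.0)
    y = random.expovariate(2.0)
    if x / (x + y) > 0.5:
        hits += 1

p = hits / N
assert abs(p - 1 / 3) < 0.01
print(round(p, 3))
```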
Mean of the Poisson process:
E[X(t)] = Σ_{n=0}^{∞} n P_n(t) = Σ_{n=0}^{∞} n (λt)^n e^{−λt}/n!
= e^{−λt} Σ_{n=1}^{∞} (λt)^n/(n − 1)!
= e^{−λt} λt Σ_{n=1}^{∞} (λt)^{n−1}/(n − 1)! = e^{−λt} λt e^{λt} = λt.
E[X²(t)] = Σ_{n=0}^{∞} [n(n − 1) + n] (λt)^n e^{−λt}/n!
= Σ_{n=0}^{∞} n(n − 1) (λt)^n e^{−λt}/n! + Σ_{n=0}^{∞} n (λt)^n e^{−λt}/n!
= e^{−λt} Σ_{n=2}^{∞} (λt)^n/(n − 2)! + λt
= e^{−λt}[(λt)²/0! + (λt)³/1! + (λt)⁴/2! + …] + λt
= e^{−λt}(λt)²[1 + λt/1! + (λt)²/2! + …] + λt
= e^{−λt}(λt)² e^{λt} + λt
= (λt)² + λt.
Var[X(t)] = E[X²(t)] − (E[X(t)])² = (λt)² + λt − (λt)² = λt.
Since the mean E[X(t)] = λt and Var[X(t)] = λt are functions of t, the Poisson process is not first-order stationary.
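Summing the Poisson pmf directly (a check not in the paper) reproduces both moments at a sample value λt:

```python
# Sketch: series check of E[X(t)] = λt and Var[X(t)] = λt at λt = 2.5;
# the pmf tail beyond n = 80 is negligible.
import math

lt = 2.5
p = [lt**n * math.exp(-lt) / math.factorial(n) for n in range(80)]
mean = sum(n * pn for n, pn in enumerate(p))
var = sum(n * n * pn for n, pn in enumerate(p)) - mean**2

assert abs(mean - lt) < 1e-9
assert abs(var - lt) < 1e-9
print(mean, var)
```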
13(b)(ii) Prove that the random telegraph signal process Y(t) = α X(t) is a wide-sense stationary process, where α is a random variable which is independent of X(t) and assumes the values −1 and +1 with equal probability, and R_XX(t₁, t₂) = e^{−2λ|t₁−t₂|}.
Solution: Let Y(t) = α X(t) with P(α = 1) = P(α = −1) = 1/2.
By definition, E(α) = (1)(1/2) + (−1)(1/2) = 0 and E(α²) = (1)(1/2) + (1)(1/2) = 1.
To prove Y(t) is a WSS process, we show
(i) E[Y(t)] is a constant: E[Y(t)] = E[α]E[X(t)] = 0, since α is independent of X(t);
(ii) R(t₁, t₂) is a function of t₁ − t₂: R_YY(t₁, t₂) = E[α² X(t₁)X(t₂)] = E[α²] R_XX(t₁, t₂) = e^{−2λ|t₁−t₂|}.
Hence Y(t) is a WSS process.
14(a)(i) If {X(t)} and {Y(t)} are two random processes with autocorrelation functions R_XX(τ) and R_YY(τ), then prove that |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)). Establish any two properties of the autocorrelation function R_XX(τ).
Solution: By the Cauchy–Schwarz inequality, we have
{E[X(t)Y(t + τ)]}² ≤ E[X²(t)] E[Y²(t + τ)]
⇒ [R_XY(τ)]² ≤ R_XX(0) R_YY(0) [E[X²(t)] = R_XX(0) by the A.C.F. property]
⇒ |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)).
14(a)(ii) Find the power spectral density of the random process whose autocorrelation function is
R(τ) = 1 − |τ| for |τ| ≤ 1
= 0, elsewhere.
S(ω) = ∫_{−∞}^{∞} R(τ) e^{−iωτ} dτ
= ∫_{−1}^{1} (1 − |τ|) e^{−iωτ} dτ
= ∫_{−1}^{1} (1 − |τ|)(cos ωτ − i sin ωτ) dτ
= ∫_{−1}^{1} (1 − |τ|) cos ωτ dτ − i ∫_{−1}^{1} (1 − |τ|) sin ωτ dτ
= 2 ∫_0^1 (1 − τ) cos ωτ dτ [even and odd integrands]
= 2[(1 − τ)(sin ωτ/ω) − cos ωτ/ω²]_0^1
= 2[−cos ω/ω² + 1/ω²] = 2(1 − cos ω)/ω².
14(b) State and prove the Wiener–Khinchine theorem, and hence find the power spectral density of a WSS process X(t) which has the autocorrelation R_XX(τ) = A₀[1 − |τ|/T], −T ≤ τ ≤ T.
Statement:
Let X(t) be a real WSS process with power density spectrum S_XX(ω). Let X_T(t) be a portion of the process X(t) in the time interval −T to T, i.e.,
X_T(t) = X(t), −T < t < T
= 0, elsewhere.
Let X_T(ω) be the Fourier transform of X_T(t). Then
S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}.
Proof: Given that X_T(ω) is the Fourier transform of X_T(t),
X_T(ω) = ∫_{−∞}^{∞} X_T(t) e^{−iωt} dt = ∫_{−T}^{T} X(t) e^{−iωt} dt
|X_T(ω)|² = X_T(ω) X_T*(ω) = ∫_{−T}^{T} X(t₁) e^{iωt₁} dt₁ · ∫_{−T}^{T} X(t₂) e^{−iωt₂} dt₂ [X(t) is real]
= ∫_{−T}^{T} ∫_{−T}^{T} X(t₁) X(t₂) e^{−iω(t₂−t₁)} dt₁ dt₂
∴ E[|X_T(ω)|²] = ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t₁, t₂) e^{−iω(t₂−t₁)} dt₁ dt₂
Substitute t₁ = t, t₂ = t + τ:
J = |∂(t₁, t₂)/∂(t, τ)| = |1 0; 1 1| = 1, so dt₁ dt₂ = dt dτ.
The limits of t are −T and T; when t₂ = −T, τ = −T − t and when t₂ = T, τ = T − t.
∴ lim_{T→∞} (1/2T) E[|X_T(ω)|²] = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T−t}^{T−t} R_XX(t, t + τ) e^{−iωτ} dτ dt
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ dt [X(t) is WSS]
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ · ∫_{−T}^{T} dt
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ · 2T
= ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ = S_XX(ω), by definition.
∴ S_XX(ω) = lim_{T→∞} (1/2T) E[|X_T(ω)|²]. Hence proved.
For R_XX(τ) = A₀[1 − |τ|/T], −T ≤ τ ≤ T:
S_XX(ω) = 2A₀ ∫_0^{T} (1 − τ/T) cos ωτ dτ = (2A₀/T)·(1 − cos ωT)/ω² = (4A₀/(Tω²)) sin²(ωT/2).
15(a)(ii) If X(t) is the input voltage to a circuit and Y(t) is the output voltage, {X(t)} is a stationary process with μ_X = 0 and R_XX(τ) = e^{−2|τ|}. Find the mean μ_Y and the power spectrum S_YY(ω) of the output if the system transfer function is given by H(ω) = 1/(ω + 2i).
Solution: Given: (i) μ_X = 0, i.e., E[X(t)] = 0; (ii) R_XX(τ) = e^{−2|τ|}; (iii) H(ω) = 1/(ω + 2i).
(1) μ_Y = μ_X H(0) = 0, since μ_X = 0.
(2) S_YY(ω):
S_YY(ω) = |H(ω)|² S_XX(ω) ...(A)
(iii) ⇒ |H(ω)|² = 1/(ω² + 4)
S_XX(ω) = F[R_XX(τ)] = F[e^{−2|τ|}] = (2)(2)/(ω² + 2²) = 4/(ω² + 4) [by (ii)]
(A) ⇒ S_YY(ω) = (1/(ω² + 4))·(4/(ω² + 4)) = 4/(ω² + 4)².
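The transform F[e^{−2|τ|}] = 4/(ω² + 4) used above can be confirmed numerically (a check not in the paper):

```python
# Sketch: since e^{-2|t|} is even, its Fourier transform reduces to
# 2 ∫_0^∞ e^{-2t} cos(wt) dt, evaluated by a midpoint rule.
import math

def S_num(w, tmax=30.0, n=200_000):
    dt = tmax / n
    return 2 * sum(math.exp(-2 * t) * math.cos(w * t)
                   for t in ((k + 0.5) * dt for k in range(n))) * dt

for w in (0.0, 1.0, 3.0):
    assert abs(S_num(w) - 4 / (w * w + 4)) < 1e-4
print("S_XX(w) = 4/(w^2 + 4) confirmed")
```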
15(b)(i) If Y(t) = A cos(ω₀t + θ) + N(t), where θ is uniform on (0, 2π) and independent of the zero-mean noise N(t), find the autocorrelation of Y(t).
Solution:
Y(t₁)Y(t₂) = {A cos(ω₀t₁ + θ) + N(t₁)}{A cos(ω₀t₂ + θ) + N(t₂)}
= A² cos(ω₀t₁ + θ) cos(ω₀t₂ + θ) + N(t₁)N(t₂) + A cos(ω₀t₁ + θ)N(t₂) + A cos(ω₀t₂ + θ)N(t₁)
R_YY(t₁, t₂) = A² E[cos(ω₀t₁ + θ) cos(ω₀t₂ + θ)] + R_NN(t₁, t₂) + A E[cos(ω₀t₁ + θ)] E[N(t₂)] + A E[cos(ω₀t₂ + θ)] E[N(t₁)]
= (A²/2) cos ω₀(t₁ − t₂) + R_NN(t₁, t₂), since E[cos(ω₀t + θ)] = 0 and E[N(t)] = 0.
15(b)(ii) A system has an impulse response h(t) = e^{−βt} U(t). Find the power spectral density of the output Y(t) corresponding to the input X(t).
h(t) = e^{−βt} U(t) ⇒ H(ω) = F[h(t)] = F[e^{−βt} U(t)] = 1/(β + iω)
|H(ω)|² = 1/(β² + ω²)
We know that S_YY(ω) = |H(ω)|² S_XX(ω)
⇒ S_YY(ω) = S_XX(ω)/(β² + ω²).
(Regulations 2008)
Time: Three hours Maximum: 100 marks
PART A
1. If P[X = n] = C(2/3)ⁿ, n = 1, 2, …, find the value of C.
Solution:
Σ_{n=1}^{∞} P[X = n] = 1 ⇒ Σ_{n=1}^{∞} C(2/3)ⁿ = 1
C[(2/3) + (2/3)² + …] = 1
C(2/3)[1 − 2/3]^{−1} = 1 [since 1 + x + x² + … = (1 − x)^{−1}]
2C = 1 ⇒ C = 1/2.
2. The probability that a man shooting a target is 1/4. How many times must he fire so that the probability of hitting the target at least once is more than 2/3?
Solution: Here p = 1/4 and q = 1 − p = 3/4. To find n such that
P(X ≥ 1) > 2/3
1 − P(X < 1) > 2/3 ⇒ 1 − P(X = 0) > 2/3
⇒ 1 − qⁿ > 2/3 ⇒ (3/4)ⁿ < 1/3.
Since (3/4)³ = 27/64 > 1/3 and (3/4)⁴ = 81/256 < 1/3, he must fire at least n = 4 times.
3. Let X and Y be two discrete random variables with joint probability mass function
P[X = x, Y = y] = (1/18)(2x + y), x = 1, 2 and y = 1, 2
= 0, otherwise.
Find the marginal probability mass functions of X and Y.
Solution:
P_X(x) = Σ_y P_xy(x, y) = (1/18) Σ_{y=1}^{2} (2x + y) = (1/18)(4x + 3), x = 1, 2
P_Y(y) = Σ_x P_xy(x, y) = (1/18) Σ_{x=1}^{2} (2x + y) = (1/18)(2y + 6), y = 1, 2.
A random process {X(t)} is called wide-sense stationary if its mean is constant and its autocorrelation function depends only on the time difference, i.e.,
(i) E(X(t)) is always a constant,
(ii) E(X(t)X(t + τ)) = R_xx(τ).
6. If {X(t)} is a normal process with μ(t) = 10 and C(t₁, t₂) = 16e^{−|t₁−t₂|}, find the variance of X(10) − X(6).
Var[X(10) − X(6)] = C(10,10) + C(6,6) − 2C(10,6) = 16 + 16 − 32e^{−4} = 31.4139
σ_u = √31.4139 = 5.6048.
7. The autocorrelation function of a stationary random process is R(τ) = 16 + 9/(1 + 6τ²). Find the mean and variance of the process.
Solution:
Given R(τ) = 16 + 9/(1 + 6τ²)
[E(X(t))]² = lim_{τ→∞} R(τ) = lim_{τ→∞} [16 + 9/(1 + 6τ²)] = 16
∴ E{X(t)} = 4
E{X²(t)} = R(0) = 16 + 9 = 25
Variance σ² = E{X²(t)} − [E{X(t)}]² = 25 − 16 = 9.
8. Prove that S_xy(ω) = S_yx(−ω).
Solution:
S_xy(ω) = ∫_{−∞}^{∞} R_xy(τ) e^{−iωτ} dτ = ∫_{−∞}^{∞} R_yx(−τ) e^{−iωτ} dτ [since R_xy(τ) = R_yx(−τ)]
Put τ₁ = −τ, dτ₁ = −dτ; when τ = ∞, τ₁ = −∞ and when τ = −∞, τ₁ = ∞.
S_xy(ω) = ∫_{∞}^{−∞} R_yx(τ₁) e^{iωτ₁} (−dτ₁) = ∫_{−∞}^{∞} R_yx(τ₁) e^{iωτ₁} dτ₁ = S_yx(−ω).
Solution:
A system y(t) = f[X(t)] is said to be time-invariant if y(t + h) = f[X(t + h)], i.e., a time shift of the input produces the same time shift in the output. For a linear system, f[a₁X₁(t) + a₂X₂(t)] = a₁f[X₁(t)] + a₂f[X₂(t)].
Solution:
If a system is of the form y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then the system weighting function h(t) is also called the unit impulse response of the system. It is called so because Y(t) = h(t) when X(t) = the unit impulse function δ(t).
PART B (5 × 16 = 80 marks)
11(a)(i) A random variable X has the following probability distribution:
X:      0    1     2     3     4     5      6      7
p(x):   0   1/10  2/10  2/10  3/10  1/100  2/100  17/100
Find (i) P(X ≥ 6) and P(0 < X < 5), (ii) the distribution function of X, (iii) the least value of n such that P(X ≤ n) > 1/2.
Solution:
P(X < 6) = 0 + 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100
P(X ≥ 6) = 1 − P(X < 6) = 1 − 81/100 = 19/100
P(0 < X < 5) = P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)
= 1/10 + 2/10 + 2/10 + 3/10
= 8/10 = 4/5.
(iii) P(X ≤ 3) = 1/2, which is not greater than 1/2, while
P(X ≤ 4) = 8/10 > 1/2.
Therefore n = 4.
(ii) The distribution function F_X(x) of X is given by
X:                 0    1     2     3     4     5       6       7
F_X(x) = P(X ≤ x): 0   1/10  3/10  5/10  8/10  81/100  83/100  100/100 = 1
11(a)(ii) Find the M.G.F. of the random variable X having the probability density function
f(x) = (x/4) e^{−x/2}, x > 0
= 0, elsewhere.
Also deduce the first four moments about the origin.
Solution: Given f(x) = (x/4) e^{−x/2}, x > 0; = 0, elsewhere.
M_X(t) = E[e^{tX}] = ∫_0^{∞} e^{tx} f(x) dx = ∫_0^{∞} e^{tx} (x/4) e^{−x/2} dx = (1/4) ∫_0^{∞} x e^{−(1/2 − t)x} dx
= (1/4)[−x e^{−(1/2 − t)x}/(1/2 − t) − e^{−(1/2 − t)x}/(1/2 − t)²]_0^{∞} ...(1)
= (1/4)[(0 − 0) − (0 − 1/(1/2 − t)²)]
= (1/4)·1/(1/2 − t)² = 1/(1 − 2t)² = (1 − 2t)^{−2}
= 1 + 2(2t) + 3(2t)² + 4(2t)³ + 5(2t)⁴ + …
= 1 + 4t + 12t² + 32t³ + 80t⁴ + …
= 1 + 4(t/1!) + 24(t²/2!) + 192(t³/3!) + 1920(t⁴/4!) + …
∴ μ′₁ = coefficient of t/1! = 4
μ′₂ = coefficient of t²/2! = 24
μ′₃ = coefficient of t³/3! = 192
μ′₄ = coefficient of t⁴/4! = 1920.
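The first two raw moments can be confirmed by numeric integration (a check not in the paper) of f(x) = (x/4)e^{−x/2}, which is a Gamma density with shape 2 and scale 2:

```python
# Sketch: midpoint-rule integration on [0, 100]; the tail beyond is
# negligible against the asserted tolerances.
import math

dx = 0.001
xs = [(k + 0.5) * dx for k in range(100_000)]
w = [x / 4 * math.exp(-x / 2) for x in xs]

m0 = sum(w) * dx
m1 = sum(x * wx for x, wx in zip(xs, w)) * dx
m2 = sum(x * x * wx for x, wx in zip(xs, w)) * dx

assert abs(m0 - 1) < 1e-6
assert abs(m1 - 4) < 1e-4
assert abs(m2 - 24) < 1e-3
print(m1, m2)
```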
11(b)(i) Given that X is distributed normally, if P[X < 45] = 0.31 and P[X > 64] = 0.08, find the mean and standard deviation of the distribution.
Solution:
From the normal tables, P[X < 45] = 0.31 corresponds to z = −0.5 and P[X > 64] = 0.08 corresponds to z = 1.4.
∴ (45 − μ)/σ = −0.5 ⇒ 45 − μ = −0.5σ ...(1)
∴ (64 − μ)/σ = 1.4 ⇒ 64 − μ = 1.4σ ...(2)
(2) − (1): 19 = 1.9σ ⇒ σ = 10, and from (1), μ = 45 + 0.5(10) = 50.
(ii) The time in hours required to repair a machine is exponentially distributed with parameter λ = 1/2.
What is the conditional probability that a repair takes at least 10 hours, given that its duration exceeds 9 hours?
Solution:
Given X is exponentially distributed with λ = 1/2,
f(x) = (1/2) e^{−x/2}, x > 0.
To find the probability that the repair time takes at least 10 hours given that it exceeds 9 hours, use the memoryless property:
P(X ≥ 10 | X > 9) = P(X > 1) = ∫_1^{∞} (1/2) e^{−x/2} dx = [−e^{−x/2}]_1^{∞} = −(0 − e^{−1/2}) = e^{−1/2} = 0.6065.
12(a)(i) The joint p.d.f of the random variable (X, Y) is given by f(x, y) = kxy e^{−(x²+y²)}, x > 0, y > 0. Find the value of k and also prove that X and Y are independent.
Solution: Here the range space is the entire first quadrant of the XY-plane.
∫_0^{∞} ∫_0^{∞} kxy e^{−(x²+y²)} dy dx = 1 ...(1)
k[∫_0^{∞} x e^{−x²} dx][∫_0^{∞} y e^{−y²} dy] = 1
Put x² = t, 2x dx = dt, i.e., x dx = dt/2; when x → 0, t → 0 and as x → ∞, t → ∞.
Put y² = v, 2y dy = dv, i.e., y dy = dv/2; as y → 0, v → 0 and as y → ∞, v → ∞.
(1) ⇒ k[∫_0^{∞} e^{−t} dt/2][∫_0^{∞} e^{−v} dv/2] = 1
k[−e^{−t}/2]_0^{∞} [−e^{−v}/2]_0^{∞} = 1
k[0 − (−1/2)][0 − (−1/2)] = 1
k(1/2)(1/2) = 1 ⇒ k/4 = 1 ⇒ k = 4.
The marginal density of X is given by
f_X(x) = ∫_0^{∞} f(x, y) dy = ∫_0^{∞} 4xy e^{−(x²+y²)} dy
= ∫_0^{∞} 4xy e^{−x²} e^{−y²} dy
= 4x e^{−x²} ∫_0^{∞} y e^{−y²} dy
= 4x e^{−x²}(1/2)
= 2x e^{−x²}, x > 0.
The marginal density of Y is given by
f_Y(y) = ∫_0^{∞} f(x, y) dx = ∫_0^{∞} 4xy e^{−(x²+y²)} dx
= 4y e^{−y²} ∫_0^{∞} x e^{−x²} dx
= 4y e^{−y²}(1/2)
= 2y e^{−y²}, y > 0.
If f_X(x) f_Y(y) = f(x, y), then X and Y are independent:
f_X(x) f_Y(y) = (2x e^{−x²})(2y e^{−y²}) = 4xy e^{−(x²+y²)} = f(x, y). Hence X and Y are independent.
(ii) If X and Y are uncorrelated random variables with variances 16 and 9, find the correlation coefficient between X + Y and X − Y.
Solution:
Let U = X + Y and V = X − Y; then E(U) = E(X) + E(Y) and E(V) = E(X) − E(Y).
∴ Cov(U, V) = σ_x² − σ_y²
If r is the correlation coefficient between U and V, then r_uv = (σ_x² − σ_y²)/(σ_x² + σ_y²).
Given σ_x² = 16 and σ_y² = 9, ∴ r = (16 − 9)/(16 + 9) = 7/25.
12(b)(i) If the p.d.f of a two-dimensional random variable (X, Y) is given by f(x, y) = x + y, 0 ≤ x, y ≤ 1, find the p.d.f of U = XY.
Solution: Let u = xy, v = y. Hence x = u/v and y = v.
J = |∂(x, y)/∂(u, v)| = |1/v  −u/v²; 0  1| = 1/v.
Since 0 ≤ y ≤ 1 ⇒ 0 ≤ v ≤ 1, and 0 ≤ x ≤ 1 ⇒ 0 ≤ u ≤ v, so
f_UV(u, v) = |J| f(x, y) = (1/v)(u/v + v), 0 ≤ u ≤ v ≤ 1.
For the density of U, v varies from v = u to v = 1:
f_U(u) = ∫_u^1 (u/v² + 1) dv = [−u/v + v]_u^1 = (1 − u) − (−1 + u) = 2(1 − u), 0 ≤ u ≤ 1.
= R(τ). Therefore {X(t)} is correlation ergodic.
Find the probability that the inter-arrival time is (a) more than 1 minute, (b) between 1 minute and 2 minutes, and (c) 4 minutes or less.
Solution: Let T be the random variable denoting the inter-arrival time.
By Property (3),
f_T(t) = λe^(−λt) and P(T > t) = e^(−λt)
Here λ = 2/minute, therefore f_T(t) = 2e^(−2t)
(a) P(T > 1) = ∫₁^∞ 2e^(−2t) dt = e^(−2) = 0.1353
(b) P(1 ≤ T ≤ 2) = ∫₁² 2e^(−2t) dt = e^(−2) − e^(−4) = 0.1170
(c) P(T ≤ 4) = ∫₀⁴ 2e^(−2t) dt = 1 − e^(−8) = 0.99966
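The three exponential tail probabilities above can be evaluated directly from the closed forms:

```python
import math

# Exponential inter-arrival probabilities for rate λ = 2 per minute.
lam = 2.0
p_a = math.exp(-lam * 1)                       # P(T > 1) = e^{-2}
p_b = math.exp(-lam * 1) - math.exp(-lam * 2)  # P(1 <= T <= 2)
p_c = 1 - math.exp(-lam * 4)                   # P(T <= 4) = 1 - e^{-8}
print(p_a, p_b, p_c)
```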
−0.2 τ
13(b)(ii) Suppose that X(t) is a Gaussian process with µ x = 2, R XX (τ ) = 5e
Find the probability that X (4) ≤ 1.
Solution: Out of syllabus.
14(a)(i) A stationary random process X(t) with mean 2 has the autocorrelation
R_XX(τ) = 4 + e^(−|τ|/10). Find the mean and variance of Y = ∫₀¹ X(t) dt.
Solution:
E[Y] = E[∫₀¹ X(t) dt] = ∫₀¹ E[X(t)] dt = 2 ∫₀¹ dt = 2
Solution:
The power spectral density is
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ
= ∫_{−∞}^{∞} (A²/2) cos(ω₀τ) e^(−iωτ) dτ
= (A²/2) ∫_{−∞}^{∞} cos(ω₀τ)(cos ωτ − i sin ωτ) dτ
= (A²/4) [∫_{−∞}^{∞} [cos(ω−ω₀)τ + cos(ω+ω₀)τ] dτ − i ∫_{−∞}^{∞} [sin(ω+ω₀)τ + sin(ω−ω₀)τ] dτ]
{using 2 cos A cos B = cos(A−B) + cos(A+B) and 2 sin A cos B = sin(A+B) + sin(A−B)}
The sine integrals vanish (odd integrands), so
S_XX(ω) = (A²/4) ∫_{−∞}^{∞} e^(−i(ω+ω₀)τ) dτ + (A²/4) ∫_{−∞}^{∞} e^(−i(ω−ω₀)τ) dτ   ...(**)
By the definition of the Dirac delta function,
S(ω) = (1/2π) ∫_{−∞}^{∞} e^(−iωτ) dτ such that ∫_{−∞}^{∞} S(ω) dω = 1,
i.e., ∫_{−∞}^{∞} e^(−i(ω+ω₀)τ) dτ = 2π δ(ω + ω₀)
∴ (**) becomes
S_XX(ω) = (πA²/2)[δ(ω + ω₀) + δ(ω − ω₀)]
14(b)(i) The cross correlation function of two processes X(t) and Y(t) is given by
AB
R XY (t , t + τ ) = {sin(ω 0τ ) + cos ω 0 (2t + τ )} where A,B and ω 0 are constants. Find
2
the cross power spectrum S XY (ω ).
Solution:
Time average = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t+τ) dt
= lim_{T→∞} (1/2T) ∫_{−T}^{T} (AB/2){sin(ω₀τ) + cos ω₀(2t + τ)} dt
= (AB/2) [lim_{T→∞} (1/2T) ∫_{−T}^{T} sin(ω₀τ) dt + lim_{T→∞} (1/2T) ∫_{−T}^{T} cos ω₀(2t + τ) dt]
= (AB/2) sin(ω₀τ) + lim_{T→∞} (1/4ω₀T) [sin ω₀(2t + τ)]_{−T}^{T}
= (AB/2) sin(ω₀τ), since the second limit vanishes.
Hence
S_XY(ω) = ∫_{−∞}^{∞} (AB/2) sin(ω₀τ) e^(−iωτ) dτ
= (AB/2) ∫_{−∞}^{∞} sin(ω₀τ)[cos ωτ − i sin ωτ] dτ
= (AB/2) [∫_{−∞}^{∞} sin(ω₀τ) cos ωτ dτ − i ∫_{−∞}^{∞} sin(ω₀τ) sin ωτ dτ]
= (AB/4) ∫_{−∞}^{∞} [sin(ω₀+ω)τ + sin(ω₀−ω)τ − i cos(ω−ω₀)τ + i cos(ω+ω₀)τ] dτ
= −(iAB/4) ∫_{−∞}^{∞} [e^(−i(ω−ω₀)τ) − e^(−i(ω+ω₀)τ)] dτ
= −(iAB/4) [2πδ(ω−ω₀) − 2πδ(ω+ω₀)]
Hence S_XY(ω) = −(iπAB/2)[δ(ω−ω₀) − δ(ω+ω₀)].
14(b)(ii) Let X(t) and Y(t) be both zero-mean and WSS random processes.Consider
the random process Z(t) defined by Z(t)=X(t)+Y(t). Find
A. The autocorrelation function and the power spectrum of Z(t) if X(t) and Y(t) are
jointly WSS.
B. The power spectrum of Z(t) if X(t) and Y(t) are orthogonal.
Solution:
The autocorrelation function and the power spectrum of Z(t) if X(t) and Y(t)
are jointly WSS.
Autocorrelation function of Z(t) is given by
RZZ (t1 , t 2 ) = E[ Z (t1 ) Z (t 2 )]
= E{( X (t1 ) + Y (t1 )}{ X (t 2 ) + Y (t 2 )}
RZZ (t1 , t 2 ) = R XX (t1 , t 2 ) + R XY (t1 , t 2 ) + RYX (t1 , t 2 ) + RYY (t1 , t 2 )
If X(t) and Y(t) are jointly WSS, then
R_ZZ(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ), where τ = t₂ − t₁
Taking Fourier transform on both sides, we obtain
S ZZ (ω ) = S XX (ω ) + S XY (ω ) + S YX (ω ) + S YY (ω )
If X(t) and Y(t) are orthogonal, R_XY(τ) = R_YX(τ) = 0.
Then RZZ (τ ) = R XX (τ ) + RYY (τ ) and the Fourier transform of this result gives,
S ZZ (ω ) = S XX (ω ) + S YY (ω )
15(a)(i) Consider a system with transfer function 1/(1 + jω). An input signal with
autocorrelation function mδ(τ) + m² is fed as input to the system. Find the mean-square value of the output.
Solution:
H(ω) = 1/(1 + iω) and R_XX(τ) = mδ(τ) + m²
S_XX(ω) = m + 2πm²δ(ω)
S_YY(ω) = |H(ω)|² S_XX(ω) = {m + 2πm²δ(ω)}/(1 + ω²)
R_YY(τ) is the inverse Fourier transform of S_YY(ω), so R_YY(τ) = (m/2) e^(−|τ|) + m²
Mean of the output: R_YY(∞) = m², hence the output mean is m.
Mean-square value of the output = E[Y²] = R_YY(0) = m/2 + m²
15(a)(ii) A stationary random process X(t) having the autocorrelation function
R_XX(τ) = Aδ(τ)
is applied to a linear system at time t = 0, where δ(τ) represents the impulse
function. The linear system has the impulse response h(t) = e^(−bt) u(t), where u(t)
represents the unit step function. Find R_YY(τ). Also find the mean and variance of
Y(t).
Solution:
R_XX(τ) = Aδ(τ)
R_YY(τ) = ∫_{−∞}^{∞} R_XX(t − τ) h(t) dt
= ∫_{−∞}^{∞} Aδ(t − τ) e^(−bt) u(t) dt
= A ∫_{−∞}^{∞} δ(t − τ) e^(−bt) u(t) dt
= A e^(−bτ)
Mean: Ȳ² = R_YY(∞) = 0
Mean square: Ȳ² = R_YY(0) = A ⇒ Variance(Y(t)) = A
15(b)(i) If {X(t)} is a WSS process and Y(t) = ∫_{−∞}^{∞} h(ξ) X(t − ξ) dξ, show that
R_XY(τ) is the convolution of R_XX(τ) with h(τ):
R_XY(τ) = R_XX(τ) * h(τ)
Taking Fourier transforms, S_XY(ω) = S_XX(ω) H*(ω).
15(b)(ii) If {N(t)} is a band-limited white noise centered at a carrier frequency ω₀
such that
S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B, and 0 elsewhere,
find the autocorrelation of {N(t)}.
Solution:
S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B, and 0 elsewhere.
R_NN(τ) = (1/2π) ∫_{−∞}^{∞} S_NN(ω) e^(iωτ) dω
= (1/2π)(N₀/2) ∫_{ω₀−ω_B}^{ω₀+ω_B} e^(iωτ) dω
= (N₀/4π) · (e^(i(ω₀+ω_B)τ) − e^(i(ω₀−ω_B)τ))/(iτ)
= (N₀ω_B/2π) (sin ω_Bτ/ω_Bτ) (cos ω₀τ + i sin ω₀τ)
Since R_NN(τ) is a real function,
R_NN(τ) = (N₀ω_B/2π) (sin ω_Bτ/ω_Bτ) cos ω₀τ
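The closed form above can be verified by numerically integrating the band-limited spectrum and keeping the real part, exactly as in the derivation. N₀, ω₀, ω_B below are arbitrary illustrative values:

```python
import math

# Numeric inverse transform of S_NN over the band (ω0-ωB, ω0+ωB), real part,
# compared against (N0 ωB / 2π)·sinc(ωB τ)·cos(ω0 τ). Values are illustrative.
N0, w0, wB = 2.0, 10.0, 1.0

def R_numeric(tau, steps=20000):
    total, dw = 0.0, 2 * wB / steps
    for i in range(steps):
        w = (w0 - wB) + (i + 0.5) * dw
        total += (N0 / 2) * math.cos(w * tau) * dw   # real part of e^{iωτ}
    return total / (2 * math.pi)

def R_closed(tau):
    return (N0 * wB / (2 * math.pi)) * (math.sin(wB * tau) / (wB * tau)) * math.cos(w0 * tau)

tau = 0.3
print(R_numeric(tau), R_closed(tau))
```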
(Regulations 2008)
Time: Three hours Maximum: 100 marks
Part A
Solution: A real variable X whose value is determined by the outcome of a random experiment
is called a random variable.
Solution: A random variable X is said to have a geometric distribution with parameter p if the
probability mass function is given by
3. f(x, y) = kxy e^(−(x²+y²)), x > 0, y > 0. Find the value of k.
∫∫_R f(x, y) dx dy = 1 ⇒ k ∫₀^∞ ∫₀^∞ xy e^(−(x²+y²)) dx dy = 1
⇒ k (∫₀^∞ x e^(−x²) dx)(∫₀^∞ y e^(−y²) dy) = 1
Put x² = t ⇒ 2x dx = dt; when x = 0, t = 0 and when x = ∞, t = ∞.
Each integral equals 1/2, so k(1/2)(1/2) = 1, i.e., k = 4.
4. Given the RV X with density function
f(x) = 2x, 0 < x < 1; 0, elsewhere.
Find the pdf of Y = 8X³.
Solution: A random process is a collection of random variables {X(s,t)} that are functions of a
real variable t (time), where s ∈ S (the sample space), t ∈ T, and X(s,t) is a real-valued function.
Solution: If {X(t)} is a stationary process with the autocorrelation function R(τ), then the Fourier
transform of R(τ) is called the power spectral density function of {X(t)} and is given by
S(ω) = ∫_{−∞}^{∞} R(τ) e^(−iωτ) dτ
Solution: Let X(t) be a real WSS process with power density spectrum S_XX(ω). Let X_T(t) be a
portion of the process X(t) in the time interval −T to T,
i.e., X_T(t) = X(t) for −T < t < T, and 0 elsewhere.
Solution: A sample function X(t) of a WSS noise random process {X(t)} is called white noise if
the power spectral density of {X(t)} is a constant at all frequencies. We denote the power
spectral density of white noise w(t) as S_w(f) = N₀/2.
PART B
Solution:
P(X = x) = lim_{n→∞} nCx p^x (1 − p)^(n−x) with np = λ
= (λ^x/x!) (1 − 1/n)(1 − 2/n) … (1 − (x−1)/n) (1 − λ/n)^n (1 − λ/n)^(−x)
→ e^(−λ) λ^x / x! as n → ∞.
Solution:
M_X(t) = (1/Γ(λ)) · Γ(λ)/(1 − t)^λ = 1/(1 − t)^λ
μ'₁ = E[X] = d/dt M_X(t)|_{t=0} = d/dt (1 − t)^(−λ)|_{t=0}
= −λ(1 − t)^(−λ−1)(−1)|_{t=0}
= λ
Mean = λ
μ'₂ = E[X²] = d²/dt² M_X(t)|_{t=0} = λ(λ+1)(1 − t)^(−λ−2)|_{t=0} = λ(λ+1)
Variance(X) = E[X²] − [E(X)]² = λ(λ+1) − λ² = λ² + λ − λ² = λ
(OR)
(b)(i) Suppose that a customer arrive at a bank according to poisson process with a mean rate of
3 per minute.Find the probability that during a time interval of 2 minutes.
Solution:
λ = 3/minute, t = 2
P(X(t) = n) = e^(−λt)(λt)^n / n!
(1) P(X(2) = 4) = e^(−6) 6⁴/4! = 0.1338
(2) P(X(2) > 4) = 1 − P(X(2) ≤ 4) = 1 − e^(−6)(1 + 6 + 18 + 36 + 54) = 1 − 0.2851 = 0.715
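The two Poisson probabilities can be reproduced directly from the pmf with λt = 6:

```python
import math

# Poisson counts for arrival rate λ = 3/min over t = 2 minutes (λt = 6).
lt = 6.0

def pois(n):
    return math.exp(-lt) * lt**n / math.factorial(n)

p4 = pois(4)                                # P(X(2) = 4)
p_gt4 = 1 - sum(pois(n) for n in range(5))  # P(X(2) > 4)
print(p4, p_gt4)
```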
(ii) If X and Y are independent RVs each normally distributed with mean zero and variance σ 2 .
Find the pdf of R = √(X² + Y²) and φ = tan⁻¹(Y/X).
Given R = √(X² + Y²) and φ = tan⁻¹(Y/X)
⇒ x = r cos θ, y = r sin θ
Range space: −∞ < x < ∞, −∞ < y < ∞ ⇒ 0 ≤ R < ∞, 0 ≤ θ ≤ 2π
J = |∂x/∂R  ∂x/∂θ; ∂y/∂R  ∂y/∂θ| = |cos θ  −R sin θ; sin θ  R cos θ| = R
Now the joint pdf of R and θ is
(ii) If X and Y are independent RV’s with pdf’s e − x ; x ≥ 0 & e − y ; y ≥ 0, respectively.Find the
X
pdf’s of U = & V = X + Y . Are U and V independent?
X +Y
Solution:
Given f ( x) = e − x ; x ≥ 0 & f ( y ) = e − y ; y ≥ 0,
X
Given U = &V = X + Y.
X +Y
x = u(x + y), v = x + y
Hence x = uv, y = v(1 − u)
J = |∂x/∂u  ∂x/∂v; ∂y/∂u  ∂y/∂v| = |v  u; −v  1−u| = v(1 − u) + uv = v
g(u, v) = f(x, y)|J| = e^(−(x+y)) v = v e^(−(uv + v(1−u))) = v e^(−v)
Range space: 0 ≤ u ≤ 1, v ≥ 0.
The pdf of U is
g_U(u) = ∫₀^∞ g(u, v) dv = ∫₀^∞ v e^(−v) dv
= [−v e^(−v) − e^(−v)]₀^∞
= −[0 − 1] = 1
g(u) = 1, 0 ≤ u ≤ 1
The pdf of V is
g_V(v) = ∫₀¹ g(u, v) du = ∫₀¹ v e^(−v) du
= v e^(−v) [u]₀¹
= v e^(−v), v ≥ 0
Since g(u, v) = g(u)·g(v), U and V are independent.
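A simulation sketch backs this up: with X, Y independent Exp(1), U = X/(X+Y) should behave like Uniform(0,1), V = X+Y like Gamma(2,1), and the two should be uncorrelated (the sample size and seed below are arbitrary choices):

```python
import random

# Monte-Carlo check: U ~ Uniform(0,1) (mean 1/2), V ~ Gamma(2,1) (mean 2),
# and Cov(U, V) ~ 0 for independent Exp(1) inputs.
random.seed(7)
n = 100_000
us, vs = [], []
for _ in range(n):
    x, y = random.expovariate(1.0), random.expovariate(1.0)
    us.append(x / (x + y))
    vs.append(x + y)

mean_u = sum(us) / n
mean_v = sum(vs) / n
cov_uv = sum((u - mean_u) * (v - mean_v) for u, v in zip(us, vs)) / n
print(mean_u, mean_v, cov_uv)
```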
(OR)
(b) The joint probability mass function of (X,Y) is given by p( x, y ) =k (2 x + 3 y ), x =0,1, 2; y =1, 2,3.
Find all the marginal and conditional probability distributions. Also find the probability
distribution of (X+Y)
Solution:
x\y      1      2      3      P(x)
0        3k     6k     9k     18k
1        5k     8k     11k    24k
2        7k     10k    13k    30k
P(y)     15k    24k    33k    72k
Since Σ p(x, y) = 1, 72k = 1 ⇒ k = 1/72. Hence the joint probability distribution is
x\y      1      2      3      P(x)
0        3/72   6/72   9/72   18/72
1        5/72   8/72   11/72  24/72
2        7/72   10/72  13/72  30/72
P(y)     15/72  24/72  33/72  1
The marginal distribution of Y:
Y = y:     1      2      3
P(Y = y):  15/72  24/72  33/72
P(Y=1|X=0) = P(Y=1, X=0)/P(X=0) = (3/72)/(18/72) = 1/6
P(Y=2|X=0) = (6/72)/(18/72) = 1/3
P(Y=3|X=0) = (9/72)/(18/72) = 1/2
P(Y=1|X=1) = (5/72)/(24/72) = 5/24
P(Y=2|X=1) = (8/72)/(24/72) = 1/3
P(Y=3|X=1) = (11/72)/(24/72) = 11/24
P(Y=1|X=2) = (7/72)/(30/72) = 7/30
P(Y=2|X=2) = (10/72)/(30/72) = 1/3
P(Y=3|X=2) = (13/72)/(30/72) = 13/30
Prepared by Department of Mathematics, Agni College of Technology, Chennai - 130
Agni college of Technology
Chennai – 130
The conditional distribution of X given Y = 1:
X:             0                        1                        2
P(X=x|Y=1):   (3/72)/(15/72) = 1/5     (5/72)/(15/72) = 1/3     (7/72)/(15/72) = 7/15
The conditional distribution of X given Y = 2:
X:             0      1      2
P(X=x|Y=2):   1/4    1/3    5/12
The conditional distribution of X given Y = 3:
X:             0      1      2
P(X=x|Y=3):   9/33   1/3    13/33
The conditional distribution of Y given X = 0:
Y:             1                        2                        3
P(Y=y|X=0):   (3/72)/(18/72) = 1/6     (6/72)/(18/72) = 1/3     (9/72)/(18/72) = 1/2
The conditional distribution of Y given X = 1:
Y:             1      2      3
P(Y=y|X=1):   5/24   1/3    11/24
The conditional distribution of Y given X = 2:
Y:             1      2      3
P(Y=y|X=2):   7/30   1/3    13/30
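All of these tables can be regenerated programmatically from the pmf p(x, y) = k(2x + 3y). The sketch below uses exact rational arithmetic to re-derive k, the marginals, and two of the conditional probabilities shown above:

```python
from fractions import Fraction

# Rebuild the joint pmf p(x,y) = k(2x+3y), x = 0,1,2; y = 1,2,3 and check
# k = 1/72 plus a couple of the conditional probabilities.
raw = {(x, y): 2 * x + 3 * y for x in (0, 1, 2) for y in (1, 2, 3)}
k = Fraction(1, sum(raw.values()))
p = {xy: k * w for xy, w in raw.items()}

px = {x: sum(p[(x, y)] for y in (1, 2, 3)) for x in (0, 1, 2)}
py = {y: sum(p[(x, y)] for x in (0, 1, 2)) for y in (1, 2, 3)}

cond_y1_x0 = p[(0, 1)] / px[0]   # P(Y=1 | X=0)
cond_x2_y3 = p[(2, 3)] / py[3]   # P(X=2 | Y=3)
print(k, cond_y1_x0, cond_x2_y3)
```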
13(a)(i) If the two random variables A and B are uncorrelated with zero mean and
E(A²) = E(B²) = σ², show that X(t) = A cos ωt + B sin ωt is wide sense stationary.
Solution: Mean: E{X(t)} = E(A cos ωt + B sin ωt) = cos ωt E(A) + sin ωt E(B)
= cos ωt (0) + sin ωt (0) = 0
Autocorrelation:
R_XX(t₁, t₂) = E(X(t₁) X(t₂))
= E[(A cos ωt₁ + B sin ωt₁)(A cos ωt₂ + B sin ωt₂)]
= E(A² cos ωt₁ cos ωt₂ + AB cos ωt₁ sin ωt₂ + BA sin ωt₁ cos ωt₂ + B² sin ωt₁ sin ωt₂)
= σ²(cos ωt₁ cos ωt₂ + sin ωt₁ sin ωt₂)   since E(A²) = E(B²) = σ² and E(AB) = 0
= σ² cos ω(t₁ − t₂)
13(a)(ii)
If{𝑋𝑋(𝑡𝑡)} is Gaussian process with 𝜇𝜇(𝑡𝑡) = 10 and C(𝑡𝑡1 , 𝑡𝑡2 ) = 16𝑒𝑒 − |𝑡𝑡 1 −𝑡𝑡 2 | . Find the probability that
(1) 𝑋𝑋(10) ≤ 8 and (2) |𝑋𝑋(10) − 𝑋𝑋(6)| ≤ 4
Since {X(t)} is a Guassian process, any member of {X(t)} is a normal random
variable
By definition C(t 1 ,t 2 )= R(t 1 ,t 2 )-E[X(t 1 )].E[X(t 2 )]
C(t 1 ,t 2 )= Var(X(t 1 ))
C(t 1 ,t 2 )= Var {X(t)}
Now X (10) is a normal random variable with mean µ(10)=10 and variance
C(10,10) =16
(a) To find P(X(10) ≤ 8):
P(X(10) ≤ 8) = P((X(10) − 10)/4 ≤ (8 − 10)/4)
= P[z ≤ −0.5]
= 0.5 − P[0 ≤ z ≤ 0.5]
= 0.5 − 0.1915 (from normal tables)
= 0.3085
(b).To find P(|𝑋𝑋(10) − 𝑋𝑋(6)| ≤ 4)let U=X(10)-X(6).
Here U is also a Random variable and we have
E(U)=E[X(10)-X(6)]
=10-10
=0
Var(U)= Var[X(10)-X(6)]
13(b)(i) Define Random telegraph signal process and prove that it is wide sense stationary.
Solution: Let {N(t), t ≥ 0} denote a Poisson process, and let X₀ be independent of this process
and be such that P{X₀ = 1} = P{X₀ = −1} = 1/2. Defining X(t) = X₀(−1)^N(t), {X(t), t ≥ 0} is called a
random telegraph signal process.
P(α = 1) = P(α = −1) = 1/2
By the definition, E(α) = 0 and E(α²) = 1
Mean = 0 = constant
13(b)(ii) Prove that the sum of two independent poisson process is a poisson process.
P{X₁(t) + X₂(t) = n} = Σ_{r=0}^{n} e^(−λ₁t)(λ₁t)^r/r! · e^(−λ₂t)(λ₂t)^(n−r)/(n−r)!
= e^(−λ₁t) e^(−λ₂t) (1/n!) Σ_{r=0}^{n} nCr (λ₁t)^r (λ₂t)^(n−r)
= e^(−t(λ₁+λ₂)) ((λ₁+λ₂)t)^n / n!
Hence X₁(t) + X₂(t) is a Poisson process with parameter (λ₁+λ₂).
Solution:
S(ω) = ∫_{−∞}^{∞} R(τ) e^(−iωτ) dτ
= a² ∫_{−∞}^{∞} e^(−2α|τ|) e^(−iωτ) dτ
= a² ∫_{−∞}^{∞} e^(−2α|τ|) (cos ωτ − i sin ωτ) dτ
= 2a² ∫₀^∞ e^(−2ατ) cos ωτ dτ
= 2a² [e^(−2ατ)(−2α cos ωτ + ω sin ωτ)/(4α² + ω²)]₀^∞
= 2a² (2α/(4α² + ω²))
= 4a²α/(4α² + ω²)
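The cosine-transform result above checks out numerically. This sketch (a, α, ω are arbitrary test values) evaluates 2a²∫₀^∞ e^(−2ατ) cos ωτ dτ by a midpoint sum and compares it with 4a²α/(4α² + ω²):

```python
import math

# Midpoint Riemann-sum check of S(ω) = 4a²α/(4α² + ω²) for R(τ) = a² e^{-2α|τ|}.
a, alpha, w = 1.5, 0.8, 2.0
dt, T = 1e-4, 20.0   # integration step and truncation point (tail negligible)

s_numeric = 2 * a * a * sum(
    math.exp(-2 * alpha * (i + 0.5) * dt) * math.cos(w * (i + 0.5) * dt)
    for i in range(int(T / dt))
) * dt

s_closed = 4 * a * a * alpha / (4 * alpha**2 + w**2)
print(s_numeric, s_closed)
```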
R(τ) = λ² for |τ| > ε
     = λ² + (λ/ε)(1 − |τ|/ε) for |τ| ≤ ε
Show that S(ω) = 2πλ²δ(ω) + 4λ sin²(ωε/2)/(ε²ω²).
Solution:
S(ω) = F(λ²) + (2λ/ε) ∫₀^ε (1 − τ/ε) cos ωτ dτ
= F(λ²) + (2λ/ε) [(1 − τ/ε)(sin ωτ/ω) − (1/ε)(cos ωτ/ω²)]₀^ε
Wait — integrating by parts, ∫₀^ε (1 − τ/ε) cos ωτ dτ = (1 − cos ωε)/(εω²), so
S(ω) = F(λ²) + (2λ/ε) (1 − cos ωε)/(εω²)
= F(λ²) + (4λ/(ε²ω²)) sin²(ωε/2)
= 2πλ²δ(ω) + 4λ sin²(ωε/2)/(ε²ω²)
S(ω) = (b/a)(a − |ω|) for |ω| ≤ a
     = 0 for |ω| > a
Solution:
R(τ) = (1/2π) ∫_{−a}^{a} (b/a)(a − |ω|)(cos ωτ + i sin ωτ) dω
= (2/2π) ∫₀^{a} (b/a)(a − ω) cos ωτ dω
= (b/aπ) ∫₀^{a} (a − ω) cos ωτ dω
= (b/aπ) [(a − ω) sin ωτ/τ − cos ωτ/τ²]₀^a
= (b/aπτ²)(1 − cos aτ)
= (2b/aπτ²) sin²(aτ/2)
(ii) If the process {X(t)} = Y(t)Z(t), where {Y(t)} and {Z(t)} are independent WSS processes,
prove that S_XX(ω) = (1/2π) ∫_{−∞}^{∞} S_YY(α) S_ZZ(ω − α) dα.
Solution:
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ and R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^(iωτ) dω
By independence, R_XX(τ) = R_YY(τ) R_ZZ(τ).
Consider F⁻¹[∫_{−∞}^{∞} S_YY(α) S_ZZ(ω − α) dα]
= (1/2π) ∫_{−∞}^{∞} ∫_{−∞}^{∞} S_YY(α) S_ZZ(ω − α) e^(iωτ) dα dω
15(a)(i)
If 𝑌𝑌(𝑡𝑡) = 𝐴𝐴 cos(𝜔𝜔0 𝑡𝑡 + 𝜃𝜃) + 𝑁𝑁(𝑡𝑡), where A is a constant, 𝜃𝜃 is a random variable with uniform
distribution in (−𝜋𝜋, 𝜋𝜋)and N(t) is a band-limited Gaussian white noise with a power spectral
density S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B, and 0 elsewhere. Find the power spectral density of Y(t). Assume that
N(t) and 𝜃𝜃 are independent.
Y(t)Y(t+τ) = [A cos(ω₀t + θ) + N(t)][A cos(ω₀(t+τ) + θ) + N(t+τ)]
= A² cos(ω₀t + θ) cos(ω₀(t+τ) + θ) + N(t)N(t+τ) + A cos(ω₀t + θ)N(t+τ)
+ A cos(ω₀(t+τ) + θ)N(t)
R_YY(t, t+τ) = A² E[cos(ω₀t + θ) cos(ω₀(t+τ) + θ)] + E[N(t)N(t+τ)]
+ A E[cos(ω₀t + θ)] E[N(t+τ)] + A E[cos(ω₀(t+τ) + θ)] E[N(t)]
since N(t) and θ are independent.
By hypothesis E[N(t)] = 0 and E[N(t+τ)] = 0.
∴ R_YY(τ) = (A²/2) cos ω₀τ + R_NN(τ)
S_YY(ω) = (A²/2) ∫_{−∞}^{∞} cos ω₀τ e^(−iωτ) dτ + S_NN(ω)
= (πA²/2)[δ(ω − ω₀) + δ(ω + ω₀)] + S_NN(ω)
Solution: Since the mean square value is always positive, PSD is also positive.
15(b) If X(t) is the input voltage to a circuit (system) and Y(t) is the output
voltage, and {X(t)} is a stationary process with μ_x = 0 and R_XX(τ) = e^(−α|τ|), find
μ_y, S_YY(ω) and R_YY(τ), if the power transfer function is H(ω) = R/(R + iLω).
(i) μ_y = μ_x H(0) = 0.
(ii) We know that
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ
= ∫_{−∞}^{∞} e^(−α|τ|) e^(−iωτ) dτ
= ∫_{−∞}^{0} e^(ατ) e^(−iωτ) dτ + ∫₀^∞ e^(−ατ) e^(−iωτ) dτ
= ∫_{−∞}^{0} e^((α−iω)τ) dτ + ∫₀^∞ e^(−(α+iω)τ) dτ
= [e^((α−iω)τ)/(α − iω)]_{−∞}^{0} + [e^(−(α+iω)τ)/(−(α + iω))]₀^∞
= (1/(α − iω))[1 − 0] − (1/(α + iω))[0 − 1]
= 1/(α − iω) + 1/(α + iω) = (α + iω + α − iω)/((α − iω)(α + iω)) = 2α/(α² + ω²)
Given H(ω) = R/(R + iLω).
We know that S_YY(ω) = |H(ω)|² S_XX(ω)
= |R/(R + iLω)|² · 2α/(α² + ω²)
= (R²/(R² + L²ω²)) · 2α/(α² + ω²)
R_YY(τ) = (1/2π) ∫_{−∞}^{∞} (2αR² e^(iωτ))/((R² + L²ω²)(α² + ω²)) dω
= (αR²/π) ∫_{−∞}^{∞} e^(iωτ)/((R² + L²ω²)(α² + ω²)) dω
First we write 1/((R² + L²ω²)(α² + ω²)) in partial fractions, treating u = ω²:
1/((R² + L²u)(α² + u)) = A/(R² + L²u) + B/(α² + u)
Put u = −R²/L²: we get A = L²/(α²L² − R²)
Put u = −α²: we get B = 1/(R² − L²α²)
∴ R_YY(τ) = (αR² L²/(π(α²L² − R²))) ∫_{−∞}^{∞} e^(iτω)/(R² + L²ω²) dω
− (αR²/(π(α²L² − R²))) ∫_{−∞}^{∞} e^(iτω)/(α² + ω²) dω
= (αR²/(π(α²L² − R²))) ∫_{−∞}^{∞} e^(iτω)/(R²/L² + ω²) dω
− (αR²/(π(α²L² − R²))) ∫_{−∞}^{∞} e^(iτω)/(α² + ω²) dω
By contour integration, we know that
∫_{−∞}^{∞} e^(imz)/(z² + a²) dz = (π/a) e^(−ma), m > 0
Hence
R_YY(τ) = (αR²/(π(α²L² − R²))) (πL/R) e^(−(R/L)|τ|) − (αR²/(π(α²L² − R²))) (π/α) e^(−α|τ|)
= (αRL e^(−(R/L)|τ|) − R² e^(−α|τ|))/(α²L² − R²)
= (R/L) (α e^(−(R/L)|τ|) − (R/L) e^(−α|τ|))/(α² − R²/L²)
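The partial-fraction step in the derivation is easy to get wrong, so here is a spot-check with exact rational arithmetic (R, L, α are arbitrary test numbers, not values from the problem):

```python
from fractions import Fraction

# Verify 1/((R²+L²u)(α²+u)) = A/(R²+L²u) + B/(α²+u) with
# A = L²/(α²L² − R²) and B = 1/(R² − L²α²), treating u = ω².
R, L, alpha = Fraction(3), Fraction(2), Fraction(5)

A = L**2 / (alpha**2 * L**2 - R**2)
B = 1 / (R**2 - L**2 * alpha**2)

for u in (Fraction(1), Fraction(7, 3), Fraction(11)):
    lhs = 1 / ((R**2 + L**2 * u) * (alpha**2 + u))
    rhs = A / (R**2 + L**2 * u) + B / (alpha**2 + u)
    assert lhs == rhs

print("partial fractions verified")
```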
______________________________________________________________________________
y\x      1     2     3
1        k     2k    3k    | 6k
2        2k    4k    6k    | 12k
3        3k    6k    9k    | 18k
p_X(x):  6k    12k   18k   | 36k
∴ P(X < Y) = 53/480
5. Define wide sense stationary process.
Solution: A random process {𝑋𝑋(𝑡𝑡)} is called wide sense stationary process if its mean is
constant and autocorrelation function depends only on the time difference 𝜏𝜏.
i.e.,
(i) 𝐸𝐸{𝑋𝑋(𝑡𝑡)} is always a constant.
(ii) E(X(t₁)·X(t₂)) = R_XX(τ), where τ = t₂ − t₁.
6. Show that a binomial process is Markov.
Solution:
S_n = X₁ + X₂ + ⋯ + X_n = S_{n−1} + X_n
P(S_n = m | S_{n−1} = m) = P(X_n = 0) = 1 − p
P(S_n = m | S_{n−1} = m − 1) = P(X_n = 1) = p
The probability distribution of 𝑆𝑆𝑛𝑛 depends only on 𝑆𝑆𝑛𝑛−1 .
Hence the binomial process is Markov.
7. A random process X(t) is defined by X(t) = K cos ωt, t ≥ 0, where ω is a constant and K
is uniformly distributed over (0,2). Find the autocorrelation of X(t).
Solution:
Given X(t) = K cos ωt and K is uniformly distributed in (0,2).
Hence the pdf of K is f_K(k) = 1/2, 0 < k < 2.
R_XX(t, t+τ) = E[X(t) X(t+τ)]
= E[K cos ωt · K cos ω(t+τ)]
= E[K²] [cos ωt cos(ωt + ωτ)]
= ∫₀² k² f_K(k) dk · [cos ωt cos(ωt + ωτ)]
= (1/2)[k³/3]₀² cos ωt cos(ωt + ωτ)
= (1/2)(8/3) cos ωt cos(ωt + ωτ)
= (4/3)[cos ωt cos(ωt + ωτ)]
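The key numeric step is E[K²] = 4/3 for K ~ Uniform(0,2); this sketch checks it both exactly and with a midpoint sum:

```python
from fractions import Fraction

# E[K²] = ∫₀² k²·(1/2) dk = (1/2)·[k³/3]₀² = 4/3, verified exactly and numerically.
exact = Fraction(1, 2) * Fraction(2**3, 3)

n = 100_000
dk = 2 / n
numeric = sum(((i + 0.5) * dk) ** 2 * 0.5 * dk for i in range(n))
print(exact, numeric)
```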
8. Define cross correlation function of X(t) and Y(t). When do you say that they are
independent?
Solution: The cross correlation of the two processes X(t) and Y(t) is defined as
R_XY(t₁, t₂) = E[X(t₁)·Y(t₂)]. X(t) and Y(t) are independent if
R_XY(t₁, t₂) = E[X(t₁)]·E[Y(t₂)], i.e., if the cross-covariance is zero.
9. Define linear time invariant.
Solution: A general linear system is said to be time-invariant if the input X(t) is time shifted by
an amount h, the corresponding output Y(t) will also be time shifted by the same amount.
i.e., If Y(t+h)=F(X(t+h)), where Y(t)=F(X(t))
F is called a time invariant system, or X(t) and Y(t) are said to form a time invariant system.
10.State the convolution form of the output of a linear time invariant system.
Solution: Let X(t) be a WSS random input process to linear time-invariant system with unit
impulse response h(t) and let Y(t) be the corresponding output process, then
R_XY(τ) = h(τ) * R_XX(τ)
R_YY(τ) = h(−τ) * R_XY(τ)
S_XY(ω) = H(ω) S_XX(ω)
S_YY(ω) = |H(ω)|² S_XX(ω)
Where * denotes the convolution.
Hence mean of X = 3.
Variance of X = μ'₂ − (μ'₁)² = 12 − 3² = 3.
(ii) A random variable X is uniformly distributed over (0,10). Find
(1) 𝑃𝑃(𝑋𝑋 < 3), 𝑃𝑃(𝑋𝑋 > 7)𝑎𝑎𝑎𝑎𝑎𝑎 𝑃𝑃(2 < 𝑋𝑋 < 5) (2)𝑃𝑃(𝑋𝑋 = 7).
Solution: X is uniformly distributed over (0,10).
Hence the pdf is f(x) = 1/10 for 0 < x < 10, and 0 otherwise.
(1) P(X < 3) = ∫₀³ (1/10) dx = (1/10)[x]₀³ = 3/10
(2) P(X > 7) = ∫₇¹⁰ (1/10) dx = (1/10)[x]₇¹⁰ = 3/10
(3) P(2 < X < 5) = ∫₂⁵ (1/10) dx = (1/10)[x]₂⁵ = 3/10
(4) Since X is a continuous random variable, 𝑃𝑃(𝑋𝑋 = 7) = 0.
(OR)
(b)(i) An office has four phone lines.Each is busy about 10% of the time. Assume that the phone
lines act independently.
(1) What is the probability that all four phones are busy?
(2) What is the probability that atleast two of them are busy?
Solution: Each line is busy with probability p = 0.1 (q = 0.9), and the four lines act
independently, so the number of busy lines X follows a binomial distribution with n = 4:
P[X = x] = 4Cx (0.1)^x (0.9)^(4−x)
(1) P(all four busy) = P[X = 4] = (0.1)⁴ = 0.0001
(2) P(at least two busy) = 1 − P[X = 0] − P[X = 1]
= 1 − (0.9)⁴ − 4(0.1)(0.9)³ = 1 − 0.6561 − 0.2916 = 0.0523
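The binomial probabilities are quick to verify:

```python
from math import comb

# Four independent phone lines, each busy with p = 0.1: X ~ Binomial(4, 0.1).
n, p = 4, 0.1
pmf = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]

p_all_busy = pmf[4]                    # 0.1^4 = 0.0001
p_at_least_two = 1 - pmf[0] - pmf[1]   # 1 - 0.9^4 - 4·0.1·0.9^3
print(p_all_busy, p_at_least_two)
```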
11(b)(ii) Describe Gamma distribution. Obtain its moment generating function.Hence compute
its mean and variance.
Solution: The moment generating function of a random variable X is defined as
M_X(t) = E[e^(tX)] = Σᵢ e^(t xᵢ) P(xᵢ) if X is a discrete RV, or ∫_{−∞}^{∞} f(x) e^(tx) dx if X is a continuous RV.
For the Gamma distribution with density f(x) = λ e^(−λx)(λx)^(α−1)/Γα, x > 0:
M_X(t) = ∫₀^∞ e^(tx) λ e^(−λx)(λx)^(α−1)/Γα dx
= (λ^α/Γα) ∫₀^∞ e^(−(λ−t)x) x^(α−1) dx
Put u = (λ − t)x, du = (λ − t) dx; x → 0 ⇒ u → 0 and x → ∞ ⇒ u → ∞.
∴ M_X(t) = (λ^α/Γα) ∫₀^∞ e^(−u) u^(α−1) du / (λ − t)^α
= (λ^α/(Γα (λ − t)^α)) Γα
= (λ/(λ − t))^α = (1 − t/λ)^(−α)
M'_X(t)|_{t=0} = −α(1 − t/λ)^(−α−1)(−1/λ)|_{t=0}
μ'₁ = α/λ
M''_X(t)|_{t=0} = α(α + 1)(1 − t/λ)^(−α−2)(1/λ²)|_{t=0}
μ'₂ = α(α + 1)/λ²
Variance(X) = μ'₂ − (μ'₁)²
= α(α + 1)/λ² − (α/λ)²
= (α² + α − α²)/λ²
∴ Variance(X) = α/λ²
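A sampling sketch confirms mean α/λ and variance α/λ². The parameter values below are arbitrary; note that Python's `gammavariate` takes the shape α and the scale 1/λ:

```python
import random

# Sample Gamma(α = 3, rate λ = 2): mean should be α/λ = 1.5, variance α/λ² = 0.75.
random.seed(42)
alpha, lam = 3.0, 2.0
n = 200_000
xs = [random.gammavariate(alpha, 1 / lam) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)
```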
12(a)(i) Two independent random variables X and Y are defined by
f_X(x) = 4ax for 0 < x < 1, and 0 otherwise, and
f_Y(y) = 4by for 0 < y < 1, and 0 otherwise.
Show that 𝑈𝑈 = 𝑋𝑋 + 𝑌𝑌 & 𝑉𝑉 = 𝑋𝑋 − 𝑌𝑌 are uncorrelated.
Solution: First we find a and b in the p.d.f’s.
From the property of pdf we have,
∞
� 𝑓𝑓(𝑥𝑥)𝑑𝑑𝑑𝑑 = 1.
−∞
∫₀¹ 4ax dx = 1 ⟹ 4a[x²/2]₀¹ = 1 ⟹ 2a = 1 ⟹ a = 1/2
∫₀¹ 4by dy = 1 ⟹ 4b[y²/2]₀¹ = 1 ⟹ 2b = 1 ⟹ b = 1/2
E[X] = ∫₀¹ x·2x dx = 2/3
E[X²] = ∫₀¹ x² f(x) dx = ∫₀¹ 2x³ dx = 2[x⁴/4]₀¹ = 1/2
σ_x² = Var[X] = E[X²] − (E[X])² = 1/2 − (2/3)² = 1/2 − 4/9 = 1/18
Similarly σ_y² = Var[Y] = 1/18
Hence by equation (1), r(U, V) = (1/18 − 1/18)/(1/18 + 1/18) = 0.
12(a)(ii) State and prove the central limit theorem in the case of two-dimensional random variables.
Solution: Out of syllabus.
(OR)
(b)(i) The equations of two regression lines are 3𝑥𝑥 + 12𝑦𝑦 = 19 & 3𝑦𝑦 + 9𝑥𝑥 = 46.
Find 𝑥𝑥̅ &𝑦𝑦� and the correlation coefficient between X and Y.
Solution: (i) Let the regression line of Y on X be 3x + 12y = 19 ⟹
12y = 19 − 3x ⟹ y = 19/12 − (1/4)x, so b_yx = −1/4.
From the other line, 3y + 9x = 46 ⟹ x = 46/9 − (1/3)y, so b_xy = −1/3.
Hence r² = (b_yx)(b_xy) = (−1/4)(−1/3) = 1/12 ⇒ r = −1/(2√3),
taking the negative sign since both regression coefficients are negative.
(ii) To find the means of X and Y:
Both regression lines pass through (x̄, ȳ), so
3x̄ + 12ȳ = 19 and 3ȳ + 9x̄ = 46. Solving, x̄ = 5 and ȳ = 1/3.
Given f(x, y) = Cx(x − y), 0 < x < 2, −x < y < x. To find C:
C ∫₀² [∫_{−x}^{x} x(x − y) dy] dx = 1 ⟹
C ∫₀² 2x³ dx = 1 ⟹ C[2x⁴/4]₀² = 1 ⟹ C = 1/8
f_{X,Y}(x, y) = (1/8) x(x − y) for 0 < x < 2, −x < y < x; 0 otherwise
(ii) f_X(x) = ∫_{−x}^{x} (1/8) x(x − y) dy = (1/8) ∫_{−x}^{x} (x² − xy) dy
= (1/8)[x²y − xy²/2]_{−x}^{x} = (1/8)[(x³ − x³/2) − (−x³ − x³/2)] = x³/4, 0 < x < 2.
(iii) f_{Y|X}(y|x) = f(x, y)/f_X(x) = ((1/8) x(x − y))/(x³/4) = (x − y)/(2x²), −x < y < x.
13(a) (i) Define a semi random telegraph signal process and prove that it
is evolutionary.
{N(t)} is a Poisson process with P{N(t) = n} = e^(−λt)(λt)^n/n!, n = 0, 1, 2, …
P{X(t) = 1} = P{N(t) is even} = e^(−λt) cosh λt
P{X(t) = −1} = P{N(t) is odd} = e^(−λt) sinh λt
E[X(t)] = e^(−λt)[cosh λt − sinh λt] = e^(−λt)[(e^(λt) + e^(−λt))/2 − (e^(λt) − e^(−λt))/2] = e^(−2λt)
Since the mean depends on t, {X(t)} is evolutionary.
(ii) Mention any three properties each of auto correlation and of cross correlation.
Solution: Properties of auto correlation: Let X(t) be a WSS process. Then the auto correlation
function R_XX(t, t+τ) is a function of the time difference τ only.
(1) R_XX(τ) = R_XX(−τ) (i.e., the autocorrelation function is an even function).
(2) |R_XX(τ)| ≤ R_XX(0) (i.e., the maximum value of R_XX(τ) is R_XX(0)).
(3) If the process X(t) contains a periodic component, then R_XX(τ) will also contain a periodic component of the same period.
Properties of cross correlation:
(1) R_XY(τ) = R_YX(−τ).
(2) If the random processes X(t) and Y(t) are independent, then R_XY(τ) = E[X(t)]·E[Y(t+τ)].
(OR)
(b)(i) A random process X(t) is defined by X(t) = A cos t + B sin t, −∞ < t < ∞, where A and B
are independent random variables each of which has a value −2 with probability 1/3 and a value 1
with probability 2/3. Show that X(t) is a wide sense stationary process.
A = a:    −2    1
P(A=a):  1/3   2/3
E[A] = Σ aᵢ p(aᵢ) = (−2)(1/3) + (1)(2/3) = −2/3 + 2/3 = 0
E[A²] = Σ aᵢ² p(aᵢ) = (−2)²(1/3) + (1)²(2/3) = 4/3 + 2/3 = 2
E[B] = Σ bᵢ p(bᵢ) = (−2)(1/3) + (1)(2/3) = −2/3 + 2/3 = 0
E[B²] = Σ bᵢ² p(bᵢ) = (−2)²(1/3) + (1)²(2/3) = 4/3 + 2/3 = 2
(ii) Define a poisson process.Show that the sum of two poisson processes is a poisson process.
Solution: If X(t) represents the number of occurences of a certain event in (0,t) then the discrete
random process {X(t)} is called the poisson process, provided the following postulates are
satisfied.
4. 𝑋𝑋(𝑡𝑡) is independent of the number of occurences of the event in any interval before and after
interval (0,t).
5. The probability that an event occurs a specified number of times in (𝑡𝑡0 , 𝑡𝑡0 + 𝜏𝜏) depends only
on 𝜏𝜏 and not on 𝑡𝑡0
P{X₁(t) + X₂(t) = n} = Σ_{r=0}^{n} e^(−λ₁t)(λ₁t)^r/r! · e^(−λ₂t)(λ₂t)^(n−r)/(n−r)!
= e^(−λ₁t) e^(−λ₂t) (1/n!) Σ_{r=0}^{n} nCr (λ₁t)^r (λ₂t)^(n−r)
= e^(−t(λ₁+λ₂)) ((λ₁+λ₂)t)^n / n!
Hence X₁(t) + X₂(t) is a Poisson process with parameter (λ₁+λ₂)t.
14(a)(i) Define spectral density of a stationary random process X(t).Prove that for
a real random process X(t) the power spectral density is an even function.
Solution: The power spectral density of 𝑆𝑆𝑋𝑋 (𝜔𝜔) of a continuous random process X(t) is defined as
Fourier Transform of 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏):
To prove that the spectral density function of a real random process is an even function.
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ   ……(1)
S_XX(−ω) = ∫_{−∞}^{∞} R_XX(τ) e^(iωτ) dτ
Let τ = −u, dτ = −du; τ → −∞ ⇒ u → ∞ and τ → ∞ ⇒ u → −∞.
S_XX(−ω) = ∫_{∞}^{−∞} R_XX(−u) e^(−iωu) (−du)
= ∫_{−∞}^{∞} R_XX(−u) e^(−iωu) du
Since R_XX(τ) = R_XX(−τ),
S_XX(−ω) = ∫_{−∞}^{∞} R_XX(u) e^(−iωu) du = S_XX(ω).
(ii) Two random processes X(t) and Y(t) are defined as follows:
Solution:
R_XY(t, t+τ) = E[X(t) Y(t+τ)]
= A² E[cos(ωt + θ) sin(ωt + ωτ + θ)]
= (A²/2) E[sin(2ωt + ωτ + 2θ) + sin ωτ]   {using 2 sin A cos B = sin(A+B) + sin(A−B)}
= (A²/2) E[sin ωτ] + (A²/2) E[sin(2ωt + ωτ + 2θ)]
= (A²/2) sin ωτ + (A²/4π) ∫₀^{2π} sin(2ωt + ωτ + 2θ) dθ
= (A²/2) sin ωτ + 0 = (A²/2) sin ωτ
Solution:
Statement:
Let X(t) be a real WSS process with power density spectrum S_XX(ω). Let X_T(t) be a
portion of the process X(t) in the time interval −T to T, i.e.,
X_T(t) = X(t), −T < t < T; 0, elsewhere.
Let X_T(ω) be the Fourier transform of X_T(t). Then
S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}
Proof: Given X_T(ω) is the Fourier transform of X_T(t),
X_T(ω) = ∫_{−∞}^{∞} X_T(t) e^(−iωt) dt = ∫_{−T}^{T} X(t) e^(−iωt) dt
|X_T(ω)|² = ∫_{−T}^{T} X(t) e^(iωt) dt · ∫_{−T}^{T} X(t) e^(−iωt) dt   [X(t) is real]
= ∫_{−T}^{T} X(t₁) e^(iωt₁) dt₁ · ∫_{−T}^{T} X(t₂) e^(−iωt₂) dt₂
= ∫_{−T}^{T} ∫_{−T}^{T} X(t₁) X(t₂) e^(−iω(t₂−t₁)) dt₁ dt₂
∴ lim_{T→∞} (1/2T) E[|X_T(ω)|²] = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t₁, t₂) e^(−iω(t₂−t₁)) dt₁ dt₂
Put t = t₁ and τ = t₂ − t₁, so that
J = |∂t₁/∂t  ∂t₁/∂τ; ∂t₂/∂t  ∂t₂/∂τ| = |1  0; 1  1| = 1
dt₁ dt₂ = |J| dt dτ
The limits of t are −T and T; when t₂ = −T, τ = −T − t and when t₂ = T, τ = T − t.
∴ lim_{T→∞} (1/2T) E[|X_T(ω)|²] = lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} ∫_{−T}^{T} R_XX(t, t+τ) e^(−iωτ) dt dτ
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} ∫_{−T}^{T} R_XX(τ) e^(−iωτ) dt dτ
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^(−iωτ) dτ · ∫_{−T}^{T} dt
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^(−iωτ) dτ · 2T
= lim_{T→∞} ∫_{−T−t}^{T−t} R_XX(τ) e^(−iωτ) dτ
= ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ = S_XX(ω), by definition.
∴ S_XX(ω) = lim_{T→∞} (1/2T) E[|X_T(ω)|²]
Hence proved.
(ii) If the cross power spectral density of X(t) and Y(t) is
S_XY(ω) = a + ibω/a for −a < ω < a, a > 0, and 0 otherwise,
find the cross correlation function:
R_XY(τ) = (1/2π) ∫_{−a}^{a} (a + ibω/a) e^(iωτ) dω
15(a)(i) A random process X(t) is the input to a linear system whose impulse response is
h(t) = 2e^(−t), t ≥ 0. The autocorrelation function of the process is R_XX(τ) = e^(−2|τ|). Find the power
spectral density of the output process Y(t).
Solution: Given h(t) = 2e^(−t), t ≥ 0.
H(ω) = ∫_{−∞}^{∞} h(t) e^(−iωt) dt
= ∫₀^∞ 2e^(−t) e^(−iωt) dt
= 2 ∫₀^∞ e^(−(1+iω)t) dt
= 2 [e^(−(1+iω)t)/(−(1+iω))]₀^∞
= 2/(1 + iω)
|H(ω)|² = 4/(1 + ω²)
PSD of the input: S_XX(ω) = FT[R_XX(τ)] = FT[e^(−2|τ|)] = 4/(4 + ω²)
S_YY(ω) = |H(ω)|² S_XX(ω) = 16/((ω² + 1)(ω² + 4))
(ii) A wide sense stationary noise process N(t) has an autocorrelation function R_NN(τ) = Pe^(−3|τ|). Find its power spectral density.
S_NN(ω) = ∫_{−∞}^{∞} R(τ) e^(−iωτ) dτ
= P ∫_{−∞}^{∞} e^(−3|τ|) (cos ωτ − i sin ωτ) dτ
= 2P ∫₀^∞ e^(−3τ) cos ωτ dτ
= 2P [e^(−3τ)(−3 cos ωτ + ω sin ωτ)/(9 + ω²)]₀^∞
= 2P (3/(9 + ω²))
= 6P/(9 + ω²)
15(b)(i) If the input to a time-invariant, stable, linear system is a wide sense stationary
process, prove that the output will also be a wide sense stationary process.
Solution: Let X(t) be a WSS process for a linear time-invariant stable system with Y(t) as the output
process. Then
Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du
where h(t) is the weighting function or unit impulse response.
∴ E[Y(t)] = ∫_{−∞}^{∞} E[h(u) X(t − u)] du = ∫_{−∞}^{∞} h(u) E[X(t − u)] du
Since X(t) is WSS, E[X(t − u)] = μ_X, so
E[Y(t)] = ∫_{−∞}^{∞} h(u) μ_X du = μ_X ∫_{−∞}^{∞} h(u) du
Since the system is stable, ∫_{−∞}^{∞} h(u) du is finite, so E[Y(t)] is a constant.
Since X(t) is a WSS process, the autocorrelation function is only a function of the time difference:
R_YY(t, t+τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u₁) h(u₂) R_XX(τ + u₁ − u₂) du₁ du₂,
which depends only on τ. Hence Y(t) is WSS.
15(b)(ii) Let X(t) be a wide sense stationary process which is the input to a linear time-invariant system with unit impulse response h(t) and output Y(t). Prove that
S_YY(ω) = |H(ω)|² S_XX(ω), where H(ω) is the Fourier transform of h(t).
Solution: Let Y(t) = ∫_{−∞}^{∞} h(α) X(t − α) dα. Then
X(t+τ) Y(t) = ∫_{−∞}^{∞} X(t+τ) X(t−α) h(α) dα
E[X(t+τ) Y(t)] = ∫_{−∞}^{∞} E{X(t+τ) X(t−α)} h(α) dα
Hence R_XY(τ) = ∫_{−∞}^{∞} R_XX(τ + α) h(α) dα
= ∫_{−∞}^{∞} R_XX(τ − β) h(−β) dβ
Similarly, R_YY(τ) = E{Y(t+τ) Y(t)} = ∫_{−∞}^{∞} R_XY(τ − α) h(α) dα
Taking Fourier transforms, S_XY(ω) = S_XX(ω) H(−ω) = S_XX(ω) H*(ω) and
S_YY(ω) = S_XY(ω) H(ω), so S_YY(ω) = |H(ω)|² S_XX(ω).
M_X(t) = Σ_{x=0}^{∞} e^(tx) e^(−λ) λ^x/x!
= e^(−λ) Σ_{x=0}^{∞} (λe^t)^x/x!
= e^(−λ) [1 + λe^t/1! + (λe^t)²/2! + ⋯]
= e^(−λ) e^(λe^t)
M_X(t) = e^(λ(e^t − 1))
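The closed-form Poisson MGF agrees with the defining series; the sketch below checks this at arbitrary test values λ = 2, t = 0.5:

```python
import math

# Compare Σ e^{tx}·e^{-λ}λ^x/x! against exp(λ(e^t − 1)) for λ = 2, t = 0.5.
lam, t = 2.0, 0.5
series = sum(
    math.exp(t * x) * math.exp(-lam) * lam**x / math.factorial(x)
    for x in range(60)   # 60 terms are ample for convergence here
)
closed = math.exp(lam * (math.exp(t) - 1))
print(series, closed)
```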
(2) |𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏)| ≤ 𝑅𝑅𝑋𝑋𝑋𝑋 (0). (i.e., Max. value of 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏)is 𝑅𝑅𝑋𝑋𝑋𝑋 (0)).
S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}
9. Define white noise
Ans: A sample function X(t) of a WSS noise random process {X(t)} is called white
noise if the power spectral density of {X(t)} is a constant at all frequencies.
We denote the power spectral density of white noise w(t) as S_w(f) = N₀/2.
10. The autocorrelation function for a stationary ergodic process with no periodic components is
R_XX(τ) = 25 + 4/(1 + 6τ²). Find the mean and variance of the process {X(t)}.
Ans:
E[X(t)] = μ_X = √(lim_{τ→∞} R_XX(τ))
= √(lim_{τ→∞} [25 + 4/(1 + 6τ²)]) = √25 = 5
E[X²(t)] = R_XX(0) = 25 + 4 = 29
Var[X(t)] = E[X²(t)] − (E[X(t)])² = 29 − 5² = 4
Part - B
11. (a) (i) th
Find the n moment about mean of normal distribution.
Ans: Central moments of normal distribution 𝑁𝑁(𝜇𝜇, 𝜎𝜎)
Central moments μ_r of N(μ, σ) are given by μ_r = E(X − μ)^r
= (1/(σ√2π)) ∫_{−∞}^{∞} (x − μ)^r e^(−(x−μ)²/2σ²) dx
Put t = (x − μ)/(√2 σ)
⇒ x − μ = √2 σ t
dx = √2 σ dt
= (1/(σ√2π)) ∫_{−∞}^{∞} (√2 σ t)^r e^(−t²) √2 σ dt
= (1/√π) ∫_{−∞}^{∞} (√2 σ)^r t^r e^(−t²) dt
= (2^(r/2) σ^r/√π) ∫_{−∞}^{∞} t^r e^(−t²) dt
Case (i): r is an odd integer, i.e., r = 2n+1:
μ_{2n+1} = (2^((2n+1)/2) σ^(2n+1)/√π) ∫_{−∞}^{∞} t^(2n+1) e^(−t²) dt
= 0, since the integrand is an odd function of t.
Case (ii): r is even, i.e., r = 2n:
μ_{2n} = (2^n σ^(2n)/√π) Γ(n + 1/2)   ……(1)
= (2^n σ^(2n)/√π) ((2n−1)/2) Γ((2n−1)/2)
= (2^n σ^(2n)/√π) ((2n−1)/2)((2n−3)/2) Γ((2n−3)/2)
= (2^n σ^(2n)/√π) ((2n−1)/2)((2n−3)/2) … (1/2) Γ(1/2)
= 1·3·5 … (2n−1) σ^(2n)
From (1) we get
μ_{2n−2} = (2^(n−1) σ^(2n−2)/√π) Γ(n − 1/2)   ……(2)
From (1) and (2) we get
μ_{2n}/μ_{2n−2} = 2σ²(n − 1/2) = (2n − 1)σ²
∞ ∝−1
𝜆𝜆∝ ∫0 𝑒𝑒 −𝑢𝑢 ( 𝑢𝑢)
∴ 𝑀𝑀𝑋𝑋 (𝑡𝑡) = 𝑑𝑑𝑑𝑑
Γ𝛼𝛼 (𝜆𝜆 − 𝑡𝑡)𝛼𝛼
∝
𝜆𝜆
= Γ𝛼𝛼
Γ𝛼𝛼(𝜆𝜆 − 𝑡𝑡)𝛼𝛼
𝜆𝜆 𝛼𝛼 𝑡𝑡 − 𝛼𝛼
=� � = �1 − �
𝜆𝜆 − 𝑡𝑡 𝜆𝜆
′(𝑡𝑡) 𝑡𝑡 − 𝛼𝛼−1 1
𝑀𝑀𝑋𝑋 = −𝛼𝛼 �1 − � �− ��
𝜆𝜆 𝜆𝜆 𝑡𝑡=0
𝛼𝛼
𝜇𝜇′1 =
𝜆𝜆
′′(𝑡𝑡) 𝑡𝑡 − 𝛼𝛼−2 1 2
𝑀𝑀𝑋𝑋 = 𝛼𝛼[−(∝ +1)] �1 − � � � �
𝜆𝜆 𝜆𝜆 𝑡𝑡=0
𝛼𝛼(∝ +1)
𝜇𝜇′2 =
𝜆𝜆2
2
𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉 (𝑋𝑋) = 𝜇𝜇′2 − �𝜇𝜇′1 �
𝛼𝛼(∝ +1) 𝛼𝛼 2
= − � �
𝜆𝜆2 𝜆𝜆
𝛼𝛼 2 +∝ −𝛼𝛼 2
=
𝜆𝜆2
∝
∴ 𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉𝑉(𝑋𝑋) = 2
𝜆𝜆
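A quick Monte Carlo sketch (an illustration, not part of the solution; the shape and rate values are arbitrary) confirms E(X) = α/λ and Var(X) = α/λ² for the Gamma distribution. Note that Python's `random.gammavariate` is parameterised by shape and scale = 1/λ:

```python
import random
import statistics

random.seed(1)
alpha, lam = 3.0, 2.0                      # shape alpha, rate lambda (illustrative)
# random.gammavariate takes (shape, scale) with scale = 1/rate
sample = [random.gammavariate(alpha, 1.0 / lam) for _ in range(100_000)]

mean = statistics.fmean(sample)
var = statistics.pvariance(sample)
assert abs(mean - alpha / lam) < 0.05      # E(X) = alpha/lambda = 1.5
assert abs(var - alpha / lam ** 2) < 0.1   # Var(X) = alpha/lambda^2 = 0.75
```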
(ii) A random variable X has the pdf f(x) = 2e^{−2x} for x ≥ 0 and f(x) = 0 for x < 0.
Obtain the MGF and first four moments about the origin. Find the mean and variance.
Ans: M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} f(x) e^{tx} dx = ∫₀^∞ 2e^{−2x} e^{tx} dx = 2 ∫₀^∞ e^{−(2−t)x} dx
= 2 [e^{−(2−t)x}/(−(2 − t))]₀^∞ = 2 [0 − 1/(−(2 − t))]
M_X(t) = 2/(2 − t) = (1 − t/2)^{−1}, for t < 2
= 1 + t/2 + (t/2)² + (t/2)³ + (t/2)⁴ + ⋯
μ′₁ = [M′_X(t)]_{t=0} = 1/2,  μ′₂ = [M″_X(t)]_{t=0} = 2/4 = 1/2,
μ′₃ = [M‴_X(t)]_{t=0} = 6/8 = 3/4,  μ′₄ = [M⁗_X(t)]_{t=0} = 24/16 = 3/2
Mean = μ′₁ = 1/2;  Variance = μ′₂ − (μ′₁)² = 1/2 − (1/2)² = 1/4
12. (a) The joint probability mass function of (X,Y) is given by P(x, y) = k(2x + 3y), x = 0, 1, 2 and y = 1, 2, 3. Find k and all the marginal and conditional probability distributions. Also find the probability distribution of (X + Y).
Ans:
The JPMF of (X,Y) is
  x\y  :  y=1    y=2    y=3   | P(x)
  x=0  :  3k     6k     9k    | 18k
  x=1  :  5k     8k     11k   | 24k
  x=2  :  7k     10k    13k   | 30k
  P(y) :  15k    24k    33k   | 72k
We know that Σ p_ij = 1 ⇒ 72k = 1 ⇒ k = 1/72
Hence the joint probability function is given by
  x\y  :  y=1    y=2    y=3    | P(x)
  x=0  :  3/72   6/72   9/72   | 18/72
  x=1  :  5/72   8/72   11/72  | 24/72
  x=2  :  7/72   10/72  13/72  | 30/72
  P(y) :  15/72  24/72  33/72  | 1
The marginal distribution of Y:
Y = y : 1, 2, 3 with P(Y = y) = 15/72, 24/72, 33/72
The probability distribution of X + Y:
X + Y = 1: (0,1)             P = 3/72
X + Y = 2: (0,2),(1,1)       P = 6/72 + 5/72 = 11/72
X + Y = 3: (0,3),(1,2),(2,1) P = 9/72 + 8/72 + 7/72 = 24/72
X + Y = 4: (1,3),(2,2)       P = 11/72 + 10/72 = 21/72
X + Y = 5: (2,3)             P = 13/72
Total = 1
(iii) The conditional distribution of X given Y = y: P(X = x | Y = y) = P(X = x, Y = y)/P(Y = y)
P(X = 0 | Y = 1) = (3/72)/(15/72) = 1/5,  P(X = 1 | Y = 1) = (5/72)/(15/72) = 1/3,  P(X = 2 | Y = 1) = (7/72)/(15/72) = 7/15
P(X = 0 | Y = 2) = (6/72)/(24/72) = 1/4,  P(X = 1 | Y = 2) = (8/72)/(24/72) = 1/3,  P(X = 2 | Y = 2) = (10/72)/(24/72) = 5/12
P(X = 0 | Y = 3) = (9/72)/(33/72) = 9/33,  P(X = 1 | Y = 3) = (11/72)/(33/72) = 1/3,  P(X = 2 | Y = 3) = (13/72)/(33/72) = 13/33
The conditional distribution of X given Y = 2:
X : 0, 1, 2 with P(X = x | Y = 2) = 1/4, 1/3, 5/12
The conditional distribution of X given Y = 3:
X : 0, 1, 2 with P(X = x | Y = 3) = 9/33, 1/3, 13/33
The conditional distribution of Y given X = 0:
Y : 1, 2, 3 with P(Y = y | X = 0) = 3/18 = 1/6, 6/18 = 1/3, 9/18 = 1/2
The conditional distribution of Y given X = 1:
Y : 1, 2, 3 with P(Y = y | X = 1) = 5/24, 8/24 = 1/3, 11/24
The conditional distribution of Y given X = 2:
Y : 1, 2, 3 with P(Y = y | X = 2) = 7/30, 10/30 = 1/3, 13/30
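The marginal, conditional and X + Y distributions above can be reproduced exactly with rational arithmetic; the following sketch (illustrative, not part of the original answer) uses `fractions.Fraction`:

```python
from fractions import Fraction

# joint pmf p(x, y) = k(2x + 3y), x in {0,1,2}, y in {1,2,3}
xs, ys = range(3), range(1, 4)
total = sum(2 * x + 3 * y for x in xs for y in ys)   # = 72, so k = 1/72
k = Fraction(1, total)
p = {(x, y): k * (2 * x + 3 * y) for x in xs for y in ys}

px = {x: sum(p[x, y] for y in ys) for x in xs}       # marginal of X
py = {y: sum(p[x, y] for x in xs) for y in ys}       # marginal of Y
assert total == 72
assert px[0] == Fraction(18, 72) and py[3] == Fraction(33, 72)
# conditional P(X=2 | Y=1) = (7/72)/(15/72) = 7/15
assert p[2, 1] / py[1] == Fraction(7, 15)
# distribution of X + Y, e.g. P(X+Y = 3) = (9+8+7)/72 = 24/72
s3 = sum(v for (x, y), v in p.items() if x + y == 3)
assert s3 == Fraction(24, 72)
```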
(OR)
(b) (i) State and prove the central limit theorem.
= Σ_{n=0}^{∞} [n(n + 1) − n] p(x_n)
= [1·2·(1/(1 + at)²) + 2·3·(at/(1 + at)³) + 3·4·((at)²/(1 + at)⁴) + ⋯] − E{X(t)}
= (2/(1 + at)²) [1 + 3(at/(1 + at)) + 6(at/(1 + at))² + ⋯] − 1
= (2/(1 + at)²) (1 − at/(1 + at))^{−3} − 1
= (2/(1 + at)²) (1 + at)³ − 1
= 2(1 + at) − 1
= 1 + 2at
Var{X(t)} = E(X²(t)) − [E(X(t))]²
= 1 + 2at − 1
= 2at, which is dependent on t
∴ {X(t)} is not stationary.
(ii) If the two random variables 𝐴𝐴𝑟𝑟 𝑎𝑎𝑎𝑎𝑎𝑎 𝐵𝐵𝑟𝑟 are uncorrelated with zero mean and
𝐸𝐸(𝐴𝐴2𝑟𝑟 ) = 𝐸𝐸(𝐵𝐵𝑟𝑟2 ) = 𝜎𝜎𝑟𝑟2 , show that the process 𝑋𝑋(𝑡𝑡) = ∑𝑛𝑛𝑟𝑟=1(𝐴𝐴𝑟𝑟 cos 𝜔𝜔𝑟𝑟 𝑡𝑡 + 𝐵𝐵𝑟𝑟 𝑠𝑠𝑠𝑠𝑠𝑠𝜔𝜔𝑟𝑟 𝑡𝑡 )
is wide sense stationary. What are the mean and autocorrelation of X(t) ?
Ans: Given X(t) = Σ_{r=1}^{n} (A_r cos ω_r t + B_r sin ω_r t), with
E(A_r) = E(B_r) = 0 -----------------(1)
E(A_r²) = E(B_r²) = σ_r² ---------------(2)
E(A_r B_r) = 0 ---------------------------(3)
E[X(t)] = Σ_{r=1}^{n} [cos ω_r t · E(A_r) + sin ω_r t · E(B_r)] = 0, a constant.
R_XX(t₁, t₂) = E[Σ_{r=1}^{n} (A_r cos ω_r t₁ + B_r sin ω_r t₁)(A_r cos ω_r t₂ + B_r sin ω_r t₂)]
= Σ_{r=1}^{n} σ_r² (cos ω_r t₁ cos ω_r t₂ + sin ω_r t₁ sin ω_r t₂)
= Σ_{r=1}^{n} σ_r² cos ω_r (t₁ − t₂), a function of t₁ − t₂ only.
Hence {X(t)} is WSS, with mean 0 and autocorrelation R_XX(τ) = Σ_{r=1}^{n} σ_r² cos ω_r τ.
(OR)
(b) (i) Define semi-random telegraph signal process and random telegraph signal process
and prove also that the former is evolutionary and the later is wide sense stationary.
Ans:
Semi-random telegraph signal process:
If {N(t)} is a Poisson process with P{N(t) = n} = e^{−λt}(λt)^n/n!, n = 0, 1, 2, …, and X(t) = (−1)^{N(t)}, then {X(t)} is called a semi-random telegraph signal process.
Random telegraph signal process:
If Y(t) = αX(t), where α takes the values ±1 with probability 1/2 each, independently of the semi-random telegraph process X(t), then {Y(t)} is called a random telegraph signal process; for it
P[Y(t) = 1] = 1/2 = P[Y(t) = −1], −∞ < t < ∞
To prove that {X(t)} is evolutionary:
P[X(t) = 1] = P[N(t) is even] = e^{−λt} cosh λt
P[X(t) = −1] = P[N(t) is odd] = e^{−λt} sinh λt
E{X(t)} = e^{−λt}[cosh λt − sinh λt] = e^{−λt}[(e^{λt} + e^{−λt})/2 − (e^{λt} − e^{−λt})/2] = e^{−2λt}, which depends on t.
So {X(t)} is evolutionary.
E{X(t)} = (1/2)(−1) + (1/2)(1) = 0 = constant
R_XX(t₁, t₂) = E[X(t₁)X(t₂)]
= P[X(t₁) = 1, X(t₂) = 1] − P[X(t₁) = 1, X(t₂) = −1] − P[X(t₁) = −1, X(t₂) = 1] + P[X(t₁) = −1, X(t₂) = −1]
Now the events satisfy P[X(t₁) = 1, X(t₂) = 1] = P[X(t₁) = −1, X(t₂) = −1] and
P[X(t₁) = −1, X(t₂) = 1] = P[X(t₁) = 1, X(t₂) = −1], so
R_XX(t₁, t₂) = 2P[X(t₁) = 1, X(t₂) = 1] − 2P[X(t₁) = 1, X(t₂) = −1]
= 2P[X(t₂) = 1 | X(t₁) = 1] P[X(t₁) = 1] − 2P[X(t₂) = −1 | X(t₁) = 1] P[X(t₁) = 1]
= P[X(t₂) = 1 | X(t₁) = 1] − P[X(t₂) = −1 | X(t₁) = 1],  since P[X(t₁) = 1] = 1/2
Let τ = t₂ − t₁; then
P[X(t₂) = 1 | X(t₁) = 1] = P[N(τ) is even] = Σ_{k even} e^{−λτ}(λτ)^k/k!
= P[N(τ) = 0] + P[N(τ) = 2] + P[N(τ) = 4] + ⋯
= e^{−λτ}[1 + (λτ)²/2! + (λτ)⁴/4! + ⋯]
= e^{−λτ} cosh λτ
= e^{−λτ}(e^{λτ} + e^{−λτ})/2
= (1 + e^{−2λτ})/2
P[X(t₂) = −1 | X(t₁) = 1] = P[N(τ) is odd]
= P[N(τ) = 1] + P[N(τ) = 3] + P[N(τ) = 5] + ⋯
= e^{−λτ}[λτ/1! + (λτ)³/3! + (λτ)⁵/5! + ⋯]
= e^{−λτ} sinh λτ
= e^{−λτ}(e^{λτ} − e^{−λτ})/2
= (1 − e^{−2λτ})/2
R_XX(t₁, t₂) = (1 + e^{−2λτ})/2 − (1 − e^{−2λτ})/2 = e^{−2λτ}
𝑆𝑆𝑆𝑆𝑆𝑆𝑆𝑆𝑆𝑆 𝐸𝐸[𝑋𝑋(𝑡𝑡)] = 𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐 𝑎𝑎𝑎𝑎𝑎𝑎𝑅𝑅𝑋𝑋𝑋𝑋 (𝑡𝑡1 , 𝑡𝑡2 ) = 𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓 𝑜𝑜𝑜𝑜 𝜏𝜏
Hence Random telegraph signal process is WSS.
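The result R_XX(τ) = e^{−2λτ} can also be checked by simulation, since X(t)X(t + τ) = (−1)^{N(τ)} for the telegraph signal. The sketch below uses illustrative values of λ and τ; the Poisson sampler is Knuth's method, since the standard library has none:

```python
import math
import random

random.seed(7)

def poisson(mu):
    # Knuth's method: multiply uniforms until the product drops below e^(-mu)
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, tau, n = 1.5, 0.8, 100_000
# R_XX(tau) = E[X(t)X(t+tau)] = E[(-1)^(N(tau))] for the telegraph signal
est = sum((-1) ** poisson(lam * tau) for _ in range(n)) / n
assert abs(est - math.exp(-2 * lam * tau)) < 0.015
```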
(ii) If {X(t)} is a Gaussian process with μ(t) = 10 and C(t₁, t₂) = 16e^{−|t₁ − t₂|}, find the
probability that (1) X(10) ≤ 8 and (2) |X(10) − X(6)| ≤ 4.
Ans: Since {X(t)} is a Gaussian process, any member of {X(t)} is a normal random variable.
By definition C(t₁, t₂) = R(t₁, t₂) − E[X(t₁)]·E[X(t₂)], so C(t, t) = Var[X(t)].
Now X(10) is a normal random variable with mean μ(10) = 10 and variance C(10, 10) = 16.
(a) To find P(X(10) ≤ 8):
P(X(10) ≤ 8) = P[(X(10) − 10)/4 ≤ (8 − 10)/4]
= P[Z ≤ −0.5]
= 0.5 − P[0 ≤ Z ≤ 0.5]
= 0.5 − 0.1915 (from normal tables)
= 0.3085
(b) To find P(|X(10) − X(6)| ≤ 4), let U = X(10) − X(6).
Here U is also a normal random variable and we have
E(U) = E[X(10)] − E[X(6)] = 10 − 10 = 0
Var(U) = Var[X(10)] + Var[X(6)] − 2Cov[X(10), X(6)]
= C(10,10) + C(6,6) − 2C(10,6)
= 16e^{−|10−10|} + 16e^{−|6−6|} − 2 × 16e^{−|10−6|}
= 16 + 16 − 2 × 16e^{−4} = 31.4139
σ_U = √31.4139 = 5.6048
Now P(|X(10) − X(6)| ≤ 4) = P(|U| ≤ 4) = P(−4 ≤ U ≤ 4)
= P[(−4 − 0)/5.6048 ≤ Z ≤ (4 − 0)/5.6048],  where Z = (U − E(U))/σ_U
= 2 × P[0 ≤ Z ≤ 0.7137]
= 2 × 0.2611
= 0.5222
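Both table look-ups can be reproduced with the error function; the sketch below is not part of the original solution, and the tiny discrepancy in (b) comes only from rounding the table entry 0.2611:

```python
import math

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# (a) P(X(10) <= 8) with X(10) ~ N(10, 16):  P(Z <= -0.5)
p_a = phi((8 - 10) / 4)
# (b) P(|U| <= 4) with U ~ N(0, 31.4139):  2*Phi(4/5.6048) - 1
p_b = 2 * phi(4 / math.sqrt(31.4139)) - 1

assert abs(p_a - 0.3085) < 1e-3
assert abs(p_b - 0.5222) < 5e-3   # erf gives ~0.5246; tables round to 0.5222
```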
14. (a) (i) The random binary transmission process {X(t)} is a WSS process with zero mean and autocorrelation function R(τ) = 1 − |τ|/T, where T is a constant. Find the mean and variance of the time average of {X(t)} over (0, T). Is {X(t)} mean ergodic?
Ans: By definition the time average is
X̄_T = (1/T) ∫₀^T X(t) dt
E[X̄_T] = E[X(t)] = 0
V(X̄_T) = (1/T) ∫_{−T}^{T} (1 − |τ|/T) R(τ) dτ
= (1/T) ∫_{−T}^{T} (1 − |τ|/T)(1 − |τ|/T) dτ
= (2/T) ∫₀^{T} (1 − τ/T)² dτ
= (2/T) ∫₀^{T} (1 + τ²/T² − 2τ/T) dτ
= (2/T) [τ + τ³/3T² − τ²/T]₀^{T}
= (2/T)(T + T/3 − T) = (2/T) × (T/3) = 2/3
∴ lim_{T→∞} V(X̄_T) = 2/3 ≠ 0
i.e., the condition for mean ergodicity of X(t) is not satisfied. Therefore X(t) is not mean ergodic.
(ii) Find the power spectral density of a WSS process with autocorrelation function R(τ) = e^{−ατ²}.
Ans:
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
= ∫_{−∞}^{∞} e^{−ατ²} e^{−iωτ} dτ
= ∫_{−∞}^{∞} e^{−α(τ² + iωτ/α)} dτ
= ∫_{−∞}^{∞} e^{−α[τ² + iωτ/α + (iω/2α)² − (iω/2α)²]} dτ
= ∫_{−∞}^{∞} e^{−α[(τ + iω/2α)² + ω²/4α²]} dτ
= e^{−ω²/4α} ∫_{−∞}^{∞} e^{−α(τ + iω/2α)²} dτ
Put u = τ + iω/2α, du = dτ:
= e^{−ω²/4α} ∫_{−∞}^{∞} e^{−αu²} du
= e^{−ω²/4α} ∫_{−∞}^{∞} e^{−(√α u)²} du
= √(π/α) e^{−ω²/4α}
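Because a uniform sum over the whole line is extremely accurate for a Gaussian-type integrand, the closed form √(π/α) e^{−ω²/4α} is easy to verify numerically (illustrative α and ω; the real part of the transform is all that survives by symmetry):

```python
import math

alpha, omega = 2.0, 3.0

# Riemann sum of S(omega) = ∫ e^(-alpha*tau^2) cos(omega*tau) dtau
h, T = 1e-3, 12.0
taus = [k * h for k in range(int(-T / h), int(T / h) + 1)]
s_num = h * sum(math.exp(-alpha * t * t) * math.cos(omega * t) for t in taus)

s_closed = math.sqrt(math.pi / alpha) * math.exp(-omega ** 2 / (4 * alpha))
assert abs(s_num - s_closed) < 1e-6
```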
(OR)
(b) (i) A random process {X(t)} is given by X(t) = A cos ωt + B sin ωt, where A and B are independent random variables such that E(A) = E(B) = 0 and E(A²) = E(B²) = σ². Find the power spectral density of the process.
Ans: Given X(t) = A cos ωt + B sin ωt, where the random variables A and B satisfy
E(A) = E(B) = 0, E(AB) = 0, E(A²) = E(B²) = σ²
E[X(t)] = cos ωt E(A) + sin ωt E(B) = 0
R_XX(τ) = E[X(t)X(t + τ)] = E(A²) cos ωt cos ω(t + τ) + E(B²) sin ωt sin ω(t + τ) = σ² cos ωτ
∴ S_XX(ω₁) = ∫_{−∞}^{∞} σ² cos ωτ e^{−iω₁τ} dτ = πσ²[δ(ω₁ − ω) + δ(ω₁ + ω)]
For a triangular spectral density S(ω) = (b/a)(a − |ω|), |ω| ≤ a, the corresponding autocorrelation is obtained by the inverse transform:
R(τ) = (1/2π) ∫_{−a}^{a} (b/a)(a − |ω|)(cos ωτ + i sin ωτ) dω
= (2/2π) ∫₀^{a} (b/a)(a − ω) cos ωτ dω
= (b/aπ) [(a − ω)(sin ωτ)/τ − (cos ωτ)/τ²]₀^{a}
= (b/aπτ²)(1 − cos aτ)
= (2b/aπτ²) sin²(aτ/2)
15. (a) (i) Check whether the following systems are linear: (1) Y(t) = tX(t) (2) Y(t) = X²(t).
Ans: A system f is linear if f[a₁X₁(t) + a₂X₂(t)] = a₁f[X₁(t)] + a₂f[X₂(t)].
(1) For Y(t) = tX(t), take X(t) = a₁X₁(t) + a₂X₂(t); then
Y(t) = t[a₁X₁(t) + a₂X₂(t)] = a₁[tX₁(t)] + a₂[tX₂(t)] = a₁Y₁(t) + a₂Y₂(t)
∴ the system Y(t) = tX(t) is linear.
(2) For Y(t) = X²(t),
[a₁X₁(t) + a₂X₂(t)]² = a₁²X₁²(t) + 2a₁a₂X₁(t)X₂(t) + a₂²X₂²(t) ≠ a₁X₁²(t) + a₂X₂²(t)
∴ the system Y(t) = X²(t) is not linear.
(ii) The power spectral density of a signal X(t) is S_X(ω) and its power is P. Find the power of the signal bX(t).
Ans: Let Y(t) = bX(t).
R_YY(τ) = E[Y(t)Y(t + τ)] = E[bX(t) · bX(t + τ)]
R_YY(τ) = b² R_XX(τ)
∴ S_YY(ω) = b² S_X(ω), and the power of bX(t) is R_YY(0) = b² R_XX(0) = b²P.
1. X and Y are independent random variables with variance 2 and 3. Find the variance of
3𝑋𝑋 + 4𝑌𝑌.
Ans: Given 𝑉𝑉(𝑋𝑋) = 2 𝑎𝑎𝑎𝑎𝑎𝑎 𝑉𝑉(𝑌𝑌) = 3.
V(3X + 4Y) = 3²V(X) + 4²V(Y) = 9 × 2 + 16 × 3 = 18 + 48 = 66
2. A continuous random variable X has probability density function f(x) = 3x² for 0 ≤ x ≤ 1 and f(x) = 0 otherwise. Find k such that P(X > k) = 0.5.
Ans: P(X > k) = 0.5
∫_k^1 3x² dx = 0.5
3[x³/3]_k^1 = 0.5
1 − k³ = 0.5
k³ = 0.5
k = (0.5)^{1/3}
3. State central Limit Theorem for iid random variables.
Ans: out of syllabus for 2013 regulation
4. State the basic properties of joint distribution of (X,Y) when X and Y are random
variables.
Ans: (𝑖𝑖) 0 ≤ 𝐹𝐹(𝑥𝑥, 𝑦𝑦) ≤ 1
(𝑖𝑖𝑖𝑖) 𝑃𝑃[𝑎𝑎 < 𝑋𝑋 < 𝑏𝑏, 𝑌𝑌 ≤ 𝑦𝑦] = 𝐹𝐹(𝑏𝑏, 𝑦𝑦) − 𝐹𝐹(𝑎𝑎, 𝑦𝑦)
(𝑖𝑖𝑖𝑖𝑖𝑖) 𝑃𝑃[𝑋𝑋 ≤ 𝑥𝑥, 𝑐𝑐 < 𝑌𝑌 < 𝑑𝑑] = 𝐹𝐹(𝑥𝑥, 𝑑𝑑) − 𝐹𝐹(𝑥𝑥, 𝑐𝑐)
(𝑖𝑖𝑖𝑖) 𝑃𝑃[𝑎𝑎 < 𝑋𝑋 < 𝑏𝑏, 𝑐𝑐 < 𝑌𝑌 < 𝑑𝑑] = 𝐹𝐹(𝑏𝑏, 𝑑𝑑) − 𝐹𝐹(𝑎𝑎, 𝑑𝑑) − 𝐹𝐹(𝑏𝑏, 𝑐𝑐) + 𝐹𝐹(𝑎𝑎, 𝑐𝑐)
(𝑣𝑣) 𝐹𝐹(𝑥𝑥, 𝑦𝑦) is non- decreasing function.
(𝑣𝑣𝑣𝑣) 𝐹𝐹(−∞, 𝑦𝑦) = 0, 𝐹𝐹(𝑥𝑥, −∞) = 0, 𝐹𝐹(∞, ∞) = 1
(vii) At the points of continuity of f(x, y), ∂²F(x, y)/∂x∂y = f(x, y)
5. State the properties of an ergodic process.
Ans: A random process X(t) is said to be ergodic if the ensemble averages are equal
to the corresponding time averages.
6. Explain any two application of binomial process.
Ans: out of syllabus for 2013 regulation
7. Define cross correlation function and state any two of its properties.
Ans: The cross correlation of X(t) and Y(t) is denoted by R_XY(t₁, t₂) and is defined as R_XY(t₁, t₂) = E[X(t₁)Y(t₂)].
Properties: (i) R_XY(τ) = R_YX(−τ); (ii) |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)).
For R_XX(τ) = 25 + 4/(1 + 6τ²):
E[X(t)] = μ_X = √(lim_{τ→∞} R_XX(τ)) = √25 = 5
E[X²(t)] = R_XX(0) = 25 + 4 = 29
Var[X(t)] = E[X²(t)] − (E[X(t)])² = 29 − 5² = 4
9. Define a system. When is it called a linear system?
Ans: Mathematically, a system is a functional relation between input X(t) and
output Y(t). Symbolically, 𝑌𝑌(𝑡𝑡) = 𝑓𝑓[𝑋𝑋(𝑡𝑡)], −∞ < 𝑡𝑡 < ∞
The system said to be linear if for any two inputs 𝑋𝑋1 (𝑡𝑡) 𝑎𝑎𝑎𝑎𝑎𝑎 𝑋𝑋2 (𝑡𝑡)and
constants 𝑎𝑎1 , 𝑎𝑎2
𝑓𝑓[𝑎𝑎1 𝑋𝑋1 (𝑡𝑡) + 𝑎𝑎2 𝑋𝑋2 (𝑡𝑡)] = 𝑎𝑎1 𝑓𝑓[𝑋𝑋1 (𝑡𝑡)] + 𝑎𝑎2 𝑓𝑓[𝑋𝑋2 (𝑡𝑡)]
10. Define Band-Limited white noise.
Ans: Noise having a non-zero and constant power spectral density over a finite frequency band and zero elsewhere is called band-limited white noise.
∴ the PSD of band-limited white noise is given by
S_NN(ω) = N₀/2 for |ω| ≤ W_B, and 0 elsewhere.
Part – B
11. (a) (i) Define the moment generating function of a random variable? Derive the MGF ,
mean, variance and the first four moments of a Gamma distribution.
Ans: The moment generating function of a random variable X is defined as
M_X(t) = E[e^{tX}] = Σ_i e^{t x_i} P(x_i) if X is a discrete RV, and ∫_{−∞}^{∞} f(x) e^{tx} dx if X is a continuous RV.
Moment Generating Function of Gamma distribution
The Gamma distribution is
f(x) = λ e^{−λx} (λx)^{α−1} / Γ(α), x ≥ 0
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} f(x) e^{tx} dx = ∫₀^∞ e^{tx} λ e^{−λx} (λx)^{α−1} / Γ(α) dx
Putting u = (λ − t)x (for t < λ) reduces the integral to M_X(t) = (λ/(λ − t))^α = (1 − t/λ)^{−α}.
M′_X(t) = −α(1 − t/λ)^{−α−1}(−1/λ), so μ′₁ = [M′_X(t)]_{t=0} = α/λ
M″_X(t) = α(α + 1)(1 − t/λ)^{−α−2}(1/λ)², so μ′₂ = [M″_X(t)]_{t=0} = α(α + 1)/λ²
Variance(X) = μ′₂ − (μ′₁)²
= α(α + 1)/λ² − (α/λ)²
= (α² + α − α²)/λ²
∴ Variance(X) = α/λ²
(ii) Describe Binomial 𝐵𝐵(𝑛𝑛, 𝑝𝑝) distribution and obtain the moment generating function.
Hence compute (1) the first four moments and (2) the recursion relation for the central
moments.
Ans: Binomial Distribution
A random variable X is said to follow a binomial distribution if it assumes only non-negative values with probability mass function
P(X = x) = nCx p^x q^{n−x}, x = 0, 1, 2, 3, …, n, where q = 1 − p
Moment generating function of the binomial distribution:
M_X(t) = E(e^{tX}) = Σ_{x=0}^{n} e^{tx} nCx p^x q^{n−x} = Σ_{x=0}^{n} nCx (pe^t)^x q^{n−x} = (q + pe^t)^n
= 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100
P(X ≥ 6) = 1 − P(X < 6) = 1 − 81/100 = 19/100
P(0 < X < 5) = P(X = 1) + P(X = 2) + ⋯ + P(X = 4)
= 1/10 + 2/10 + 2/10 + 3/10
= 8/10 = 4/5
(iii) The smallest n with P(X ≤ n) > 1/2:
P(X ≤ 3) = 1/2, while P(X ≤ 4) = 8/10 > 1/2. Therefore n = 4.
(iv) The distribution function F_X(x) of X is given by
X               : 0, 1,    2,    3,    4,    5,      6,      7
F_X(x) = P(X ≤ x): 0, 1/10, 3/10, 5/10, 8/10, 81/100, 83/100, 100/100 = 1
(ii) Find the MGF of a random variable X having the pdf f(x) = (x/4)e^{−x/2} for x > 0 and 0 elsewhere. Also deduce the first four moments about the origin.
Ans: Given f(x) = (x/4)e^{−x/2}, x > 0; 0 elsewhere.
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} f(x)e^{tx} dx
= ∫₀^∞ (x/4) e^{tx} e^{−x/2} dx
= ∫₀^∞ (x/4) e^{−(1/2 − t)x} dx
= [(x/4) · e^{−(1/2 − t)x}/(−(1/2 − t)) − (1/4) · e^{−(1/2 − t)x}/(1/2 − t)²]₀^∞
= (1/4) · 1/(1/2 − t)²  (for t < 1/2)
= (1/4) · 4/(1 − 2t)² = (1 − 2t)^{−2}
= 1 + 2(2t) + 3(2t)² + 4(2t)³ + 5(2t)⁴ + ⋯
= 1 + 4t + 12t² + 32t³ + 80t⁴ + ⋯
= 1 + 4t + 24·(t²/2!) + 192·(t³/3!) + 1920·(t⁴/4!) + ⋯
Now μ′_r = coefficient of t^r/r! in M_X(t).
The first four moments about the origin are
μ′₁ = 4, μ′₂ = 24, μ′₃ = 192, μ′₄ = 1920
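The series coefficients follow the pattern μ′_r = (r + 1)! 2^r, which can be confirmed directly, and the mean checked by sampling from the underlying Gamma(shape 2, scale 2) distribution (a sketch, not part of the original solution):

```python
import math
import random

# X ~ Gamma(shape 2, rate 1/2): f(x) = (x/4) e^(-x/2), M(t) = (1-2t)^(-2)
# The expansion (1-2t)^(-2) = sum (r+1)(2t)^r gives mu'_r = (r+1)! * 2^r
moments = [math.factorial(r + 1) * 2 ** r for r in range(1, 5)]
assert moments == [4, 24, 192, 1920]

random.seed(3)
sample = [random.gammavariate(2.0, 2.0) for _ in range(100_000)]
mean = sum(sample) / len(sample)
assert abs(mean - 4.0) < 0.1    # mu'_1 = 4
```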
12 (a) (i) If the joint pdf of the two-dimensional random variable (X,Y) is given by
f(x, y) = x² + xy/3 for 0 < x < 1, 0 < y < 2, and 0 otherwise, find
(1) P(X > 1/2)
(2) P(Y < X) and
(3) the conditional density functions.
Ans: Given f(x, y) = x² + xy/3, 0 < x < 1; 0 < y < 2.
The marginal density of X:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀² (x² + xy/3) dy = [x²y + xy²/6]₀² = 2x² + 4x/6
= 2x² + 2x/3, 0 < x < 1
The marginal density of Y:
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (x² + xy/3) dx = [x³/3 + x²y/6]₀¹ = 1/3 + y/6
= (y + 2)/6, 0 < y < 2
(1) P(X > 1/2) = ∫_{1/2}^{1} (2x² + 2x/3) dx = [2x³/3 + x²/3]_{1/2}^{1}
= (2/3 + 1/3) − (1/12 + 1/12) = 1 − 1/6 = 5/6
(2) P(Y < X) = ∫₀¹ ∫₀^{x} (x² + xy/3) dy dx
= ∫₀¹ [x²y + xy²/6]₀^{x} dx
= ∫₀¹ (x³ + x³/6) dx
= ∫₀¹ (7/6) x³ dx
= (7/6)[x⁴/4]₀¹ = 7/24
(3) The conditional density functions:
f(y|x) = f(x, y)/f_X(x) = (x² + xy/3)/(2x² + 2x/3) = (3x² + xy)/(6x² + 2x), 0 < x < 1, 0 < y < 2
f(x|y) = f(x, y)/f_Y(y) = (x² + xy/3)/((y + 2)/6) = 2(3x² + xy)/(y + 2), 0 < x < 1, 0 < y < 2
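P(Y < X) = 7/24 can be confirmed with a midpoint double Riemann sum over the region y < x (an illustrative numerical check, not part of the original answer):

```python
# Midpoint double Riemann sum of f(x, y) = x^2 + x*y/3 over the region y < x
n = 400
h = 1.0 / n          # x runs over (0, 1); the inner y-grid covers (0, x)
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h * x        # map j onto (0, x), spacing h*x
        total += (x * x + x * y / 3.0) * h * (h * x)
assert abs(total - 7.0 / 24.0) < 1e-3
```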
(OR)
(b) (i) The joint pdf of the random variables (X,Y) is f(x, y) = 3(x + y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, x + y ≤ 1. Find Cov(X, Y).
Ans: The marginal density of Y is
f_Y(y) = ∫₀^{1−y} 3(x + y) dx = 3[(1 − y)²/2 + y(1 − y)]
= (3/2)[1 − y²], 0 ≤ y ≤ 1
and by symmetry f_X(x) = (3/2)[1 − x²], 0 ≤ x ≤ 1.
E(X) = ∫₀¹ x f_X(x) dx
= ∫₀¹ (3/2) x[1 − x²] dx
= (3/2) ∫₀¹ (x − x³) dx
= (3/2)(1/2 − 1/4) = 3/8, and similarly E(Y) = 3/8.
E(XY) = ∫₀¹ ∫₀^{1−y} 3xy(x + y) dx dy
= 3 ∫₀¹ [x³y/3 + x²y²/2]₀^{1−y} dy
= 3 ∫₀¹ [(1 − y)³y/3 + (1 − y)²y²/2] dy
= 3 ∫₀¹ {(1/3)[y − 3y² + 3y³ − y⁴] + (1/2)[y² − 2y³ + y⁴]} dy
= 3[(1/3)(y²/2 − y³ + 3y⁴/4 − y⁵/5) + (1/2)(y³/3 − y⁴/2 + y⁵/5)]₀¹
= 3[(1/3)(1/2 − 1 + 3/4 − 1/5) + (1/2)(1/3 − 1/2 + 1/5)] = 1/10
Cov(X, Y) = E(XY) − E(X)E(Y)
= 1/10 − (3/8 × 3/8)
= 1/10 − 9/64
= −13/320
(ii) Marks obtained by 10 students in Mathematics (X) and Statistics (Y) are given below:
X 60 34 40 50 45 40 22 43 42 64
Y 75 32 33 40 45 33 12 30 34 51
Find the two regression lines. Also find Y when X=55.
Ans:
𝑥𝑥 𝑦𝑦 𝑥𝑥 2 𝑦𝑦 2 𝑥𝑥𝑥𝑥
60 75 3600 5625 4500
34 32 1156 1024 1088
40 33 1600 1089 1320
50 40 2500 1600 2000
45 45 2025 2025 2025
40 33 1600 1089 1320
22 12 484 144 264
43 30 1849 900 1290
42 34 1764 1156 1428
64 51 4096 2601 3264
Total 440 385 20674 17253 18499
∑ 𝑥𝑥 440 ∑ 𝑦𝑦 385
𝑥𝑥̅ = = = 44 𝑎𝑎𝑎𝑎𝑎𝑎 𝑦𝑦� = = = 38.5
𝑛𝑛 10 𝑛𝑛 10
𝑛𝑛 ∑ 𝑥𝑥𝑥𝑥 − ∑ 𝑥𝑥 ∑ 𝑦𝑦 (10 × 18499) − (440 × 385)
𝑏𝑏𝑦𝑦𝑦𝑦 = = = 1.1865
𝑛𝑛 ∑ 𝑥𝑥 2 − (∑ 𝑥𝑥)2 (10 × 20674) − (440)2
𝑛𝑛 ∑ 𝑥𝑥𝑥𝑥 − ∑ 𝑥𝑥 ∑ 𝑦𝑦 (10 × 18499) − (440 × 385)
𝑏𝑏𝑥𝑥𝑥𝑥 = = = 0.6414
𝑛𝑛 ∑ 𝑦𝑦 2 − (∑ 𝑦𝑦)2 (10 × 17253) − (385)2
Regression line of y on x is
𝑦𝑦 − 𝑦𝑦� = 𝑏𝑏𝑦𝑦𝑦𝑦 (𝑥𝑥 − 𝑥𝑥̅ )
𝑦𝑦 − 38.5 = 1.1865(𝑥𝑥 − 44)
⟹ 𝑦𝑦 = 1.1865𝑥𝑥 − 13.706
𝑤𝑤ℎ𝑒𝑒𝑒𝑒 𝑥𝑥 = 55, 𝑦𝑦 = (1.1865 × 55) − 13.706 = 51.55
Regression line of x on y is
𝑥𝑥 − 𝑥𝑥̅ = 𝑏𝑏𝑥𝑥𝑥𝑥 (𝑦𝑦 − 𝑦𝑦�)
𝑥𝑥 − 44 = 0.6414(𝑦𝑦 − 38.5)
⟹ x = 0.6414y + 19.306
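The regression coefficients and the prediction at X = 55 can be recomputed from the raw marks with the same formulas as above (a short check):

```python
# Recompute the regression coefficients from the marks table
xs = [60, 34, 40, 50, 45, 40, 22, 43, 42, 64]
ys = [75, 32, 33, 40, 45, 33, 12, 30, 34, 51]
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)
syy = sum(y * y for y in ys)

byx = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope of y on x
bxy = (n * sxy - sx * sy) / (n * syy - sy ** 2)   # slope of x on y
assert abs(byx - 1.1865) < 1e-3
assert abs(bxy - 0.6414) < 1e-3
# predicted Y when X = 55
y55 = sy / n + byx * (55 - sx / n)
assert abs(y55 - 51.55) < 0.05
```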
13 (a) (i) The probability distribution of the process {X(t)} under certain conditions is given by
P{X(t) = n} = (at)^{n−1}/(1 + at)^{n+1} for n = 1, 2, 3, …, and P{X(t) = 0} = at/(1 + at).
Find the mean and variance of the process. Is the process first order stationary?
Ans: Given
X(t) = n     : 0,            1,            2,             3, …
P{X(t) = n}  : at/(1 + at),  1/(1 + at)²,  at/(1 + at)³,  (at)²/(1 + at)⁴, …
E{X(t)} = Σ_{n=0}^{∞} n P{X(t) = n}
= (1/(1 + at)²)[1 + 2(at/(1 + at)) + 3(at/(1 + at))² + ⋯]
= (1/(1 + at)²)[1 − at/(1 + at)]^{−2}
= (1/(1 + at)²)[(1 + at − at)/(1 + at)]^{−2}
= (1/(1 + at)²)(1 + at)²
= 1, which is a constant.
Now E{X²(t)} = Σ_{n=0}^{∞} [n(n + 1) − n] p(x_n)
= (2/(1 + at)²)[1 − at/(1 + at)]^{−3} − E{X(t)}
= 2(1 + at) − 1 = 1 + 2at
Var{X(t)} = E{X²(t)} − [E{X(t)}]² = 1 + 2at − 1 = 2at, which depends on t.
∴ {X(t)} is not first order stationary.
(ii) If the WSS process {𝑋𝑋(𝑡𝑡)} is given by 𝑋𝑋(𝑡𝑡) = 10 cos(100𝑡𝑡 + 𝜃𝜃), where 𝜃𝜃 is
uniformly distributed over(−𝜋𝜋, 𝜋𝜋), prove that {𝑋𝑋(𝑡𝑡)} is correlation ergodic.
Ans: We know that
R_XX(τ) = E(X(t)X(t + τ))
= E(100 cos(100t + θ) cos(100t + 100τ + θ))
= (100/2) E(cos(200t + 100τ + 2θ) + cos 100τ)
= 50 cos 100τ,  since E(cos(200t + 100τ + 2θ)) = 0 for θ uniform over (−π, π).
The time average Z_T = (1/2T) ∫_{−T}^{T} X(t)X(t + τ) dt also tends to 50 cos 100τ as T → ∞, because the contribution of the cos(200t + 100τ + 2θ) term averages to zero over an infinite interval; hence {X(t)} is correlation ergodic.
(OR)
(b) (i) If the process {X(t), t ≥ 0} is a Poisson process with parameter λ, obtain P(X(t) = n). Is the process first order stationary?
Ans:
Probability law for the Poisson process {X(t)}:
Let λ be the mean number of occurrences of the event in unit time, and let P_n(t) = P[X(t) = n]. Then
P_n(t + Δt) = P[X(t + Δt) = n]
= P[(n − 1) calls in (0, t) and 1 call in (t, t + Δt)] + P[n calls in (0, t) and no call in (t, t + Δt)]
= P_{n−1}(t)λΔt + P_n(t)(1 − λΔt)
Letting Δt → 0 gives P′_n(t) = λ[P_{n−1}(t) − P_n(t)]; with the trial solution P_n(t) = f(t)(λt)^n/n! this reduces to
(λ^n/n!) t^n f′(t) = −λ(λ^n t^n/n!) f(t), i.e. f′(t) = −λf(t), so f(t) = e^{−λt} and
P[X(t) = n] = e^{−λt}(λt)^n/n!, n = 0, 1, 2, …
Since this distribution depends on t, the Poisson process is not first order stationary.
Mean = E[X(t)] = Σ_{n=0}^{∞} n P_n(t)
= Σ_{n=0}^{∞} n (λt)^n e^{−λt}/n!
= e^{−λt} Σ_{n=1}^{∞} (λt)^n/(n − 1)!
= e^{−λt} [(λt)¹/0! + (λt)²/1! + (λt)³/2! + ⋯]
= e^{−λt} λt [1 + λt/1! + (λt)²/2! + ⋯]
= e^{−λt} λt e^{λt}
Mean = λt
(ii) Prove that the random processes X(t) and Y(t) defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt, where ω is a constant and A and B are independent random variables both having zero mean and variance σ², are jointly wide sense stationary.
Ans: E[X(t)] = cos ωt E(A) + sin ωt E(B) = 0, and likewise E[Y(t)] = 0.
R_XX(t, t + τ) = E(A²) cos ωt cos ω(t + τ) + E(B²) sin ωt sin ω(t + τ) = σ² cos ωτ, and similarly R_YY(τ) = σ² cos ωτ; so each process is WSS.
The cross correlation of X(t) and Y(t) is
R_XY(t, t + τ) = E[X(t)Y(t + τ)]
= E[(A cos ωt + B sin ωt)(B cos ω(t + τ) − A sin ω(t + τ))]
= E(AB)[cos ωt cos ω(t + τ) − sin ωt sin ω(t + τ)] − E(A²) cos ωt sin ω(t + τ) + E(B²) sin ωt cos ω(t + τ)
Since E(AB) = E(A)E(B) = 0 and E(A²) = E(B²) = σ²,
R_XY(t, t + τ) = σ²[sin ωt cos ω(t + τ) − cos ωt sin ω(t + τ)]
= σ² sin(ωt − ω(t + τ))
= −σ² sin ωτ
which is a function of τ alone; hence X(t) and Y(t) are jointly WSS processes.
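For these processes R_XY(τ) = E[X(t)Y(t + τ)] works out to −σ² sin ωτ (note the sign), which can be checked by averaging over independent draws of A and B. The sketch below assumes normal A and B (any zero-mean pair with variance σ² would do) and illustrative parameter values:

```python
import math
import random

random.seed(5)
sigma, omega, t, tau, n = 1.2, 2.0, 0.7, 0.4, 200_000

acc = 0.0
for _ in range(n):
    # A, B independent, zero mean, variance sigma^2 (normal is one choice)
    A = random.gauss(0.0, sigma)
    B = random.gauss(0.0, sigma)
    X = A * math.cos(omega * t) + B * math.sin(omega * t)
    Y = B * math.cos(omega * (t + tau)) - A * math.sin(omega * (t + tau))
    acc += X * Y
est = acc / n
assert abs(est - (-sigma ** 2 * math.sin(omega * tau))) < 0.03
```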
(OR)
(b) State and prove the Wiener–Khinchine theorem.
Ans: Statement:
Let X(t) be a real WSS process with power density spectrum S_XX(ω). Let X_T(t) be the portion of X(t) in the time interval −T < t < T:
X_T(t) = X(t) for −T < t < T, and 0 elsewhere.
Let X_T(ω) be the Fourier transform of X_T(t). Then
S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}
Proof: Since X_T(ω) is the Fourier transform of X_T(t),
X_T(ω) = ∫_{−∞}^{∞} X_T(t) e^{−iωt} dt = ∫_{−T}^{T} X(t) e^{−iωt} dt
Since X(t) is real,
|X_T(ω)|² = ∫_{−T}^{T} X(t₁) e^{iωt₁} dt₁ · ∫_{−T}^{T} X(t₂) e^{−iωt₂} dt₂
= ∫_{−T}^{T} ∫_{−T}^{T} X(t₁) X(t₂) e^{−iω(t₂ − t₁)} dt₁ dt₂
∴ lim_{T→∞} (1/2T) E{|X_T(ω)|²} = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t₂ − t₁) e^{−iω(t₂ − t₁)} dt₁ dt₂
Change variables t = t₁, τ = t₂ − t₁; the Jacobian is
J = |∂(t₁, t₂)/∂(t, τ)| = |1 0; 1 1| = 1, so dt₁ dt₂ = dt dτ
The limits of t are −T and T; when t₂ = −T, τ = −T − t and when t₂ = T, τ = T − t.
∴ lim_{T→∞} (1/2T) E{|X_T(ω)|²} = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ dt
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ · ∫_{−T}^{T} dt
= lim_{T→∞} (1/2T) ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ · 2T
= lim_{T→∞} ∫_{−T−t}^{T−t} R_XX(τ) e^{−iωτ} dτ
= ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ = S_XX(ω), by definition.
∴ S_XX(ω) = lim_{T→∞} (1/2T) E{|X_T(ω)|²}. Hence proved.
15 (a) (i) Show that the input {𝑋𝑋(𝑡𝑡)} is a WSS process for a linear system then output {𝑌𝑌(𝑡𝑡)} is
a WSS process. Also find 𝑅𝑅𝑌𝑌𝑌𝑌 (𝜏𝜏).
Ans: Let X(t) be a WSS input to a linear time-invariant stable system with output process Y(t). Then
Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, where h(t) is the weighting function (unit impulse response).
∴ E[Y(t)] = ∫_{−∞}^{∞} h(u) E[X(t − u)] du = μ_X ∫_{−∞}^{∞} h(u) du
Since the system is stable, ∫_{−∞}^{∞} h(u) du is finite, and since X(t) is WSS, E[X(t − u)] = μ_X does not depend on t.
∴ E[Y(t)] is a constant.
Now R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
= E[∫_{−∞}^{∞} h(u₁) X(t − u₁) du₁ · ∫_{−∞}^{∞} h(u₂) X(t + τ − u₂) du₂]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u₁) h(u₂) E[X(t − u₁) X(t + τ − u₂)] du₁ du₂
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u₁) h(u₂) R_XX(τ + u₁ − u₂) du₁ du₂ = R_YY(τ),
a function of τ alone. Hence {Y(t)} is a WSS process.
(ii) If {X(t)} is the input voltage to a circuit and {Y(t)} is the output voltage, where {X(t)} is a stationary random process with μ_X = 0 and R_XX(τ) = e^{−α|τ|}, find the mean μ_Y and the power spectrum S_YY(ω) of the output if the power transfer function is H(ω) = R/(R + iLω).
Ans: (i) We know that Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du
∴ E[Y(t)] = ∫_{−∞}^{∞} h(u) E[X(t − u)] du
Since X(t) is stationary with mean 0, E[X(t)] = 0 for all t, so E[X(t − u)] = 0 and
μ_Y = E[Y(t)] = 0
(ii) S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ = ∫_{−∞}^{∞} e^{−α|τ|} e^{−iωτ} dτ
= ∫_{−∞}^{0} e^{(α − iω)τ} dτ + ∫_{0}^{∞} e^{−(α + iω)τ} dτ
= [e^{(α − iω)τ}/(α − iω)]_{−∞}^{0} + [e^{−(α + iω)τ}/(−(α + iω))]_{0}^{∞}
= (1/(α − iω))[1 − 0] − (1/(α + iω))[0 − 1]
= 1/(α − iω) + 1/(α + iω)
= (α + iω + α − iω)/((α − iω)(α + iω))
= 2α/(α² + ω²)
Given H(ω) = R/(R + iLω), we know that S_YY(ω) = |H(ω)|² S_XX(ω)
= (R²/(R² + L²ω²)) · (2α/(α² + ω²))
To find R_YY(τ), write 1/((R² + L²ω²)(α² + ω²)) as a partial fraction:
1/((R² + L²ω²)(α² + ω²)) = (1/(α²L² − R²)) [L²/(R² + L²ω²) − 1/(α² + ω²)]
∴ R_YY(τ) = (2αR²/(α²L² − R²)) F^{−1}[L²/(R² + L²ω²)] − (2αR²/(α²L² − R²)) F^{−1}[1/(α² + ω²)]
Using F^{−1}[2a/(a² + ω²)] = e^{−a|τ|}:
F^{−1}[L²/(R² + L²ω²)] = F^{−1}[1/((R/L)² + ω²)] = (L/2R) e^{−(R/L)|τ|}
F^{−1}[1/(α² + ω²)] = (1/2α) e^{−α|τ|}
∴ R_YY(τ) = (αLR/(α²L² − R²)) e^{−(R/L)|τ|} − (R²/(α²L² − R²)) e^{−α|τ|}
= (1/(α² − R²/L²)) [α(R/L) e^{−(R/L)|τ|} − (R²/L²) e^{−α|τ|}]
(OR)
(b) (i) If 𝑌𝑌(𝑡𝑡) = 𝐴𝐴 cos(𝜔𝜔𝜔𝜔 + 𝜃𝜃) + 𝑁𝑁(𝑡𝑡), where A is a constant, 𝜃𝜃 is a random variable with
a uniform distribution (−𝜋𝜋, 𝜋𝜋), and {𝑁𝑁(𝑡𝑡)} is a band limited Gaussian white noise
𝑁𝑁0
𝑓𝑓𝑓𝑓𝑓𝑓 |𝜔𝜔 − 𝜔𝜔0 | < 𝜔𝜔𝐵𝐵
with power spectral density 𝑆𝑆𝑁𝑁𝑁𝑁 (𝜔𝜔) = � 2
0, 𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒ℎ𝑒𝑒𝑒𝑒𝑒𝑒
Find the power spectral density of Y(t). Assume that {𝑁𝑁(𝑡𝑡)} and 𝜃𝜃 are independent.
Ans: Y(t)Y(t + τ) = [A cos(ω₀t + θ) + N(t)][A cos(ω₀(t + τ) + θ) + N(t + τ)]
= A² cos(ω₀t + θ) cos(ω₀(t + τ) + θ) + N(t)N(t + τ) + A cos(ω₀t + θ) N(t + τ) + A cos(ω₀(t + τ) + θ) N(t)
R_YY(t, t + τ) = A² E[cos(ω₀t + θ) cos(ω₀(t + τ) + θ)] + E[N(t)N(t + τ)]
+ A E[cos(ω₀t + θ)] E[N(t + τ)] + A E[cos(ω₀(t + τ) + θ)] E[N(t)]
Since 𝑁𝑁(𝑡𝑡) 𝑎𝑎𝑎𝑎𝑎𝑎 𝜃𝜃 are independent.
By hypothesis 𝐸𝐸[𝑁𝑁(𝑡𝑡)] = 0, 𝐸𝐸[𝑁𝑁(𝑡𝑡 + 𝜏𝜏) = 0
∴ R_YY(τ) = (A²/2) cos ω₀τ + R_NN(τ)
S_YY(ω) = (A²/2) ∫_{−∞}^{∞} cos ω₀τ e^{−iωτ} dτ + S_NN(ω)
= (πA²/2) [δ(ω − ω₀) + δ(ω + ω₀)] + S_NN(ω)
(ii) A system has an impulse response ℎ(𝑡𝑡) = 𝑒𝑒 − 𝛽𝛽𝛽𝛽 𝑈𝑈(𝑡𝑡), find the power spectral density
of the output Y(t) corresponding to the input X(t).
Ans: Given X(t) is the input process and Y(t) is the output process of a linear system with impulse response h(t) = e^{−βt} U(t).
We know that S_YY(ω) = |H(ω)|² S_XX(ω), where H(ω) is the Fourier transform of h(t).
Since the unit step function U(t) = 0 for t < 0 and 1 for t ≥ 0,
h(t) = e^{−βt} for t ≥ 0 and 0 for t < 0.
∴ H(ω) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^{−iωt} dt
= ∫₀^∞ e^{−βt} e^{−iωt} dt   [h(t) = 0 if t < 0]
= ∫₀^∞ e^{−(β + iω)t} dt
= [e^{−(β + iω)t}/(−(β + iω))]₀^∞
= −(1/(β + iω))[0 − 1]
= 1/(β + iω)
∴ |H(ω)|² = 1/(β² + ω²)
∴ S_YY(ω) = (1/(β² + ω²)) S_XX(ω)
For the process with R_XX(τ) = 25 + 4/(1 + 6τ²):
E[X(t)] = μ_X = √(lim_{τ→∞} R_XX(τ)) = √25 = 5
E[X²(t)] = R_XX(0) = 25 + 4 = 29
Var[X(t)] = E[X²(t)] − (E[X(t)])² = 29 − 5² = 4
8 Prove that the spectral density of a real random process is an even
function.
Ans: The spectral density function of a real random process is an even
function. 𝑖𝑖. 𝑒𝑒, 𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) = 𝑆𝑆𝑋𝑋𝑋𝑋 (−𝜔𝜔)
9 Define causal system.
Ans: A causal system (also known as a physical or
nonanticipative system) is a system where the output depends on
past and current inputs but not future inputs
10 Define transfer function of a system.
Ans: The output Y(t) is a linear time invariant system is fully determined
by the impulse response h(t). The Fourier Transform of the impulse
response h(t) is defined by H(ω) = ∫_{−∞}^{∞} h(t) e^{−jωt} dt. Here H(ω) is called the transfer function of the system.
PART – B
11. (a) (i) The pdf of a random variable X is given by f(x) = 2x for 0 ≤ x ≤ b and 0 otherwise. For what value of b is f(x) a valid pdf? Also find the cdf of the random variable X with the above pdf.
Ans: Since ∫_{−∞}^{∞} f(x) dx = 1,
∫₀^b 2x dx = 1
[2x²/2]₀^b = 1
b² − 0 = 1, b = ±1
Take b = 1, since a pdf requires b > 0.
The cdf of X is F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
(i) If x < 0:
F(x) = ∫_{−∞}^{x} f(t) dt = 0
(ii) If 0 < x < 1:
F(x) = ∫_{−∞}^{0} f(t) dt + ∫₀^{x} f(t) dt
= 0 + ∫₀^{x} 2t dt = [t²]₀^{x} = x²
(iii) If x > 1:
F(x) = ∫_{−∞}^{0} f(t) dt + ∫₀^{1} f(t) dt + ∫₁^{x} f(t) dt
= 0 + [t²]₀^{1} + 0 = 1
The cdf of X is F(x) = 0 for x < 0; x² for 0 < x < 1; 1 for x > 1.
P[X > k] = q^k p/(1 − q) = q^k p/p = q^k
Hence P[X > s + t] = q^{s+t}
(OR)
(b) (i) Find the moment generating function of the Poisson distribution and hence find its mean and variance.
Ans: M_X(t) = E[e^{tX}] = Σ_{x=0}^{∞} e^{tx} p(x)
= Σ_{x=0}^{∞} e^{tx} e^{−λ} λ^x/x!
= e^{−λ} Σ_{x=0}^{∞} (λe^t)^x/x!
= e^{−λ} [1 + λe^t/1! + (λe^t)²/2! + ⋯]
= e^{−λ} e^{λe^t}
M_X(t) = e^{λ(e^t − 1)}
[dM_X(t)/dt]_{t=0} = [e^{λ(e^t − 1)} λe^t]_{t=0} = e^{λ(e⁰ − 1)} λe⁰ = λ
Mean = λ
[d²M_X(t)/dt²]_{t=0} = [e^{λ(e^t − 1)} λe^t · λe^t + e^{λ(e^t − 1)} λe^t]_{t=0}
E[X²] = λ² + λ
Variance(X) = E[X²] − [E[X]]²
= λ + λ² − λ²
Var(X) = λ
(ii) In a normal distribution, 31% of items are under 45 and 8% of items over
64. Find the mean and the standard deviation of the distribution.
Ans: Let the mean be μ and the standard deviation σ, and let Z = (X − μ)/σ.
The conditional distribution of X given Y = y: P(X = x | Y = y) = P(X = x, Y = y)/P(Y = y)
P(X = 0 | Y = 1) = (3/72)/(15/72) = 1/5,  P(X = 1 | Y = 1) = (5/72)/(15/72) = 1/3,  P(X = 2 | Y = 1) = (7/72)/(15/72) = 7/15
P(X = 0 | Y = 2) = (6/72)/(24/72) = 1/4,  P(X = 1 | Y = 2) = (8/72)/(24/72) = 1/3,  P(X = 2 | Y = 2) = (10/72)/(24/72) = 5/12
P(X = 0 | Y = 3) = (9/72)/(33/72) = 9/33,  P(X = 1 | Y = 3) = (11/72)/(33/72) = 1/3,  P(X = 2 | Y = 3) = (13/72)/(33/72) = 13/33
The conditional distribution of X given Y = 2:
X : 0, 1, 2 with P(X = x | Y = 2) = 1/4, 1/3, 5/12
The conditional distribution of X given Y = 3:
X : 0, 1, 2 with P(X = x | Y = 3) = 9/33, 1/3, 13/33
The conditional distribution of Y given X = 0:
Y : 1, 2, 3 with P(Y = y | X = 0) = 3/18 = 1/6, 6/18 = 1/3, 9/18 = 1/2
(OR)
(b) (i) The joint pdf of a random variable (X,Y) is f(x, y) = 25e^{−5y}; 0 < x < 0.2, y > 0. Find the covariance of X and Y.
Ans: The marginal density of X:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀^∞ 25e^{−5y} dy = 25[e^{−5y}/(−5)]₀^∞ = 5, 0 < x < 0.2
The marginal density of Y:
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀^{0.2} 25e^{−5y} dx = 25e^{−5y}[x]₀^{0.2} = 25e^{−5y}(0.2 − 0) = 5e^{−5y}, y > 0
Cov(X, Y) = E(XY) − E(X)E(Y)
E(X) = ∫₀^{0.2} 5x dx = 5(0.02) = 0.1;  E(Y) = ∫₀^∞ 5y e^{−5y} dy = 1/5 = 0.2
E(XY) = ∫₀^∞ ∫₀^{0.2} xy · 25e^{−5y} dx dy = ∫₀^∞ 25y e^{−5y} [x²/2]₀^{0.2} dy = 0.5 ∫₀^∞ y e^{−5y} dy = 0.5 × (1/25) = 0.02
∴ Cov(X, Y) = 0.02 − (0.1)(0.2) = 0. Indeed f(x, y) = f_X(x)f_Y(y), so X and Y are independent and the covariance is zero.
A takes the values −2 and 1 with probabilities 1/3 and 2/3; B has the same distribution.
E[A] = Σ a_i p(a_i) = (−2)(1/3) + (1)(2/3) = −2/3 + 2/3 = 0.
E[A²] = Σ a_i² p(a_i) = (−2)²(1/3) + (1)²(2/3) = 4/3 + 2/3 = 2.
E[B] = Σ b_i p(b_i) = (−2)(1/3) + (1)(2/3) = 0
E[B²] = Σ b_i² p(b_i) = (−2)²(1/3) + (1)²(2/3) = 2.
Since A & B are independent RV's, E[AB] = E[A]E[B] = 0.
(ii) Suppose customers arrive at a bank according to a Poisson process with a mean rate of 3 per minute. Find the probability that during a time interval of two minutes (1) exactly four customers arrive, (2) more than four customers arrive, (3) fewer than four customers arrive.
Ans: Given λ = 3 per minute, t = 2.
P[X(t) = n] = e^{−λt}(λt)^n/n!, n = 0, 1, 2, 3, …
= e^{−3×2}(3 × 2)^n/n! = e^{−6} 6^n/n!
(1) P[X(2) = 4] = e^{−6} 6⁴/4! = 0.1338
(3) P[X(2) < 4] = e^{−6}6⁰/0! + e^{−6}6¹/1! + e^{−6}6²/2! + e^{−6}6³/3!
= e^{−6}[1 + 6 + 18 + 36] = 0.1512
(2) P[X(2) > 4] = 1 − [P(X(2) < 4) + P(X(2) = 4)]
= 1 − (0.1512 + 0.1338) = 0.715
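The three Poisson probabilities follow directly from the pmf with λt = 6; a short check:

```python
import math

lam_t = 6.0   # lambda * t = 3 per minute * 2 minutes

def pois(n):
    # Poisson pmf P[X(t) = n] = e^(-lambda t) (lambda t)^n / n!
    return math.exp(-lam_t) * lam_t ** n / math.factorial(n)

p4 = pois(4)                             # exactly four arrivals
p_lt4 = sum(pois(n) for n in range(4))   # fewer than four
p_gt4 = 1.0 - (p_lt4 + p4)               # more than four

assert abs(p4 - 0.1338) < 1e-3
assert abs(p_lt4 - 0.1512) < 1e-3
assert abs(p_gt4 - 0.715) < 1e-3
```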
(OR)
(b) (i) A man either drives a car or catches a train to go to the office each day. He never goes by train two days in a row; if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Suppose that on the first day of the week the man tossed a fair die and drove to work iff a 6 appeared. Find the probability that he takes a train on the fourth day and the probability that he drives to work on the fifth day.
Ans: The travel pattern is a Markov chain with state space (Train, Car) and TPM
P = [ 0    1
      1/2  1/2 ]
(from Train the next day is always Car; from Car the next day is Train or Car with probability 1/2 each).
Since P(travelling by car on day 1) = P(getting 6 in the toss of the die) = 1/6 and P(travelling by train) = 5/6, the initial state probability distribution is P^{(1)} = (5/6, 1/6).
P^{(2)} = P^{(1)} P = (5/6, 1/6) P = (1/12, 11/12)
P^{(3)} = P^{(2)} P = (11/24, 13/24)
P^{(4)} = P^{(3)} P = (13/48, 35/48)
∴ P(the man travels by train on the fourth day) = 13/48
P^{(5)} = P^{(4)} P = (35/96, 61/96), so P(he drives to work on the fifth day) = 61/96.
The long run probability is the limiting probability π = (π₀, π₁) with πP = π and π₀ + π₁ = 1:
π₁/2 = π₀ and π₀ + π₁/2 = π₁ are one and the same equation, giving π₁ = 2π₀; with π₀ + π₁ = 1,
π₀ = 1/3
π₁ = 2/3
P(the man travels by car in the long run) = 2/3
{N(t)} is a Poisson process with P{N(t) = n} = e^{−λt}(λt)^n/n!, n = 0, 1, 2, …
P[X(t) = 1] = P[N(t) is even] = e^{−λt}[1 + (λt)²/2! + (λt)⁴/4! + ⋯] = e^{−λt} cosh λt
P[X(t) = −1] = P[N(t) is odd] = e^{−λt}[(λt)¹/1! + (λt)³/3! + (λt)⁵/5! + ⋯] = e^{−λt} sinh λt
E{X(t)} = e^{−λt}[cosh λt − sinh λt] = e^{−2λt}, which depends on t.
So {X(t)} is evolutionary.
14. (a) (i) Two random processes {X(t)} and {Y(t)} are defined as X(t) = A cos(ωt + θ) and Y(t) = B sin(ωt + θ), where A, B and ω are constants and θ is a uniformly distributed random variable over (0, 2π). Find the cross correlation function of {X(t)} and {Y(t)}.
Ans: The cross correlation of X(t) and Y(t) is
R_XY(t, t + τ) = E[X(t)Y(t + τ)]
= AB E[cos(ωt + θ) sin(ωt + ωτ + θ)]
= (AB/2) E[2 sin(ωt + ωτ + θ) cos(ωt + θ)]
= (AB/2) E[sin(2ωt + ωτ + 2θ) + sin ωτ]
= (AB/2) E[sin(2ωt + ωτ + 2θ)] + (AB/2) E[sin ωτ]
= (AB/2) sin ωτ + (AB/2) ∫₀^{2π} sin(2ωt + ωτ + 2θ) f(θ) dθ
= (AB/2) sin ωτ + (AB/2) ∫₀^{2π} sin(2ωt + ωτ + 2θ) (1/2π) dθ
= (AB/2) sin ωτ + (AB/4π) [−cos(2ωt + ωτ + 2θ)/2]₀^{2π}
= (AB/2) sin ωτ − (AB/8π) [cos(2ωt + ωτ + 4π) − cos(2ωt + ωτ)]
= (AB/2) sin ωτ, since cos(2ωt + ωτ + 4π) = cos(2ωt + ωτ)
∴ R_XY(τ) = (AB/2) sin ωτ
S_XX(ω) = (1/4) [ 1/(1 − iω)² + 1/(1 + iω)² + 2/((1 + iω)(1 − iω)) ]
∴ R_XX(τ) = F⁻¹[S_XX(ω)] = F⁻¹[ (1/4) { 1/(1 − iω)² + 1/(1 + iω)² + 2/(1 + ω²) } ]
We know that F⁻¹[ 1/(α + iω)² ] = u(τ) τ e^(−ατ), where u(τ) is the unit step function, α > 0,
F⁻¹[ 1/(α − iω)² ] = u(τ) τ e^(ατ), and F⁻¹[ 2α/(α² + ω²) ] = e^(−α|τ|).
∴ R_XX(τ) = (1/4) [ u(τ) τ e^(τ) + u(τ) τ e^(−τ) + e^(−|τ|) ]
= (1/4) [ u(τ) τ (e^(τ) + e^(−τ)) + e^(−|τ|) ]
Since X(t) is WSS, the average power is P_XX = E[X²(t)] = R_XX(0).
∴ Average power = R_XX(0) = 1/4 = 0.25
(OR)
(b) (i) If Y(t) = X(t + a) − X(t − a), prove that R_YY(τ) = 2R_XX(τ) − R_XX(τ + 2a) − R_XX(τ − 2a). Hence prove that S_YY(ω) = 4 sin²(aω) S_XX(ω).
Ans: The autocorrelation function of Y(t) is given by
Y(t) = X(t + a) − X(t − a)
R_YY(τ) = E[Y(t)Y(t + τ)]
= E[{X(t + a) − X(t − a)}{X(t + a + τ) − X(t − a + τ)}]
= E[X(t + a)X(t + a + τ)] − E[X(t + a)X(t − a + τ)] − E[X(t − a)X(t + a + τ)] + E[X(t − a)X(t − a + τ)]
= 2R_XX(τ) − R_XX(τ − 2a) − R_XX(τ + 2a)
The power spectral density of Y(t) is
S_YY(ω) = ∫_{−∞}^{∞} R_YY(τ) e^(−iωτ) dτ
= 2 ∫_{−∞}^{∞} R_XX(τ) e^(−iωτ) dτ − ∫_{−∞}^{∞} R_XX(τ − 2a) e^(−iωτ) dτ − ∫_{−∞}^{∞} R_XX(τ + 2a) e^(−iωτ) dτ
= 2S_XX(ω) − ∫_{−∞}^{∞} R_XX(τ − 2a) e^(−iωτ) dτ − ∫_{−∞}^{∞} R_XX(τ + 2a) e^(−iωτ) dτ
Let τ − 2a = u ⇒ τ = u + 2a, dτ = du, and τ + 2a = v ⇒ τ = v − 2a, dτ = dv
= 2S_XX(ω) − ∫_{−∞}^{∞} R_XX(u) e^(−iω(u + 2a)) du − ∫_{−∞}^{∞} R_XX(v) e^(−iω(v − 2a)) dv
= 2S_XX(ω) − e^(−2iaω) S_XX(ω) − e^(+2iaω) S_XX(ω)
= 2S_XX(ω) − S_XX(ω)[e^(2iaω) + e^(−2iaω)]
= 2S_XX(ω) − 2S_XX(ω) cos 2aω
= 2S_XX(ω)(1 − cos 2aω)
= 2S_XX(ω) · 2 sin²(aω)
∴ S_YY(ω) = 4S_XX(ω) sin²(aω)
(ii) The autocorrelation function of the random telegraph signal process is given by R(τ) = a² e^(−2γ|τ|). Determine the power density spectrum of the process.
Ans: S(ω) = 4a²γ / (4γ² + ω²)
∞
15. (a) If {𝑋𝑋(𝑡𝑡)} is a WSS process and if 𝑌𝑌(𝑡𝑡) = ∫− ∞ ℎ(𝑢𝑢)𝑋𝑋(𝑡𝑡 − 𝑢𝑢)𝑑𝑑𝑑𝑑, Prove
that
(i) R_XY(τ) = R_XX(τ) ∗ h(−τ)
(ii) R_YY(τ) = R_XY(τ) ∗ h(τ), where ∗ denotes convolution.
(iii) S_XY(ω) = S_XX(ω) H*(ω), where H*(ω) is the complex conjugate of H(ω).
(iv) S_YY(ω) = S_XX(ω) |H(ω)|²
Ans: We know that
(a) R_XY(t₁, t₂) = ∫_{−∞}^{∞} h(β) R_XX(t₁, t₂ − β) dβ
R_XY(t₁, t₂) = ∫_{−∞}^{∞} h(β) R_XX(t₁ − t₂ − β) dβ, since X(t) is a WSS process,
and R_XX(t₁ − t₂ − β) = R_XX(τ − β), where τ = t₂ − t₁.
Hence
R_XY(τ) = ∫_{−∞}^{∞} h(β) R_XX(τ − β) dβ
⇒ R_XY(τ) = h(τ) ∗ R_XX(τ) -------------------(1)
(b) We know
R_YX(t₁, t₂) = ∫_{−∞}^{∞} h(α) R_XX(t₁ − t₂ − α) dα
= ∫_{−∞}^{∞} h(α) R_XX(−τ − α) dα, since X(t) is a WSS process.
R_YX(τ) = ∫_{−∞}^{∞} h(α) R_XX(−τ − α) dα
⇒ R_YX(τ) = h(−τ) ∗ R_XX(τ) ---------------------(2)
(OR)
(b) A random process 𝑋𝑋(𝑡𝑡) is the input to a linear system whose impulse
response is ℎ(𝑡𝑡) = 2𝑒𝑒 − 𝑡𝑡 , 𝑡𝑡 ≥ 0. If the autocorrelation function of the
process is 𝑅𝑅(𝜏𝜏) = 𝑒𝑒 − 2|𝜏𝜏| , determine the cross correlation function 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏)
between the input process 𝑋𝑋(𝑡𝑡) and the output 𝑌𝑌(𝑡𝑡) and the cross
correlation function 𝑅𝑅𝑌𝑌𝑌𝑌 (𝜏𝜏) between the output process 𝑌𝑌(𝑡𝑡) and the
input process 𝑋𝑋(𝑡𝑡).
Ans: The cross correlation between input X(t) and output Y(t) to a linear
system is
R_XY(τ) = R_XX(τ) ∗ h(τ)
Taking Fourier transform we get
𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) = 𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔)𝐻𝐻(𝜔𝜔)
Given 𝑅𝑅(𝜏𝜏) = 𝑒𝑒 − 2|𝜏𝜏|
∞
𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) = � 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏)𝑒𝑒 −𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
−∞
∞
− 2|𝜏𝜏| −𝑖𝑖𝑖𝑖𝑖𝑖
= � 𝑒𝑒 𝑒𝑒 𝑑𝑑𝑑𝑑
−∞
∞
= � 𝑒𝑒 − 2|𝜏𝜏| (𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐 − 𝑖𝑖𝑖𝑖𝑖𝑖𝑖𝑖𝑖𝑖𝑖𝑖) 𝑑𝑑𝑑𝑑
−∞
∞ ∞
− 2|𝜏𝜏|
= � 𝑒𝑒 𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐 𝑑𝑑𝑑𝑑 − 𝑖𝑖 � 𝑒𝑒 − 2|𝜏𝜏| 𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠 𝑑𝑑𝑑𝑑
−∞ −∞
∞
− 2𝜏𝜏
= 2 � 𝑒𝑒 𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐 𝑑𝑑𝑑𝑑
0
= 2 [ e^(−2τ)/(4 + ω²) (−2 cos ωτ + ω sin ωτ) ]₀^∞
= [2/(4 + ω²)] [0 − (−2 + 0)]
= 4/(4 + ω²)
H(ω) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^(−iωt) dt
∞ ∞
− 𝑡𝑡 − 𝑖𝑖𝑖𝑖𝑖𝑖
= � 2𝑒𝑒 𝑒𝑒 𝑑𝑑𝑑𝑑 = 2 � 𝑒𝑒 − (1+𝑖𝑖𝑖𝑖 )𝑡𝑡 𝑑𝑑𝑑𝑑
0 0
∞
𝑒𝑒 − (1+𝑖𝑖𝑖𝑖 )𝑡𝑡
= 2� �
−(1 + 𝑖𝑖𝑖𝑖) 0
2
=
1 + 𝑖𝑖𝑖𝑖
∴ S_XY(ω) = [4/(4 + ω²)] · [2/(1 + iω)] = 8/[(2 + iω)(2 − iω)(1 + iω)]
Let 8/[(2 + iω)(2 − iω)(1 + iω)] = A/(2 + iω) + B/(2 − iω) + C/(1 + iω)
1. Show that the function f(x) = { e^(−x), x ≥ 0; 0, x < 0 } is the probability density function (pdf) of a random variable X.
Solution: Clearly f(x) ≥ 0, and
∫_{−∞}^{∞} f(x) dx = ∫₀^∞ e^(−x) dx = [−e^(−x)]₀^∞ = [0 + 1] = 1
Since f(x) ≥ 0 and ∫_{−∞}^{∞} f(x) dx = 1, the given f(x) is a pdf.
2. The mean and the variance of binomial distribution are 5 and 4. Determine the
distribution.
Solution: Let x be a binomial random variable with parameters n and p.
P(X=x)=n𝐶𝐶𝑥𝑥 𝑝𝑝 𝑥𝑥 𝑞𝑞 𝑛𝑛−𝑥𝑥 𝑥𝑥 = 0,1,2, …n
Mean=np, variance=npq
Given mean=5 and variance=4
np = 5 → (1), npq = 4 → (2)
(2)/(1) ⇒ npq/np = 4/5
q = 4/5, p = 1 − q = 1 − 4/5 = 1/5
np = 5 ⇒ n(1/5) = 5 ⇒ n = 25
P(X = x) = 25C_x (1/5)^x (4/5)^(25−x),  x = 0, 1, 2, …, 25
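The parameter recovery above can be sketched in a few lines of Python (not part of the original solution; the sanity check that the resulting pmf sums to one is an extra step):

```python
from math import comb

mean, variance = 5, 4
q = variance / mean          # npq / np = q
p = 1 - q
n = round(mean / p)          # np = 5  =>  n = 25

# Binomial pmf with the recovered parameters; it should sum to 1.
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

print(n, round(p, 2), round(q, 2))   # 25 0.2 0.8, i.e. p = 1/5, q = 4/5
print(round(sum(pmf), 6))            # 1.0
```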
When x → 0 ⇒ t → 0, and x → ∞ ⇒ t → ∞.
Put y² = v, 2y dy = dv, i.e. y dy = dv/2.
As y → 0 ⇒ v → 0, and y → ∞ ⇒ v → ∞.
(1) ⇒ k [∫₀^∞ e^(−t) dt/2] [∫₀^∞ e^(−v) dv/2] = 1
k [−e^(−t)/2]₀^∞ [−e^(−v)/2]₀^∞ = 1
k (1/2)(1/2) = 1
k/4 = 1
∴ k = 4
4. What is the angle between the two regression lines?
Solution: If 𝜃𝜃 is the angle between two regression lines , then
tan θ = [(1 − r²)/r] · [σₓσ_y/(σₓ² + σ_y²)]
in (0, t) and X(t) = (−1)^{N(t)}, then {X(t)} is called a semi-random telegraph process.
Here {N(t)} is a Poisson process with P{N(t) = n} = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, …
8. Find the autocorrelation function whose spectral density is S(ω) = { π, |ω| ≤ 1; 0, otherwise }.
Solution: R(τ) = (1/2π) ∫_{−∞}^{∞} S(ω) e^(iωτ) dω
= (1/2π) ∫_{−1}^{1} π e^(iωτ) dω
= (π/2π) [e^(iωτ)/(iτ)]_{−1}^{1}
= (1/τ) [(e^(iτ) − e^(−iτ))/2i]
= (1/τ) sin τ
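The inversion above can be verified numerically (this is not in the original solution; it just integrates (1/2π)∫₋₁¹ π e^(iωτ) dω by the trapezoidal rule and compares with sin τ / τ):

```python
import cmath, math

def R(tau, steps=20000):
    """Trapezoidal evaluation of (1/2*pi) * integral_{-1}^{1} pi*e^{i*w*tau} dw."""
    a, b = -1.0, 1.0
    h = (b - a) / steps
    total = 0.5 * (cmath.exp(1j * a * tau) + cmath.exp(1j * b * tau))
    for k in range(1, steps):
        total += cmath.exp(1j * (a + k * h) * tau)
    return (math.pi * h * total / (2 * math.pi)).real

for tau in (0.5, 1.0, 2.0):
    assert abs(R(tau) - math.sin(tau) / tau) < 1e-6
print("matches sin(tau)/tau")
```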
PART B – (5 X 16 = 80)
11. (a) (i) A continuous random variable X that can assume any value between 𝑋𝑋 =
2 𝑎𝑎𝑎𝑎𝑎𝑎 𝑋𝑋 = 5 has a probability density function given by 𝑓𝑓(𝑥𝑥) = 𝑘𝑘(1 + 𝑥𝑥). Find
𝑃𝑃(𝑋𝑋 < 4).
Solution: Since ∫_{−∞}^{∞} f(x) dx = 1,
∫₂⁵ k(1 + x) dx = 1
k [(1 + x)²/2]₂⁵ = 1
k (36 − 9)/2 = 1 ⇒ 27k/2 = 1 ⇒ k = 2/27
Then P(X < 4) = ∫₂⁴ (2/27)(1 + x) dx = (1/27)[(1 + x)²]₂⁴ = (25 − 9)/27 = 16/27
(ii) The probability that an applicant for a driver's licence will pass the road test on any given trial is 0.8. What is the probability that he will finally pass the test on the 4th trial? Also find the probability that he will finally pass the test in fewer than 4 trials.
Solution: Let X denote the number of trials required to achieve the first success. Hence
X follows geometric distribution.
𝑃𝑃(𝑋𝑋 = 𝑥𝑥) = 𝑞𝑞 𝑥𝑥−1 𝑝𝑝, 𝑥𝑥 = 1,2,3, …
Given 𝑝𝑝 = 0.8, 𝑞𝑞 = 0.2
(i) 𝑃𝑃(𝑋𝑋 = 4) = (0.2)4−1 (0.8) = 0.8(0.008) = 0.0064
(ii) 𝑃𝑃(𝑋𝑋 < 4) = 𝑃𝑃(𝑋𝑋 = 1) + 𝑃𝑃(𝑋𝑋 = 2) + 𝑃𝑃(𝑋𝑋 = 3)
= (0.2)1−1 (0.8) + (0.2)2−1 (0.8) + (0.2)3−1 (0.8)
= 0.8 (1 + 0.2 + (0.2)2 ) = 0.992
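As a quick check (not part of the printed solution), the geometric probabilities can be recomputed directly:

```python
p, q = 0.8, 0.2

def geom_pmf(x):
    """P(X = x): first success occurs on trial x."""
    return q**(x - 1) * p

p_pass_on_4th = geom_pmf(4)
p_pass_before_4th = sum(geom_pmf(x) for x in (1, 2, 3))

print(round(p_pass_on_4th, 4))      # 0.0064
print(round(p_pass_before_4th, 3))  # 0.992
```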
(OR)
(b) (i) Find the moment generating function of exponential distribution and hence find the
mean and variance of exponential distribution.
Solution: M_X(t) = E[e^(tX)] = ∫_{−∞}^{∞} e^(tx) f(x) dx
= ∫₀^∞ e^(tx) λe^(−λx) dx
= λ ∫₀^∞ e^(−(λ−t)x) dx
= λ [e^(−(λ−t)x)/(−(λ−t))]₀^∞ = λ [0 − 1/(−(λ−t))] = λ/(λ − t), for t < λ
(d/dt)[M_X(t)] = (d/dt)[λ/(λ − t)] = λ (d/dt)(λ − t)^(−1) = λ(−1)(λ − t)^(−2)(−1) = λ/(λ − t)²
∴ μ₁′ = (d/dt)[M_X(t)] at t = 0 = λ/λ² = 1/λ
(d²/dt²)[M_X(t)] = (d/dt)[λ/(λ − t)²] = λ(−2)(λ − t)^(−3)(−1) = 2λ/(λ − t)³
∴ μ₂′ = (d²/dt²)[M_X(t)] at t = 0 = 2λ/λ³ = 2/λ²
Mean = μ₁′ = 1/λ
Variance = μ₂′ − (μ₁′)² = 2/λ² − (1/λ)² = 1/λ²
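The moment computation can be cross-checked numerically by differentiating the MGF at t = 0 with central differences (a sketch, not part of the original solution; λ = 2 is an arbitrary choice for the check):

```python
lam = 2.0                      # arbitrary rate for the check (assumption)
M = lambda t: lam / (lam - t)  # MGF derived above, valid for t < lam

h = 1e-5
mu1 = (M(h) - M(-h)) / (2 * h)            # central-difference M'(0)
mu2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # central-difference M''(0)

mean = mu1
var = mu2 - mu1**2
print(round(mean, 4), round(var, 4))      # 0.5 0.25, i.e. 1/lam and 1/lam**2
```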
(ii) If the probability mass function of the random variable X is given by 𝑃𝑃[𝑋𝑋 = 𝑥𝑥] =
1 5
𝑘𝑘𝑥𝑥 3 , 𝑥𝑥 = 1,2,3,4. Find the value of 𝑘𝑘, 𝑃𝑃 �� < 𝑋𝑋 < �⁄𝑋𝑋 > 1�, mean and variance of
2 2
X.
Solution:
x      1    2    3    4
P(x)   k    8k   27k  64k
Since Σ p(x) = 1: k + 8k + 27k + 64k = 100k = 1 ⇒ k = 1/100
P[(1/2 < X < 5/2) | X > 1] = P[(1/2 < X < 5/2) ∩ (X > 1)] / P(X > 1)
The only value of X in (1/2, 5/2) exceeding 1 is X = 2, so
= P(X = 2) / [1 − P(X = 1)] = (8/100) / (1 − 1/100) = 8/99
E(X) = Σ x p(x) = 1 × (1/100) + 2 × (8/100) + 3 × (27/100) + 4 × (64/100)
= (1 + 16 + 81 + 256)/100 = 354/100 = 3.54
E(X²) = Σ x² p(x) = 1 × (1/100) + 4 × (8/100) + 9 × (27/100) + 16 × (64/100)
= (1 + 32 + 243 + 1024)/100 = 1300/100 = 13
Var(X) = E(X²) − [E(X)]² = 13 − (3.54)² = 13 − 12.53 = 0.47
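Exact rational arithmetic is useful for catching slips in hand computation with this pmf; the sketch below (not part of the original solution) recomputes every quantity with `fractions.Fraction` and gives 8/99 for the conditional probability and 1171/2500 ≈ 0.468 for the variance:

```python
from fractions import Fraction as F

xs = [1, 2, 3, 4]
weights = {x: x**3 for x in xs}          # p(x) proportional to x^3
k = F(1, sum(weights.values()))          # normalising constant: 1/100
p = {x: k * w for x, w in weights.items()}

cond = p[2] / (1 - p[1])                 # P(1/2 < X < 5/2 | X > 1): only x = 2 survives
mean = sum(x * p[x] for x in xs)
ex2 = sum(x**2 * p[x] for x in xs)
var = ex2 - mean**2

print(k, cond)          # 1/100 8/99
print(mean, ex2, var)   # 177/50 13 1171/2500
```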
12. (a) (i) If the joint probability distribution function of a two dimensional random variable
(1 − 𝑒𝑒 − 𝑥𝑥 )(1 − 𝑒𝑒 − 𝑦𝑦 )
(X,Y) is given by 𝐹𝐹(𝑥𝑥, 𝑦𝑦) = � ; 𝑥𝑥 > 0, 𝑦𝑦 > 0. Find the marginal
0 ; 𝑜𝑜𝑜𝑜ℎ𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒
densities of X and Y. Are X and Y independent? Find 𝑃𝑃[1 < 𝑋𝑋 < 3, 1 < 𝑌𝑌 < 2].
Solution: Given the cumulative distribution function
F(x, y) = (1 − e^(−x))(1 − e^(−y)), x > 0, y > 0
The joint pdf is f(x, y) = ∂²F/∂x∂y = ∂/∂x[(1 − e^(−x)) e^(−y)] = e^(−x) e^(−y)
The marginal density of X is f_X(x) = ∫₀^∞ e^(−x) e^(−y) dy = e^(−x) [−e^(−y)]₀^∞ = e^(−x), x > 0
The marginal density of Y is f_Y(y) = ∫₀^∞ e^(−x) e^(−y) dx = e^(−y) [−e^(−x)]₀^∞ = e^(−y), y > 0
Now f_X(x) f_Y(y) = e^(−x) e^(−y) = e^(−(x+y)) = f(x, y)
∴ X and Y are independent.
P[1 < X < 3, 1 < Y < 2] = P(1 < X < 3) P(1 < Y < 2), since X and Y are independent
= ∫₁³ e^(−x) dx · ∫₁² e^(−y) dy
= [−e^(−x)]₁³ [−e^(−y)]₁²
= (e^(−1) − e^(−3))(e^(−1) − e^(−2))
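The closed form can be confirmed by integrating the joint density numerically over the rectangle; this check is not part of the original solution:

```python
import math

# Midpoint-rule double integral of f(x, y) = e^{-(x+y)} over (1,3) x (1,2).
n = 400
hx, hy = 2 / n, 1 / n
total = 0.0
for i in range(n):
    x = 1 + (i + 0.5) * hx
    for j in range(n):
        y = 1 + (j + 0.5) * hy
        total += math.exp(-(x + y)) * hx * hy

closed_form = (math.exp(-1) - math.exp(-3)) * (math.exp(-1) - math.exp(-2))
assert abs(total - closed_form) < 1e-6
print(round(closed_form, 4))   # 0.074
```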
(ii) Find the coefficient of correlation between X and Y from the data given below.
X: 65 66 67 67 68 69 70 72
Y: 67 68 65 68 72 72 69 71
Solution:
x     y     x²     y²     xy
65    67    4225   4489   4355
66    68    4356   4624   4488
67    65    4489   4225   4355
67    68    4489   4624   4556
68    72    4624   5184   4896
69    72    4761   5184   4968
70    69    4900   4761   4830
72    71    5184   5041   5112
ΣX = 544, ΣY = 552, ΣX² = 37028, ΣY² = 38132, ΣXY = 37560
X̄ = 544/8 = 68, Ȳ = 552/8 = 69
σₓ = √[(1/n)ΣX² − X̄²] = √[(1/8) × 37028 − 68²] = 2.12132
σ_y = √[(1/n)ΣY² − Ȳ²] = √[(1/8) × 38132 − 69²] = 2.345208
Cov(X, Y) = ΣXY/n − X̄Ȳ = 37560/8 − 68 × 69 = 4695 − 4692 = 3
Coefficient of correlation:
r_XY = Cov(X, Y)/(σₓσ_y) = 3/(2.12132 × 2.345208) = 0.603
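The tabulated sums are easy to mistype, so here is a direct recomputation of the correlation coefficient from the raw data (not part of the original solution):

```python
import math

X = [65, 66, 67, 67, 68, 69, 70, 72]
Y = [67, 68, 65, 68, 72, 72, 69, 71]
n = len(X)

mx, my = sum(X) / n, sum(Y) / n
sx = math.sqrt(sum(x * x for x in X) / n - mx**2)
sy = math.sqrt(sum(y * y for y in Y) / n - my**2)
cov = sum(x * y for x, y in zip(X, Y)) / n - mx * my

r = cov / (sx * sy)
print(round(cov, 1), round(r, 3))   # 3.0 0.603
```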
(OR)
(b) (i) The two lines of regression are 8𝑋𝑋 − 10𝑌𝑌 + 66 = 0, 40𝑋𝑋 − 18𝑌𝑌 − 214 = 0.
The variance of X is 9. Find the mean values of X and Y. Also find the coefficient of
correlation between the variables X and Y.
Solution: Since both the regression lines pass through the point (𝑥𝑥̅ , 𝑦𝑦�)
8𝑥𝑥̅ − 10𝑦𝑦� + 66 = 0 … … … … … … (1)
40𝑥𝑥̅ − 18𝑦𝑦� − 214 = 0 … … … … … … (2)
Solving (1) and (2) we get
𝑥𝑥̅ = 13, 𝑦𝑦� = 17
Given the variance of X = Var(x) = 9 ⟹ 𝜎𝜎𝑥𝑥 = 3
The equations of regression lines can be written as
𝑦𝑦 = 0.8𝑥𝑥 + 6.6, 𝑥𝑥 = 0.45𝑦𝑦 + 5.35
Hence the regression coefficient of Y on X is
b_yx = rσ_y/σₓ = 0.8 … (3)
and the regression coefficient of X on Y is
b_xy = rσₓ/σ_y = 0.45 … (4)
Multiplying (3) and (4): r² = 0.8 × 0.45 = 0.36 ⇒ r = 0.6
From (3), σ_y = 0.8σₓ/r = (0.8 × 3)/0.6 = 4
σₓ = 3, σ_y = 4
The correlation coefficient is r = 0.6.
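The steps above can be sketched numerically; this is a check, not part of the printed solution (the elimination comments mirror the hand computation):

```python
# Lines of regression: 8x - 10y + 66 = 0 and 40x - 18y - 214 = 0.
# Both pass through (x_bar, y_bar); eliminate x:
# 5*(1): 40x - 50y + 330 = 0; subtract (2): -32y + 544 = 0.
y_bar = 544 / 32                 # 17.0
x_bar = (10 * y_bar - 66) / 8    # 13.0

byx = 8 / 10                     # from y = 0.8x + 6.6
bxy = 18 / 40                    # from x = 0.45y + 5.35
r = (byx * bxy) ** 0.5           # sqrt(0.36) = 0.6
sigma_x = 3                      # variance of X is 9
sigma_y = byx * sigma_x / r      # 4.0

print(x_bar, y_bar, round(r, 2), round(sigma_y, 2))   # 13.0 17.0 0.6 4.0
```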
(ii) Two random variables X and Y have the following joint probability density function.
𝑥𝑥 + 𝑦𝑦 ; 0 ≤ 𝑥𝑥 ≤ 1, 0 ≤ 𝑦𝑦 ≤ 1
𝑓𝑓(𝑥𝑥, 𝑦𝑦) = � Find the probability density function of the
0 ; 𝑜𝑜𝑜𝑜ℎ𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒
random variable 𝑈𝑈 = 𝑋𝑋𝑋𝑋.
Solution: Let u = xy, v = y. Hence x = u/v and y = v.
J = | ∂x/∂u  ∂x/∂v |  =  | 1/v   −u/v² |  = 1/v
    | ∂y/∂u  ∂y/∂v |     | 0     1     |
13. (a) (i) Show that the process 𝑋𝑋(𝑡𝑡) = 𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴 + 𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵 where A and B are random
variables, is wide sense stationary process if 𝐸𝐸(𝐴𝐴) = 𝐸𝐸(𝐵𝐵) = 𝐸𝐸(𝐴𝐴𝐴𝐴) = 0, 𝐸𝐸(𝐴𝐴2 ) =
𝐸𝐸(𝐵𝐵2 ).
Solution: Given 𝑋𝑋(𝑡𝑡) = 𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴𝐴 + 𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵
And the random variables A and B satisfy
𝐸𝐸(𝐴𝐴) = 𝐸𝐸(𝐵𝐵) = 0, 𝐸𝐸(𝐴𝐴𝐴𝐴) = 0, 𝐸𝐸(𝐴𝐴2 ) = 𝐸𝐸(𝐵𝐵2 ) = 𝑘𝑘
We have to prove
(𝑖𝑖)𝐸𝐸[𝑋𝑋(𝑡𝑡)] = 0
(𝑖𝑖𝑖𝑖)𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏) 𝑖𝑖𝑖𝑖 𝑎𝑎 𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓𝑓 𝑜𝑜𝑜𝑜 𝜏𝜏
(i) E[X(t)] = E[A cos ωt + B sin ωt] = E[A] cos ωt + E[B] sin ωt = 0 + 0 = 0, a constant.
(ii) R_XX(t, t + τ) = E[X(t)X(t + τ)]
= E[(A cos ωt + B sin ωt)(A cos ω(t + τ) + B sin ω(t + τ))]
= E[A²] cos ωt cos ω(t + τ) + E[AB] cos ωt sin ω(t + τ) + E[AB] sin ωt cos ω(t + τ) + E[B²] sin ωt sin ω(t + τ)
= k cos ωt cos ω(t + τ) + 0 + 0 + k sin ωt sin ω(t + τ)
= k [cos ωt cos ω(t + τ) + sin ωt sin ω(t + τ)]
= k cos(ωt − ω(t + τ)) = k cos(−ωτ) = k cos ωτ, which depends only on τ.
Hence by (i) & (ii), X(t) is a WSS process.
(ii) There are 2 white marbles in Urn A and 3 red marbles in Urn B. At each step of
the process, a marble is selected from each urn and the 2 marbles selected are
interchanged. The state of the related Markov chain is the number of red marbles
in Urn A after the interchange. What is the probability that there are 2 red marbles
in Urn A after 3 steps? In the long run, what the probability that there are 2 red
marbles in Urn A?
Ans: The Markov chain {Xₙ} has state space {0, 1, 2}, since the number of marbles in Urn A is always 2 and the number of red marbles among them may be 0, 1 or 2.

          0     1     2
P =  0 [ 0     1     0   ]
     1 [ 1/6   1/2   1/3 ]
     2 [ 0     2/3   1/3 ]

The initial distribution is P^(0) = [1  0  0], since there is no red marble in Urn A at the start, and so the probability of 0 red marbles is 1.
P^(1) = P^(0) P = [0  1  0]
P^(2) = P^(1) P = [1/6  1/2  1/3]
P^(3) = P^(2) P = [1/12  23/36  5/18]
P(2 red marbles in Urn A after 3 steps) = 5/18
For the long run, ΠP = Π with π₀ + π₁ + π₂ = 1 … (1)
[π₀  π₁  π₂] P = [π₀  π₁  π₂]
⇒ [π₁/6,  π₀ + π₁/2 + 2π₂/3,  π₁/3 + π₂/3] = [π₀  π₁  π₂]
⇒ π₁/6 = π₀ and π₁/3 + π₂/3 = π₂ ⇒ π₂ = π₁/2
Substituting in (1): π₁/6 + π₁ + π₁/2 = 1 ⇒ (10/6)π₁ = 1 ⇒ π₁ = 6/10
π₀ = (1/6)(6/10) = 1/10;  π₂ = (1/2)(6/10) = 3/10
Π = [1/10  6/10  3/10]
∴ P(2 red marbles in Urn A in the long run) = 3/10
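Both the three-step distribution and the stationary distribution can be reproduced exactly with `fractions.Fraction` (not part of the original solution; the long-run vector is obtained here by iterating the chain rather than solving the linear system):

```python
from fractions import Fraction as F

# Transition matrix over 0, 1, 2 red marbles in Urn A.
P = [[F(0), F(1), F(0)],
     [F(1, 6), F(1, 2), F(1, 3)],
     [F(0), F(2, 3), F(1, 3)]]

def step(dist):
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

dist = [F(1), F(0), F(0)]
for _ in range(3):
    dist = step(dist)
print(dist)   # [Fraction(1, 12), Fraction(23, 36), Fraction(5, 18)]

# Long iteration converges to the stationary distribution (the chain is regular).
pi = [F(1), F(0), F(0)]
for _ in range(60):
    pi = step(pi)
print([f"{float(x):.4f}" for x in pi])   # ['0.1000', '0.6000', '0.3000']
```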
(OR)
(ii) Find the power spectral density of a random binary transmission process whose autocorrelation function is R(τ) = 1 − |τ|/T, |τ| ≤ T.
Solution: Given R(τ) = 1 − |τ|/T, |τ| ≤ T
We know that
S_XX(ω) = ∫_{−∞}^{∞} R(τ) e^(−iωτ) dτ
= ∫_{−T}^{T} (1 − |τ|/T) e^(−iωτ) dτ
= ∫_{−T}^{T} (1 − |τ|/T)(cos ωτ − i sin ωτ) dτ
= 2 ∫₀^T (1 − τ/T) cos ωτ dτ  (the sine term is odd and integrates to zero)
= 2 [(1 − τ/T)(sin ωτ/ω) − (−1/T)(−cos ωτ/ω²)]₀^T
= 2 [(0 − cos ωT/(Tω²)) − (0 − 1/(Tω²))]
= (2/(Tω²))(1 − cos ωT)
= (4/(Tω²)) sin²(ωT/2)
(OR)
(b)(i) If the power spectral density of a continuous process is S_XX(ω) = (ω² + 9)/(ω⁴ + 5ω² + 4), find the mean square value of the process.
S_XX(ω) = (ω² + 9)/((ω² + 1)(ω² + 4))
R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^(iωτ) dω
The mean square value is given by
E[X²(t)] = R_XX(0) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω
= (2/2π) ∫₀^∞ S_XX(ω) dω  (∵ S_XX(ω) is even)
= (1/π) ∫₀^∞ (ω² + 9)/((ω² + 1)(ω² + 4)) dω
Putting u = ω², write (u + 9)/((u + 1)(u + 4)) = A/(u + 1) + B/(u + 4), i.e. u + 9 = A(u + 4) + B(u + 1)
Solving for A and B, we get A = 8/3, B = −5/3
∴ (ω² + 9)/((ω² + 1)(ω² + 4)) = (8/3)/(ω² + 1) − (5/3)/(ω² + 4)
E[X²(t)] = (1/π)[(8/3) ∫₀^∞ dω/(ω² + 1) − (5/3) ∫₀^∞ dω/(ω² + 4)]
= (1/π)[(8/3)[tan⁻¹ω]₀^∞ − (5/3)(1/2)[tan⁻¹(ω/2)]₀^∞]
= (1/π)[(8/3)(π/2) − (5/6)(π/2)] = (1/2)(8/3 − 5/6) = 11/12
μₓ² = lim_{τ→∞} R_XX(τ) = lim_{τ→∞} (25τ² + 36)/(6.25τ² + 4) = lim_{τ→∞} (25 + 36/τ²)/(6.25 + 4/τ²) = 25/6.25 = 4
⇒ μₓ = √4 = 2
Mean square value = E[X²(t)] = R_XX(0) = (0 + 36)/(0 + 4) = 9
Var[X(t)] = E[X²(t)] − [E(X(t))]² = 9 − 2² = 5
15. (a)(i) If the input to a time invariant stable line system is a wide sense stationary
process. Prove that the output will also be a wide sense stationary process.
Solution: Let X(t) be a WSS process for a linear time invariant stable system with Y(t) as the
output process.
∞
Y (t ) = ∫ h(u ) X (t − u )du
−∞
Then where h(t ) is weighting function or unit impulse response.
∞
∴ E [Y (t )] = ∫ E[h(u ) X (t − u )]du
−∞
∞
= ∫ h(u ) E[ X (t − u )]du
−∞
∴ E[ X (t − u )] = µ X
∞ ∞
∴ E [Y (t )] = ∫ h(u )µ X du = µ X ∫ h(u )du
−∞ −∞
∫ h(u )du
−∞
Since the system is stable , is finite
∴ E [Y (t )] is a constant.
Now R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
= E[∫_{−∞}^{∞} h(u₁)X(t − u₁) du₁ ∫_{−∞}^{∞} h(u₂)X(t + τ − u₂) du₂]
= E[∫∫ h(u₁)h(u₂)X(t − u₁)X(t + τ − u₂) du₁ du₂]
= ∫∫ h(u₁)h(u₂) E[X(t − u₁)X(t + τ − u₂)] du₁ du₂
Since X(t) is a WSS process, the autocorrelation function is only a function of the time difference:
∴ R_YY(t, t + τ) = ∫∫ h(u₁)h(u₂) R_XX(τ + u₁ − u₂) du₁ du₂,
which is a function of τ alone. Hence Y(t) is a WSS process.
Y(t) = ∫_{−∞}^{∞} X(t − α)h(α) dα
∴ X(t + τ)Y(t) = ∫_{−∞}^{∞} X(t + τ)X(t − α)h(α) dα
Taking expectations, E[X(t + τ)Y(t)] = ∫_{−∞}^{∞} h(α) R_XX(τ + α) dα
Putting α = −β: = ∫_{−∞}^{∞} R_XX(τ − β) h(−β) dβ
i.e., R_XY(τ) = R_XX(τ) ∗ h(−τ)  (1)
Similarly, R_YX(τ) = R_XX(τ) ∗ h(τ)  (1a)
Y(t)Y(t − τ) = ∫_{−∞}^{∞} X(t − α)Y(t − τ)h(α) dα
E{Y(t)Y(t − τ)} = ∫_{−∞}^{∞} R_XY(α − τ) h(α) dα
Find S_YY(ω) in terms of S_XX(ω).
Solution: H(ω) = F[h(t)] = ∫_{−∞}^{∞} h(t) e^(−iωt) dt
= ∫₀^T (1/T) e^(−iωt) dt
= (1/T)[e^(−iωt)/(−iω)]₀^T
= (1/T)[e^(−iωT)/(−iω) − 1/(−iω)]
= −(1/iωT)(e^(−iωT) − 1)
= (1/iωT)(1 − e^(−iωT))
= (1/iωT)[1 − (cos ωT − i sin ωT)]
(ii) Given 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏) = 𝐴𝐴𝑒𝑒 − 𝑎𝑎|𝜏𝜏| and ℎ(𝑡𝑡) = 𝑒𝑒 − 𝛽𝛽𝛽𝛽 𝑢𝑢(𝑡𝑡) where 𝑢𝑢(𝑡𝑡) = {1; 𝑡𝑡 ≥ 0 . Find the
power spectral density of the output y(t).
∞
Solution: 𝐻𝐻(𝜔𝜔) = 𝐹𝐹[ℎ(𝑡𝑡)] = ∫−∞ ℎ(𝑡𝑡) 𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
∞
= � 𝑒𝑒 − 𝛽𝛽𝛽𝛽 𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
0
∞
= � 𝑒𝑒 − (𝛽𝛽 +𝑖𝑖𝑖𝑖 )𝑡𝑡 𝑑𝑑𝑑𝑑
0
∞
𝑒𝑒 − (𝛽𝛽 +𝑖𝑖𝑖𝑖 )𝑡𝑡
=� �
− (𝛽𝛽 + 𝑖𝑖𝑖𝑖) 0
1
=
(𝛽𝛽 + 𝑖𝑖𝑖𝑖)
S_XX(ω) = F[R_XX(τ)] = F[A e^(−a|τ|)] = 2Aa/(a² + ω²)
∴ S_YY(ω) = |H(ω)|² S_XX(ω) = [1/(β² + ω²)] · [2Aa/(a² + ω²)] = [2Aa/(β² − a²)][1/(a² + ω²) − 1/(β² + ω²)]
Since R_YY(τ) = F⁻¹[S_YY(ω)], the inverse Fourier transform of S_YY(ω) gives
R_YY(τ) = [aA/(β(a² − β²))] F⁻¹[2β/(β² + ω²)] − [A/(a² − β²)] F⁻¹[2a/(a² + ω²)]
= [aA/(β(a² − β²))] e^(−β|τ|) − [A/(a² − β²)] e^(−a|τ|),  since F[e^(−a|τ|)] = 2a/(a² + ω²)
Ans:
𝑃𝑃(𝑋𝑋 + 𝑌𝑌 ≤ 1) = � 𝑓𝑓(𝑥𝑥, 𝑦𝑦) 𝑑𝑑𝑑𝑑𝑑𝑑𝑑𝑑
𝑅𝑅
= ∫₀¹ ∫₀^(1−y) (1/4) dx dy
= (1/4) ∫₀¹ [x]₀^(1−y) dy
= (1/4) ∫₀¹ (1 − y) dy
= (1/4) [y − y²/2]₀¹
= (1/4)(1 − 1/2)
= (1/4)(1/2)
= 1/8
= 𝐸𝐸[𝑌𝑌(𝑡𝑡 + 𝜏𝜏)𝑋𝑋(𝑡𝑡)]
= 𝐸𝐸[𝑌𝑌(𝑢𝑢)𝑋𝑋(𝑢𝑢 − 𝜏𝜏)] Put 𝑢𝑢 = 𝑡𝑡 + 𝜏𝜏 ⟹ 𝑡𝑡 = 𝑢𝑢 − 𝜏𝜏
= 𝑅𝑅𝑌𝑌𝑌𝑌 (−𝜏𝜏)
∴ 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏) = 𝑅𝑅𝑌𝑌𝑌𝑌 (−𝜏𝜏)
= 𝑞𝑞 𝑘𝑘 𝑝𝑝[1 + 𝑞𝑞 + 𝑞𝑞 2 + ⋯ ]
= 𝑞𝑞 𝑘𝑘 𝑝𝑝(1 − 𝑞𝑞)−1
P[X > k] = q^k p/(1 − q) = q^k p/p = q^k
Hence 𝑃𝑃[𝑋𝑋 > 𝑠𝑠 + 𝑡𝑡] = 𝑞𝑞 𝑠𝑠+𝑡𝑡
(ii) In a certain city, the daily consumption of electric power in millions of Kilowatt-
hours can be considered as a random variable following gamma distribution with
1
parameters 𝜆𝜆 = , 𝛼𝛼 = 3. If the power plant in this city has a daily capacity of 12
2
million Kilowatt-hours, what is the probability that this supply of power will be
insufficient on any given day?
Ans: Let X denote the daily consumption of power in millions of kilowatt-hours.
1
Given X follows gamma distribution with parameter 𝜆𝜆 = , 𝛼𝛼 = 3.
2
The pdf of X is given by
f(x) = λ e^(−λx) (λx)^(α−1) / Γ(α),  x ≥ 0
= (1/2) e^(−x/2) (x/2)² / Γ(3) = (1/16) x² e^(−x/2)
Probability for insufficient supply = P(X > 12)
= ∫₁₂^∞ f(x) dx = (1/16) ∫₁₂^∞ x² e^(−x/2) dx
= (1/16) [e^(−x/2)(−2x² − 8x − 16)]₁₂^∞  (integrating by parts)
= (1/16) [0 − e^(−6)(−288 − 96 − 16)]
= (e^(−6)/16)(288 + 96 + 16)
= 25 e^(−6)
= 0.0620
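Because α = 3 is an integer, this gamma tail equals a Poisson probability, P(X > x) = P(Poisson(λx) < α), which gives an independent check of 25e^(−6); the numeric integration below is a second cross-check (neither is part of the original solution):

```python
import math

lam, alpha, cap = 0.5, 3, 12

# For integer alpha, P(X > x) for Gamma(alpha, rate=lam) = P(Poisson(lam*x) < alpha).
m = lam * cap   # = 6
tail = sum(math.exp(-m) * m**j / math.factorial(j) for j in range(alpha))

# Cross-check by midpoint integration of the density x^2 e^{-x/2} / 16 over (12, 200).
n, upper = 200000, 200.0
h = (upper - cap) / n
num = sum((cap + (k + 0.5) * h) ** 2 * math.exp(-(cap + (k + 0.5) * h) / 2) / 16 * h
          for k in range(n))

assert abs(tail - num) < 1e-6
print(round(tail, 4))   # 0.062, i.e. 25*e^{-6}
```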
(OR)
(b) (i) A coin is biased so that a head is twice as likely to appear as a tail. If the coin
tossed 6 times, find the probabilities of getting (1) exactly 2 heads (2) at least 3
heads (3) at most 4 heads
12. (a) (i) Let (X, Y) be a two dimensional non-negative continuous random variable having the joint density f(x, y) = { 4xy e^(−(x²+y²)), x ≥ 0, y ≥ 0; 0, otherwise }. Find the density function of U = √(X² + Y²).
C ∫₀² [(x³ − x³/2) − (−x³ − x³/2)] dx = 1
C ∫₀² 2x³ dx = 1 ⇒ C [2x⁴/4]₀² = 8C = 1 ⇒ C = 1/8
f_{X,Y}(x, y) = { (1/8) x(x − y), 0 < x < 2, −x < y < x; 0, otherwise }
(2) f_X(x) = ∫_{−x}^{x} (1/8) x(x − y) dy = (1/8) ∫_{−x}^{x} (x² − xy) dy
= (1/8) [x²y − xy²/2]_{−x}^{x} = (1/8) [(x³ − x³/2) − (−x³ − x³/2)] = (1/4) x³,  0 < x < 2.
(3) f_{Y|X}(y|x) = f(x, y)/f_X(x) = [(1/8) x(x − y)] / [(1/4) x³] = (x − y)/(2x²),  −x < y < x.
4
(4) f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
= { ∫_{−y}^{2} (1/8) x(x − y) dx, if −2 ≤ y ≤ 0;  ∫_{y}^{2} (1/8) x(x − y) dx, if 0 ≤ y ≤ 2 }
= { (1/8)[x³/3 − x²y/2]_{−y}^{2}, if −2 ≤ y ≤ 0;  (1/8)[x³/3 − x²y/2]_{y}^{2}, if 0 ≤ y ≤ 2 }
= { (1/8)[8/3 − 2y − (−y³/3 − y³/2)], if −2 ≤ y ≤ 0;  (1/8)[8/3 − 2y − (y³/3 − y³/2)], if 0 ≤ y ≤ 2 }
∴ f_Y(y) = { 1/3 − y/4 + 5y³/48, if −2 ≤ y ≤ 0;  1/3 − y/4 + y³/48, if 0 ≤ y ≤ 2 }
(OR)
(b) (i) Three balls are drawn at random without replacement from a box containing 2
white, 3 red and 4 black balls. If X denotes the number of white balls drawn and
Y denotes the number of red balls drawn, find the joint probability distribution of
(X,Y).
Ans: Given that X denotes the number of white balls drawn and Y denotes the number of red balls drawn out of the 3 balls drawn. Then
P(X = x, Y = y) = 2Cₓ · 3C_y · 4C₍₃₋ₓ₋y₎ / 9C₃,  0 ≤ x ≤ 2, 0 ≤ y ≤ 3, x + y ≤ 3, with 9C₃ = 84.
(ii) In a partially destroyed laboratory record, only the lines of regression and the variance of X are available. The regression equations are 8X − 10Y + 66 = 0 and 40X − 18Y = 214, and the variance of X = 9. Find (1) the correlation coefficient between X and Y (2) the mean values of X and Y (3) the variance of Y.
Ans: Since both the regression lines pass through the point (𝑥𝑥̅ , 𝑦𝑦�)
8𝑥𝑥̅ − 10𝑦𝑦� + 66 = 0 … … … … … … (1)
40𝑥𝑥̅ − 18𝑦𝑦� − 214 = 0 … … … … … … (2)
Solving (1) and (2) we get
𝑥𝑥̅ = 13, 𝑦𝑦� = 17
Given the variance of X = Var(x) = 9 ⟹ 𝜎𝜎𝑥𝑥 = 3
The equations of regression lines can be written as
𝑦𝑦 = 0.8𝑥𝑥 + 6.6, 𝑥𝑥 = 0.45𝑦𝑦 + 5.35
Hence the regression coefficient of Y on X is
b_yx = rσ_y/σₓ = 0.8 … (3)
The regression coefficient of X on Y is
b_xy = rσₓ/σ_y = 0.45 … (4)
Multiplying (3) and (4):
r² = 0.8 × 0.45 = 0.36
r = √(b_yx × b_xy) ⇒ r = 0.6
Now σ_y = 0.8σₓ/r = (0.8 × 3)/0.6 = 4
𝜎𝜎𝑥𝑥 = 3 , 𝜎𝜎𝑦𝑦 = 4
The correlation coefficient is 𝑟𝑟 = 0.6.
13. (a) (i) Show that the random process 𝑋𝑋(𝑡𝑡) = 𝐴𝐴 cos(𝜔𝜔𝜔𝜔 + 𝜃𝜃) is a wide sense stationary,
where A and 𝜔𝜔 are constants and 𝜃𝜃 is uniformly distributed on the interval
(0,2𝜋𝜋)
Ans: The travel pattern is a Markov chain with state space = (train, car).
The TPM of the chain is
         T     C
P =  T [ 0     1   ]
     C [ 1/2   1/2 ]
Since P(travelling by car) = P(getting a 6 in the toss of the die) = 1/6
and P(travelling by train) = 5/6,
the initial state probability distribution is P^(1) = [5/6  1/6]
P^(2) = P^(1) P = [5/6  1/6] P = [1/12  11/12]
P^(3) = P^(2) P = [1/12  11/12] P = [11/24  13/24]
P(the man travels by train on the third day) = 11/24
The long run probability is limiting probability
And ΠP = Π ⇒ [π₀  π₁] [ 0  1 ; 1/2  1/2 ] = [π₀  π₁], with π₀ + π₁ = 1 --------(1)
π₁/2 = π₀ ---------(2)
π₀ + π₁/2 = π₁ -----------(3)
Equations (2) and (3) are one and the same. Solving (1) and (2):
π₀ = 1/3
π₁ = 2/3
P(the man travels by car in the long run) = 2/3
(OR)
(b) (i) Prove that the difference of two independent Poisson processes is not a Poisson
process.
Ans: Let 𝑋𝑋1 (𝑡𝑡)𝑎𝑎𝑎𝑎𝑎𝑎 𝑋𝑋2 (𝑡𝑡) be two Poisson processes with parameter
𝜆𝜆1 𝑎𝑎𝑎𝑎𝑎𝑎 𝜆𝜆2
Let 𝑋𝑋(𝑡𝑡) = 𝑋𝑋1 (𝑡𝑡) − 𝑋𝑋2 (𝑡𝑡)
𝐸𝐸[𝑋𝑋(𝑡𝑡)] = 𝐸𝐸[ 𝑋𝑋1 (𝑡𝑡) − 𝑋𝑋2 (𝑡𝑡)]
= 𝜆𝜆1 𝑡𝑡 − 𝜆𝜆2 𝑡𝑡
= (𝜆𝜆1 − 𝜆𝜆2 )𝑡𝑡
Now
= 2 [(1 − τ)(sin ωτ/ω) − (−1)(−cos ωτ/ω²)]₀¹
= 2 [(0 − cos ω/ω²) − (0 − 1/ω²)]
= 2 [1/ω² − cos ω/ω²]
= (2/ω²)[1 − cos ω]
∴ S_XX(ω) = (4/ω²) sin²(ω/2)
(ii) The cross power spectrum of real processes X(t) and Y(t) is given by S_XY(ω) = { a + ibω, if |ω| < 1; 0, elsewhere }. Find the cross correlation function.
Ans: Given S_XY(ω) = { a + ibω, if |ω| < 1; 0, elsewhere }
∴ R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^(iωτ) dω
= (1/2π) ∫_{−1}^{1} (a + ibω) e^(iωτ) dω
= (1/2π) [a ∫_{−1}^{1} e^(iωτ) dω + ib ∫_{−1}^{1} ω e^(iωτ) dω]
= (1/2π) [a (e^(iωτ)/(iτ))_{−1}^{1} + ib (ω e^(iωτ)/(iτ) − e^(iωτ)/(iτ)²)_{−1}^{1}]
= (1/2π) [a (2 sin τ)/τ + ib · 2i(sin τ − τ cos τ)/τ²]
= (1/π) [a sin τ/τ − b(sin τ − τ cos τ)/τ²]
Put 𝑡𝑡 − 𝜏𝜏 = 𝑃𝑃 ⇒ 𝑡𝑡 = 𝑃𝑃 + 𝜏𝜏
(2) The mean square value of the random process may be obtained from
the auto correlation function, 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏) by putting 𝜏𝜏 = 0
= 𝐸𝐸[𝑋𝑋 2 (𝑡𝑡)]
S_XX(ω) = (ω² + 9)/((ω² + 1)(ω² + 4))
R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^(iωτ) dω
The mean square value is given by
E[X²(t)] = R_XX(0) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω
= (2/2π) ∫₀^∞ S_XX(ω) dω  (∵ S_XX(ω) is even)
= (1/π) ∫₀^∞ (ω² + 9)/((ω² + 1)(ω² + 4)) dω
Putting u = ω², write (u + 9)/((u + 1)(u + 4)) = A/(u + 1) + B/(u + 4), i.e. u + 9 = A(u + 4) + B(u + 1)
Solving for A and B, we get A = 8/3, B = −5/3
∴ (ω² + 9)/((ω² + 1)(ω² + 4)) = (8/3)/(ω² + 1) − (5/3)/(ω² + 4)
E[X²(t)] = (1/π)[(8/3) ∫₀^∞ dω/(ω² + 1) − (5/3) ∫₀^∞ dω/(ω² + 4)]
= (1/π)[(8/3)[tan⁻¹ω]₀^∞ − (5/3)(1/2)[tan⁻¹(ω/2)]₀^∞]
= (1/π)[(8/3)(tan⁻¹(∞) − tan⁻¹(0)) − (5/6)(tan⁻¹(∞) − tan⁻¹(0))]
= (1/π)[(8/3)(π/2) − (5/6)(π/2)] = (1/π)(π/2)(8/3 − 5/6)
= 11/12
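A numeric integration of the spectral density confirms the value 11/12 (this check is not part of the original solution; the integral is truncated at a large finite frequency, which costs a small tail error):

```python
import math

def S(w):
    return (w * w + 9) / ((w * w + 1) * (w * w + 4))

# E[X^2] = (1/pi) * integral_0^inf S(w) dw, evaluated by the midpoint rule on [0, W].
W, n = 2000.0, 400000
h = W / n
total = sum(S((k + 0.5) * h) for k in range(n)) * h
mean_square = total / math.pi

assert abs(mean_square - 11 / 12) < 1e-3
print(round(mean_square, 4))   # close to 11/12 = 0.9167 (truncation loses ~1.6e-4)
```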
15. (a) (i) If 𝑌𝑌(𝑡𝑡) = 𝐴𝐴 cos(𝜔𝜔0 𝑡𝑡 + 𝜃𝜃) + 𝑁𝑁(𝑡𝑡), where A is a constant, 𝜃𝜃 is a random variable
with uniform distribution in (−𝜋𝜋, 𝜋𝜋)and N(t) is a band-limited Gaussian white
𝑁𝑁0
𝑓𝑓𝑓𝑓𝑓𝑓 |𝜔𝜔 − 𝜔𝜔0 | < 𝜔𝜔𝐵𝐵
noise with a power spectral density 𝑆𝑆𝑁𝑁𝑁𝑁 (𝜔𝜔) = � 2 Fnd the
0, 𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒𝑒ℎ𝑒𝑒𝑒𝑒𝑒𝑒
power spectral density of Y(t). Assume that N(t) and 𝜃𝜃 are independent.
Ans: 𝑌𝑌(𝑡𝑡)𝑌𝑌(𝑡𝑡 + 𝜏𝜏) = [𝐴𝐴 cos(𝜔𝜔0 𝑡𝑡 + 𝜃𝜃) + 𝑁𝑁(𝑡𝑡)][𝐴𝐴 cos(𝜔𝜔0 (𝑡𝑡 + 𝜏𝜏) + 𝜃𝜃)
+ 𝑁𝑁(𝑡𝑡 + 𝜏𝜏)]
= A² cos(ω₀t + θ) cos(ω₀(t + τ) + θ) + N(t)N(t + τ)
+ A cos(ω₀t + θ)N(t + τ) + A cos(ω₀(t + τ) + θ)N(t)
𝑅𝑅𝑌𝑌𝑌𝑌 (𝑡𝑡, 𝑡𝑡 + 𝜏𝜏) = 𝐴𝐴2 E[cos(𝜔𝜔0 𝑡𝑡 + 𝜃𝜃) cos(𝜔𝜔0 (𝑡𝑡 + 𝜏𝜏) + 𝜃𝜃)]
+ 𝐸𝐸[𝑁𝑁(𝑡𝑡)𝑁𝑁(𝑡𝑡 + 𝜏𝜏)] + 𝐴𝐴 𝐸𝐸[cos(𝜔𝜔0 𝑡𝑡 + 𝜃𝜃)] 𝐸𝐸[𝑁𝑁(𝑡𝑡 + 𝜏𝜏)]
+ 𝐴𝐴 𝐸𝐸[cos(𝜔𝜔0 (𝑡𝑡 + 𝜏𝜏) + 𝜃𝜃)] 𝐸𝐸[𝑁𝑁(𝑡𝑡)]
Since 𝑁𝑁(𝑡𝑡) 𝑎𝑎𝑎𝑎𝑎𝑎 𝜃𝜃 are independent.
By hypothesis 𝐸𝐸[𝑁𝑁(𝑡𝑡)] = 0, 𝐸𝐸[𝑁𝑁(𝑡𝑡 + 𝜏𝜏) = 0
𝐴𝐴2
∴ 𝑅𝑅𝑌𝑌𝑌𝑌 (𝜏𝜏) = 𝑐𝑐𝑐𝑐𝑐𝑐𝜔𝜔0 𝜏𝜏 + 𝑅𝑅𝑁𝑁𝑁𝑁 (𝜏𝜏)
2
𝐴𝐴2 ∞
𝑆𝑆𝑌𝑌𝑌𝑌 (𝜔𝜔) = � 𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐𝑐 𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑 + 𝑆𝑆𝑁𝑁𝑁𝑁 (𝜔𝜔)
2 −∞
= (πA²/2)[δ(ω − ω₀) + δ(ω + ω₀)] + S_NN(ω)
(ii) A linear time invariant system has a impulse response ℎ(𝑡𝑡) = 𝑒𝑒 −𝛽𝛽𝛽𝛽 𝑈𝑈(𝑡𝑡). Find the
power spectral density of the output Y(t) corresponding to the input X(t).
∞
Ans:
𝐻𝐻(𝜔𝜔) = 𝐹𝐹[ℎ(𝑡𝑡)] = � ℎ(𝑡𝑡) 𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
−∞
∞
= � 𝑒𝑒 − 𝛽𝛽𝛽𝛽 𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
0
∞
= � 𝑒𝑒 − (𝛽𝛽 +𝑖𝑖𝑖𝑖 )𝑡𝑡 𝑑𝑑𝑑𝑑
0
∞
𝑒𝑒 − (𝛽𝛽 +𝑖𝑖𝑖𝑖 )𝑡𝑡
=� �
− (𝛽𝛽 + 𝑖𝑖𝑖𝑖) 0
1
=
(𝛽𝛽 + 𝑖𝑖𝑖𝑖)
|H(ω)|² = |1/(β + iω)|² = 1/(β² + ω²)
Power spectral density of X(t) is 𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) = 𝐹𝐹[𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏)]
∴ 𝑆𝑆𝑌𝑌𝑌𝑌 (𝜔𝜔) = |𝐻𝐻(𝜔𝜔)|2 𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔)
1
= 2 𝑆𝑆 (𝜔𝜔)
(𝛽𝛽 + 𝜔𝜔 2 ) 𝑋𝑋𝑋𝑋
(OR)
(b) (i) Assume a random process X(t) is given as input to a system with transfer
function 𝐻𝐻(𝜔𝜔) = 1 𝑓𝑓𝑓𝑓𝑓𝑓 − 𝜔𝜔0 < 𝜔𝜔 < 𝜔𝜔0 . If the autocorrelation function of the
𝑁𝑁
input is 0 𝛿𝛿(𝜏𝜏), find the autocorrelation function of the output process.
2
Ans: Given X(t) is the input process to a system to a system with system transfer
𝑁𝑁
function 𝐻𝐻(𝜔𝜔) = 1 𝑎𝑎𝑎𝑎𝑎𝑎 𝑅𝑅𝑋𝑋𝑋𝑋 (𝜏𝜏) = 0 𝛿𝛿(𝜏𝜏)
2
∞
𝑁𝑁0
𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) = � 𝛿𝛿(𝜏𝜏)𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑
−∞ 2
𝑁𝑁0 ∞ 𝑁𝑁0
= � 𝛿𝛿(𝜏𝜏)𝑒𝑒 − 𝑖𝑖𝑖𝑖𝑖𝑖 𝑑𝑑𝑑𝑑 = ×1 𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠𝑠[𝛿𝛿(𝜏𝜏)] = 1
2 −∞ 2
𝑁𝑁0
𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔) =
2
If the output process is Y(t) then 𝑆𝑆𝑌𝑌𝑌𝑌 (𝜔𝜔) = |𝐻𝐻(𝜔𝜔)|2 𝑆𝑆𝑋𝑋𝑋𝑋 (𝜔𝜔)
∴ S_YY(ω) = 1² × (N₀/2) = N₀/2,  −ω₀ < ω < ω₀
R_YY(τ) = (1/2π) ∫_{−∞}^{∞} S_YY(ω) e^(iωτ) dω
= (1/2π) ∫_{−ω₀}^{ω₀} (N₀/2) e^(iωτ) dω
= (N₀/4π) [e^(iωτ)/(iτ)]_{−ω₀}^{ω₀}
= (N₀/4π) [(e^(iω₀τ) − e^(−iω₀τ))/(iτ)]
= (N₀/2πτ) [(e^(iω₀τ) − e^(−iω₀τ))/(2i)]
∴ R_YY(τ) = (N₀/2πτ) sin(ω₀τ)