[Figure: forward transition probabilities of the noisy binary symmetric channel. Inputs $X_0$, $X_1$ (probabilities $P(X_0)$, $P(X_1)$) map to outputs $Y_0$, $Y_1$ (probabilities $P(Y_0)$, $P(Y_1)$) through the transition probabilities $P(Y_0 \mid X_0)$, $P(Y_1 \mid X_0)$, $P(Y_0 \mid X_1)$, $P(Y_1 \mid X_1)$.]
Binary Symmetric Channel (BSC). The transmitted source symbols are $X_i$, $i = 0, 1$, and the received symbols are $Y_j$, $j = 0, 1$. Given the source probabilities $P(X_i)$, the forward transition probabilities are
$$P(Y_j \mid X_i) = \begin{cases} P_e & \text{if } i \neq j \\ 1 - P_e & \text{if } i = j \end{cases}$$
where $P_e$ is the given error rate.
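As a quick illustration, here is a minimal Python sketch of these forward probabilities; the error rate $P_e = 0.1$ and the uniform source distribution are assumed values, not taken from the notes:

```python
# BSC forward probabilities: P(Yj|Xi) = 1 - Pe on the diagonal, Pe off it.
Pe = 0.1                 # assumed error rate (illustrative value)
P_X = [0.5, 0.5]         # assumed source probabilities P(X0), P(X1)

# Transition matrix P(Yj | Xi).
P_Y_given_X = [[1 - Pe if i == j else Pe for j in range(2)] for i in range(2)]

# Received-symbol probabilities: P(Yj) = sum_i P(Yj | Xi) P(Xi).
P_Y = [sum(P_Y_given_X[i][j] * P_X[i] for i in range(2)) for j in range(2)]

print(P_Y_given_X)       # [[0.9, 0.1], [0.1, 0.9]]
print(P_Y)               # [0.5, 0.5]: a uniform source stays uniform at the output
```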
Equivocation
Equivocation represents the destructive effect of noise: the additional information needed to make the reception correct.
[Figure: the transmitter sends $x$ through the noisy channel to the receiver, which outputs $y$; a hypothetical observer sees both $x$ and $y$ and supplies the correction data to the receiver over a separate noiseless channel.]
$$H(x \mid y) = -\sum_{x,y} p(x, y) \log p(x \mid y)$$
Information transfer: $I(xy) = H(x) - H(x \mid y)$, the information in the noiseless system (the source entropy) minus the equivocation; equivalently, $I(xy) = H(y) - H(y \mid x)$.
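Both identities follow from the definitions; a short derivation of the first (a standard step, spelled out here for completeness):
$$H(x) - H(x \mid y) = -\sum_{x,y} p(x, y) \log p(x) + \sum_{x,y} p(x, y) \log p(x \mid y) = \sum_{x,y} p(x, y) \log \frac{p(x \mid y)}{p(x)} = I(xy),$$
using $\sum_y p(x, y) = p(x)$; the second identity follows because $\frac{p(x \mid y)}{p(x)} = \frac{p(x, y)}{p(x)\,p(y)}$ is symmetric in $x$ and $y$.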
Worked example. For a binary channel with the joint distribution $p(x, y)$ below:

(x, y):       (0,0)   (0,1)   (1,0)   (1,1)
P(x):         0.7     0.7     0.3     0.3
P(y):         0.6     0.4     0.6     0.4
P(xy):        0.5     0.2     0.1     0.2
P(x|y):       5/6     1/2     1/6     1/2
P(y|x):       5/7     2/7     1/3     2/3
I(xy) terms:  0.126   -0.097  -0.085  0.147

Summing the $I(xy)$ terms gives $I(xy) \approx 0.091$ bit per symbol.
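The table can be checked mechanically. A Python sketch, starting from the joint distribution above (rows are $x = 0, 1$, columns $y = 0, 1$):

```python
from math import log2

# Joint distribution p(x, y) from the worked example.
P_xy = [[0.5, 0.2],
        [0.1, 0.2]]

P_x = [sum(row) for row in P_xy]                              # [0.7, 0.3]
P_y = [sum(P_xy[i][j] for i in range(2)) for j in range(2)]   # [0.6, 0.4]

# Per-pair terms p(x,y) * log2[ p(x,y) / (p(x) p(y)) ].
terms = [[P_xy[i][j] * log2(P_xy[i][j] / (P_x[i] * P_y[j]))
          for j in range(2)] for i in range(2)]
I_xy = sum(sum(row) for row in terms)
print(I_xy)                                                   # about 0.091

# Cross-check against I(xy) = H(x) - H(x|y).
H_x = -sum(p * log2(p) for p in P_x)
H_x_given_y = -sum(P_xy[i][j] * log2(P_xy[i][j] / P_y[j])
                   for i in range(2) for j in range(2))
assert abs(I_xy - (H_x - H_x_given_y)) < 1e-12
```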
[Figure: relation between the quantities: $H(x)$ and $H(y)$ overlap in the information transfer $I(xy)$; the part of $H(x)$ outside the overlap is the equivocation $H(x \mid y)$, and the part of $H(y)$ outside it is the error entropy $H(y \mid x)$.]
Equivocation: $H(x \mid y) = -\sum_{x,y} p(x, y) \log p(x \mid y)$
Error entropy: $H(y \mid x) = -\sum_{x,y} p(x, y) \log p(y \mid x)$
Information transfer: $I(xy) = \sum_{x,y} p(x, y) \log \frac{p(x \mid y)}{p(x)} = \sum_{x,y} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)}$
Quantity: Definition

Source information: $I(X_i) = -\log_2 P(X_i)$
Received information: $I(Y_j) = -\log_2 P(Y_j)$
Mutual information: $I(X_i, Y_j) = \log_2 \frac{P(X_i \mid Y_j)}{P(X_i)}$
Average mutual information: $I(X, Y) = \sum_X \sum_Y P(X_i, Y_j) \log_2 \frac{P(X_i \mid Y_j)}{P(X_i)} = \sum_X \sum_Y P(X_i, Y_j) \log_2 \frac{P(Y_j \mid X_i)}{P(Y_j)} = \sum_X \sum_Y P(X_i, Y_j) \log_2 \frac{P(X_i, Y_j)}{P(X_i)\,P(Y_j)}$
Source entropy: $H(X) = -\sum_X P(X_i) \log_2 P(X_i)$
Destination entropy: $H(Y) = -\sum_Y P(Y_j) \log_2 P(Y_j)$
Equivocation: $H(X \mid Y) = -\sum_X \sum_Y P(X_i, Y_j) \log_2 P(X_i \mid Y_j)$
Error entropy: $H(Y \mid X) = -\sum_X \sum_Y P(X_i, Y_j) \log_2 P(Y_j \mid X_i)$
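A sketch of these definitions as reusable Python functions over a joint distribution P[i][j] = P(Xi, Yj); the function names are mine, not from the notes:

```python
from math import log2

def marginals(P):
    """Row marginals P(Xi) and column marginals P(Yj)."""
    Px = [sum(row) for row in P]
    Py = [sum(P[i][j] for i in range(len(P))) for j in range(len(P[0]))]
    return Px, Py

def entropy(dist):
    """H = -sum p log2 p (source or destination entropy)."""
    return -sum(p * log2(p) for p in dist if p > 0)

def equivocation(P):
    """H(X|Y) = -sum_ij P(Xi,Yj) log2 P(Xi|Yj)."""
    _, Py = marginals(P)
    return -sum(P[i][j] * log2(P[i][j] / Py[j])
                for i in range(len(P)) for j in range(len(P[0])) if P[i][j] > 0)

def error_entropy(P):
    """H(Y|X) = -sum_ij P(Xi,Yj) log2 P(Yj|Xi)."""
    Px, _ = marginals(P)
    return -sum(P[i][j] * log2(P[i][j] / Px[i])
                for i in range(len(P)) for j in range(len(P[0])) if P[i][j] > 0)

def avg_mutual_information(P):
    """I(X,Y) = sum_ij P(Xi,Yj) log2 [P(Xi,Yj) / (P(Xi)P(Yj))]."""
    Px, Py = marginals(P)
    return sum(P[i][j] * log2(P[i][j] / (Px[i] * Py[j]))
               for i in range(len(P)) for j in range(len(P[0])) if P[i][j] > 0)
```

For the worked example above, `avg_mutual_information([[0.5, 0.2], [0.1, 0.2]])` returns about 0.091, matching the sum of the $I(xy)$ terms.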
Channel capacity
$C = \max I(xy)$, that is, the maximum information transfer, with the maximum taken over the source distribution $p(x)$.
Binary Symmetric Channels
Since the noise in the system is random, the probabilities of error in 0 and 1 are the same. The channel is therefore characterised by a single value, the binary error probability $p$.
Channel capacity of this channel:
$$I(xy) = H(x) - H(x \mid y) = H(y) - H(y \mid x)$$
$$= H(y) + \sum_{x,y} p(x, y) \log p(y \mid x)$$
$$= H(y) + \sum_x p(x) \sum_y p(y \mid x) \log p(y \mid x)$$
For the BSC the inner sum is the same for both inputs, $p \log p + (1 - p) \log(1 - p)$, so
$$I(xy) = H(y) + p \log p + (1 - p) \log(1 - p).$$
This is maximised when the received symbols are equiprobable ($H(y) = 1$), giving the capacity $C = 1 + p \log_2 p + (1 - p) \log_2(1 - p) = 1 - H(p)$.
[Figure: the binary symmetric channel. Input $x$ (transmit) has probabilities $p(0)$ and $p(1) = 1 - p(0)$; output $y$ (receive); each symbol is received in error with probability $p$ and correctly with probability $1 - p$.]
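A numerical check of this result; `h2` and `bsc_information` are my names for the binary entropy and for the information transfer as a function of the input distribution (the error probability 0.1 is an illustrative value):

```python
from math import log2

def h2(p):
    """Binary entropy H(p) = -p log2 p - (1-p) log2 (1-p)."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_information(p0, pe):
    """I(xy) = H(y) - H(pe) for a BSC with P(x=0) = p0 and error probability pe."""
    py0 = p0 * (1 - pe) + (1 - p0) * pe      # P(y = 0)
    return h2(py0) - h2(pe)

pe = 0.1
# Scan input distributions: the maximum sits at p0 = 0.5, where H(y) = 1.
best = max((bsc_information(p0 / 100, pe), p0 / 100) for p0 in range(1, 100))
print(best)            # about (0.531, 0.5)
print(1 - h2(pe))      # C = 1 - H(0.1), also about 0.531
```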