
The Binary Symmetric Channel (BSC) is an idealised model of a noisy channel.

binary: the alphabet is (0, 1)
symmetric: p(0 sent, 1 received) = p(1 sent, 0 received)

[Figure: BSC transition diagram. Inputs X0, X1 (with source probabilities P(X0), P(X1)) connect to outputs Y0, Y1 (with probabilities P(Y0), P(Y1)) via the transition probabilities P(Y0|X0), P(Y1|X0), P(Y0|X1), P(Y1|X1).]
Forward transition probabilities of the noisy binary symmetric channel.
Binary Symmetric Channel (BSC). Transmitted source symbols Xi, i = 0, 1; received symbols Yj, j = 0, 1. Given P(Xi), the source probability:
P(Yj|Xi) = Pe if i ≠ j;
P(Yj|Xi) = 1 - Pe if i = j;
where Pe is the given error rate.
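As a sketch of this definition (the names and the value of Pe below are illustrative, not from the notes), the forward transition probabilities can be laid out as a 2x2 table in Python:

```python
# Minimal sketch: BSC forward transition probabilities P(Yj|Xi)
# for an assumed example error rate Pe.
Pe = 0.1  # assumed error rate, for illustration only

# P_Y_given_X[i][j] = P(Yj | Xi): 1 - Pe on the diagonal (i == j), Pe off it.
P_Y_given_X = [[1 - Pe, Pe],
               [Pe, 1 - Pe]]

for i in (0, 1):
    for j in (0, 1):
        print(f"P(Y{j}|X{i}) = {P_Y_given_X[i][j]}")
```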

Calculate the average mutual information (the average amount of source information acquired per received symbol, as distinguished from that per source symbol, which was given by the entropy H(X)).

Step 1: P(Yj,Xi) = P(Xi) P(Yj|Xi);   i = 0,1;  j = 0,1.
Step 2: P(Yj) = Σi P(Yj,Xi);   j = 0,1.

(Logically, the probability of receiving a particular Yj is the sum of the joint probabilities over the range of Xi. For example, the probability of receiving a 1 is the probability of sending a 1 and receiving a 1 plus the probability of sending a 0 and receiving a 1; that is, the probability of sending a 1 and receiving it correctly plus the probability of sending a 0 and receiving it wrongly.)
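A minimal Python sketch of Steps 1 and 2, assuming an example source distribution P(X) and error rate Pe (both values are illustrative):

```python
# Sketch of Steps 1-2 with assumed values.
Pe = 0.1
P_X = [0.5, 0.5]                      # assumed P(X0), P(X1)
P_Y_given_X = [[1 - Pe, Pe],
               [Pe, 1 - Pe]]          # P(Yj|Xi)

# Step 1: joint probabilities P(Yj, Xi) = P(Xi) * P(Yj|Xi)
P_XY = [[P_X[i] * P_Y_given_X[i][j] for j in (0, 1)] for i in (0, 1)]

# Step 2: marginal P(Yj) = sum over i of P(Yj, Xi)
P_Y = [P_XY[0][j] + P_XY[1][j] for j in (0, 1)]

print(P_XY, P_Y)
```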

Step 3: I(Yj,Xi) = log { P(Xi|Yj) / P(Xi) }
               = log { P(Yj,Xi) / [P(Xi) P(Yj)] };   i = 0,1;  j = 0,1.

(This quantifies the amount of information conveyed when Xi is transmitted and Yj is received. Over a perfect noiseless channel it equals the self-information of Xi, because each received symbol uniquely identifies a transmitted symbol, so P(Xi|Yj) = 1. If the channel is very noisy, or communication breaks down, then P(Yj,Xi) = P(Xi)P(Yj) and the quantity is zero: no information has been transferred.)
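A hedged Python sketch of Step 3, assuming the joint and marginal tables from Steps 1-2 are available under the names used above:

```python
from math import log2

# Step 3 sketch: information conveyed about Xi by the event "Yj received".
def pair_information(P_XY, P_X, P_Y, i, j):
    # I(Yj, Xi) = log2{ P(Yj,Xi) / [P(Xi) P(Yj)] }
    return log2(P_XY[i][j] / (P_X[i] * P_Y[j]))
```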
Step 4: I(X,Y) = Σi Σj P(Yj,Xi) log { P(Xi|Yj) / P(Xi) }
              = Σi Σj P(Yj,Xi) log { P(Yj,Xi) / [P(Xi) P(Yj)] }
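And Step 4 as a sketch in the same assumed setting: the per-pair terms are averaged, weighted by the joint probabilities.

```python
from math import log2

# Step 4 sketch: average mutual information over all (i, j) pairs.
def average_mutual_information(P_XY, P_X, P_Y):
    return sum(P_XY[i][j] * log2(P_XY[i][j] / (P_X[i] * P_Y[j]))
               for i in (0, 1) for j in (0, 1))
```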

Equivocation
Represents the destructive effect of noise, i.e. the additional information needed to make the reception correct.
[Figure: the transmitter sends x over the noisy channel to the receiver, which outputs y; a hypothetical observer watches both x and y and reports over a separate noiseless channel.]
The observer looks at the transmitted and received digits; if they are the same, it reports a 1, if different, a 0.

The information sent by the observer is easily evaluated as
-[ p(0) log p(0) + p(1) log p(1) ]
applied to the binary string. The probability of a 0 is just the channel error probability.
Example: A binary system produces Marks and Spaces with equal probabilities, 1/8 of all pulses being received in error. Find the information sent by the observer.
The information sent by the observer is
-[ 7/8 log (7/8) + 1/8 log (1/8) ] = 0.54 bits.
Since the input information is 1 bit/symbol, the net information is 1 - 0.54 = 0.46 bits, agreeing with previous results.

The noise in the system has destroyed 0.54 bits of information; that is, the equivocation is 0.54 bits.
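A quick numerical check of this example (a sketch, using base-2 logarithms):

```python
from math import log2

# Equiprobable Marks/Spaces, error rate 1/8.
p_err = 1 / 8
observer_bits = -((1 - p_err) * log2(1 - p_err) + p_err * log2(p_err))

print(round(observer_bits, 2))      # ~0.54 bits sent by the observer (equivocation)
print(round(1 - observer_bits, 2))  # ~0.46 bits transferred by the noisy channel
```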
General expression for equivocation
Consider a specific pair of transmitted and received digits {x, y}:
1. Noisy channel: probability change p(x) → p(x|y).
2. Receiver: probability correction p(x|y) → 1.
The information provided by the observer = -log( p(x|y) ).
Averaging over all pairs, weighted by the probability of a given pair, gives the general expression for equivocation:
H(x|y) = - Σx Σy p(xy) log p(x|y)

The information transferred via the noisy channel (in the absence of the observer):

I(xy) = H(x) - H(x|y)

Information transfer:
I(xy) = H(x) - H(x|y) = H(y) - H(y|x)
where H(x) is the information in the noiseless system (source entropy) and H(x|y) is the information loss due to noise (equivocation).
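A small Python sketch of these two expressions, assuming the joint probabilities p(xy) and the conditional probabilities p(x|y) are supplied as dictionaries keyed by the (x, y) pair (names are illustrative):

```python
from math import log2

def equivocation(p_xy, p_x_given_y):
    # H(x|y) = - sum over (x, y) of p(xy) * log2 p(x|y)
    return -sum(p * log2(p_x_given_y[pair]) for pair, p in p_xy.items())

def information_transfer(H_x, p_xy, p_x_given_y):
    # I(xy) = H(x) - H(x|y)
    return H_x - equivocation(p_xy, p_x_given_y)
```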

Example: A binary system produces Marks with probability 0.7 and Spaces with probability 0.3; 2/7 of the Marks are received in error and 1/3 of the Spaces. Find the information transfer using the expression for equivocation.
x, y            P(x)   P(y)   P(x|y)   P(y|x)   P(xy)   I(xy) term
Mark,  Mark     0.7    0.6    5/6      5/7      0.5      0.126
Mark,  Space    0.7    0.4    1/2      2/7      0.2     -0.097
Space, Space    0.3    0.4    1/2      2/3      0.2      0.147
Space, Mark     0.3    0.6    1/6      1/3      0.1     -0.085

Each I(xy) term is P(xy) log2 [ P(x|y) / P(x) ]. Summing the terms gives the information transfer I(xy) = 0.126 - 0.097 + 0.147 - 0.085 ≈ 0.09 bits per symbol.
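A sketch reproducing the table and the total from the given data (the dictionary keys 'M' and 'S' are illustrative names for Mark and Space):

```python
from math import log2

p_x = {'M': 0.7, 'S': 0.3}                      # source probabilities
p_y_given_x = {('M', 'M'): 5/7, ('M', 'S'): 2/7,
               ('S', 'S'): 2/3, ('S', 'M'): 1/3}  # channel probabilities

# Joint probabilities p(xy) and received-symbol probabilities p(y)
p_xy = {(x, y): p_x[x] * p for (x, y), p in p_y_given_x.items()}
p_y = {'M': p_xy[('M', 'M')] + p_xy[('S', 'M')],
       'S': p_xy[('M', 'S')] + p_xy[('S', 'S')]}

# I(xy) terms: p(xy) * log2[ p(x|y) / p(x) ], with p(x|y) = p(xy)/p(y)
terms = {pair: p * log2(p / (p_x[pair[0]] * p_y[pair[1]]))
         for pair, p in p_xy.items()}

print({k: round(v, 3) for k, v in terms.items()})  # 0.126, -0.097, 0.147, -0.085
print(round(sum(terms.values()), 3))               # ~0.091 bits transferred
```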

Summary of basic formulae by Venn Diagram

[Venn diagram: the circles H(x) and H(y) overlap; the overlap is I(xy), the remainder of H(x) is H(x|y), and the remainder of H(y) is H(y|x).]

Source entropy:       H(x) = - Σ p(x) log p(x)
Receiver entropy:     H(y) = - Σ p(y) log p(y)
Equivocation:         H(x|y) = - Σ p(xy) log p(x|y)
                      H(y|x) = - Σ p(xy) log p(y|x)
Information transfer: I(xy) = Σ p(xy) log [ p(x|y) / p(x) ]
                            = Σ p(xy) log [ p(xy) / (p(x) p(y)) ]
                            = H(x) - H(x|y)
                            = H(y) - H(y|x)

Quantity                      Definition
Source information            I(Xi) = -log2 P(Xi)
Received information          I(Yj) = -log2 P(Yj)
Mutual information            I(Xi,Yj) = log2 [ P(Xi|Yj) / P(Xi) ]
Average mutual information    I(X,Y) = Σi Σj P(Xi,Yj) log2 [ P(Xi|Yj) / P(Xi) ]
                                     = Σi Σj P(Xi,Yj) log2 [ P(Yj|Xi) / P(Yj) ]
                                     = Σi Σj P(Xi,Yj) log2 [ P(Xi,Yj) / (P(Xi) P(Yj)) ]
Source entropy                H(X) = -Σi P(Xi) log2 P(Xi)
Destination entropy           H(Y) = -Σj P(Yj) log2 P(Yj)
Equivocation                  H(X|Y) = -Σi Σj P(Xi,Yj) log2 P(Xi|Yj)
Error entropy                 H(Y|X) = -Σi Σj P(Xi,Yj) log2 P(Yj|Xi)

Channel capacity
C = max I(xy),
that is, the maximum information transfer, the maximum being taken over the source probabilities p(x).
Binary Symmetric Channels
If the noise in the system is random, the probabilities of error in 0 and 1 are the same. The channel is then characterised by a single value p, the binary error probability.
Channel capacity of the BSC channel

I(xy) = H(x) - H(x|y)
      = H(y) - H(y|x)
      = H(y) + Σx Σy p(xy) log p(y|x)
      = H(y) + Σx p(x) Σy p(y|x) log p(y|x)
      = H(y) + ( p log p + (1-p) log (1-p) )
      = H(y) - H(p)

where H(p) = -( p log p + (1-p) log (1-p) ).

H(y) is at most 1 bit, and reaches 1 bit when the source symbols are equiprobable, so the channel capacity is C = max I(xy) = 1 - H(p).
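A short sketch evaluating 1 - H(p) for a few error rates (assuming, as above, equiprobable inputs so that H(y) = 1 bit); it also illustrates the statement that follows, that mutual information increases as the error rate decreases:

```python
from math import log2

def H(p):
    # Binary entropy H(p) = -(p log2 p + (1-p) log2 (1-p)), with H(0) = H(1) = 0.
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

for p in (0.0, 0.01, 0.1, 0.5):
    print(p, round(1 - H(p), 3))  # capacity falls as the error rate grows towards 0.5
```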

Mutual information increases as the error rate decreases.

[Figure: BSC transition diagram. The transmitted symbol x is 0 with probability p(0) and 1 with probability p(1) = 1 - p(0); each symbol reaches the receiver y correctly with probability 1 - p and is flipped with probability p.]
