Academic documents
Professional documents
Cultural documents
Telecommunication Performance
[Block diagram: transmitter (Tx) to receiver (Rx)]
Assume a discrete source emitting messages
X ∈ {X1, X2, X3, X4, X5, X6, X7, X8}
with probabilities
P(X) ∈ {1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64}
What is the entropy of this source? What is the information content of each message?
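The exercise above can be worked out directly: each message carries self-information I(x) = -log2 P(x) bits, and the entropy is the probability-weighted average H(X) = Σ P(x) I(x). A minimal sketch, assuming the eight probabilities listed above:

```python
import math

# Probabilities of the eight messages X1..X8 (they sum to 1)
probs = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]

# Self-information of each message: I(x) = -log2 P(x), in bits
info = [-math.log2(p) for p in probs]

# Source entropy: H(X) = sum over x of P(x) * I(x), in bits per message
entropy = sum(p * i for p, i in zip(probs, info))

print(info)     # [1.0, 2.0, 3.0, 4.0, 6.0, 6.0, 6.0, 6.0]
print(entropy)  # 2.0
```

Since every probability here is a power of two, the self-information values are whole numbers of bits and the entropy comes out to exactly 2 bits per message.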
DEFINITION OF INFORMATION