Abstract
Key Words: Useful entropy, useful mean code word length, coding theorem.
INTRODUCTION
Consider the following model for a random experiment S:

S = (E; P; U), \qquad E = (E_1, E_2, \ldots, E_N),\; P = (p_1, p_2, \ldots, p_N),\; U = (u_1, u_2, \ldots, u_N), \qquad (1.1)

where p_i \ge 0, \sum_{i=1}^{N} p_i = 1, and u_i > 0 is the utility attached to the outcome E_i. We call (1.1) a utility information scheme. Belis and Guiasu [1] introduced the measure

H(P;U) = -\sum_{i=1}^{N} u_i p_i \log p_i \qquad (1.2)

for the scheme (1.1) and called it the useful information of the scheme. H(P;U) reduces to Shannon's [10] entropy when u_i = 1 for each i, i.e., when the utility aspect of the scheme is ignored. Unless otherwise stated, \sum will stand for \sum_{i=1}^{N}, and the logarithms are to the base D (D > 1) throughout the paper.
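As a quick numerical illustration of (1.2) (the probabilities and utilities below are invented for the example), the useful information can be computed directly; with all utilities equal to 1 it collapses to Shannon's entropy of P:

```python
import math

def useful_information(p, u, D=2):
    """Belis-Guiasu useful information H(P;U) = -sum_i u_i p_i log_D p_i."""
    return -sum(ui * pi * math.log(pi, D) for pi, ui in zip(p, u) if pi > 0)

p = [0.5, 0.25, 0.25]   # probability distribution, sums to 1
u = [3.0, 1.0, 0.5]     # utilities of the outcomes (illustrative values)

print(useful_information(p, u))          # utility-weighted information, 2.25 here
print(useful_information(p, [1, 1, 1]))  # Shannon entropy of P: 1.5 bits
```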
Guiasu and Picard [4] considered the problem of encoding the outcomes of (1.1) by means of a prefix code with code words w_1, w_2, \ldots, w_N having lengths n_1, n_2, \ldots, n_N respectively and satisfying Kraft's inequality [3]

\sum_{i=1}^{N} D^{-n_i} \le 1, \qquad (1.3)

where D is the size of the code alphabet. They introduced the following useful mean length of the code:

L(U) = \frac{\sum_{i=1}^{N} u_i p_i n_i}{\sum_{i=1}^{N} u_i p_i}. \qquad (1.4)
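For a concrete check (binary alphabet, D = 2; the numbers are again illustrative), the lengths n = (1, 2, 2) — e.g. the code words 0, 10, 11 — satisfy (1.3) with equality, and (1.4) is the utility-weighted average code word length:

```python
def kraft_sum(n, D=2):
    """Left-hand side of Kraft's inequality (1.3)."""
    return sum(D ** -ni for ni in n)

def useful_mean_length(p, u, n):
    """Guiasu-Picard useful mean length (1.4)."""
    num = sum(ui * pi * ni for pi, ui, ni in zip(p, u, n))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return num / den

p = [0.5, 0.25, 0.25]
u = [3.0, 1.0, 0.5]
n = [1, 2, 2]        # lengths of a binary prefix code

print(kraft_sum(n))                 # 1.0, so (1.3) holds with equality
print(useful_mean_length(p, u, n))  # 1.2 for these numbers
```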
2. Coding Theorem
Consider the function

H_{\alpha}^{\beta}(P;U) = \frac{1}{D^{1-\alpha}-1}\left[\left(\frac{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}}{\sum_{i=1}^{N} u_i\, p_i^{\beta}}\right)^{\frac{1}{\beta}} - 1\right], \qquad \alpha > 0\ (\alpha \ne 1),\ \beta > 0. \qquad (2.1)

Further, consider the generalized useful mean code word length

L_{\alpha}^{\beta}(U) = \frac{1}{D^{1-\alpha}-1}\left[\left(\sum_{i=1}^{N}\left(\frac{u_i}{\sum_{i=1}^{N} u_i\, p_i^{\beta}}\right)^{\frac{1}{\alpha}} p_i^{\beta}\, D^{\frac{n_i(1-\alpha)}{\alpha}}\right)^{\frac{\alpha}{\beta}} - 1\right]. \qquad (2.2)
(i) For \beta = 1, the length (2.2) reduces to the useful mean length of the code given by Bhatia [2].
(ii) For \alpha \to 1 and \beta = 1, the length (2.2) reduces to the useful mean length L(U) of the code given in (1.4).
(iii) When the utility aspect of the scheme is ignored by taking u_i = 1 for each i, with \sum p_i = 1, \alpha \to 1 and \beta = 1, the mean length becomes the optimal code length defined by Shannon [10].
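To make the definitions concrete, the sketch below evaluates one reading of (2.1) and (2.2) — the order-α, type-β forms with the factor 1/(D^{1-α} − 1), hard-coded as an assumption to be checked against the printed formulas — together with a spot check of Theorem 2.1 below for Kraft-satisfying lengths; probabilities, utilities and lengths are illustrative:

```python
def H_ab(p, u, alpha, beta, D=2):
    """Assumed form of (2.1): generalized useful entropy of order alpha, type beta."""
    num = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    return ((num / den) ** (1 / beta) - 1) / (D ** (1 - alpha) - 1)

def L_ab(p, u, n, alpha, beta, D=2):
    """Assumed form of (2.2): generalized useful mean code word length."""
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    s = sum((ui / den) ** (1 / alpha) * pi ** beta * D ** (ni * (1 - alpha) / alpha)
            for pi, ui, ni in zip(p, u, n))
    return (s ** (alpha / beta) - 1) / (D ** (1 - alpha) - 1)

p = [0.5, 0.25, 0.25]
u = [3.0, 1.0, 0.5]
n = [1, 2, 2]                         # satisfy Kraft's inequality (1.3) for D = 2

for alpha in (0.5, 2.0):
    for beta in (0.5, 1.0, 2.0):
        h, l = H_ab(p, u, alpha, beta), L_ab(p, u, n, alpha, beta)
        assert l >= h - 1e-12         # L >= H whenever (1.3) holds
```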
THEOREM 2.1:- For all integers D > 1, let the lengths n_i satisfy (1.3). Then the generalized average useful code word length satisfies

L_{\alpha}^{\beta}(U) \ge H_{\alpha}^{\beta}(P;U), \qquad (2.3)

with equality if and only if

n_i = \log_D \frac{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}}{u_i\, p_i^{\alpha\beta}}. \qquad (2.4)
Proof:-
By Hölder's inequality [10],

\sum_{i=1}^{N} x_i y_i \ge \left(\sum_{i=1}^{N} x_i^{p}\right)^{\frac{1}{p}} \left(\sum_{i=1}^{N} y_i^{q}\right)^{\frac{1}{q}}, \qquad (2.5)
50 P. Jha and Anjali Chandravanshi
for all x_i, y_i > 0, i = 1, 2, \ldots, N, where \frac{1}{p} + \frac{1}{q} = 1 with p < 1\ (p \ne 0),\ q < 0, or q < 1\ (q \ne 0),\ p < 0.
We see that equality holds in (2.5) if and only if there exists a positive constant c such that

x_i^{p} = c\, y_i^{q}. \qquad (2.6)

Putting x_i = \left(u_i\, p_i^{\alpha\beta}\right)^{\frac{1}{\alpha}}, y_i = D^{\frac{n_i(1-\alpha)}{\alpha}}, p = \alpha and q = \frac{\alpha}{\alpha-1} in (2.5) and (2.6), using (1.3) and after making suitable operations, we get (2.3), since D^{1-\alpha} - 1 \gtrless 0 according as \alpha \lessgtr 1.
It is clear that equality in (2.3) holds if and only if

D^{-n_i} = \frac{u_i\, p_i^{\alpha\beta}}{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}},

or

n_i = \log_D \frac{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}}{u_i\, p_i^{\alpha\beta}},

which is (2.4).
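The equality case can be verified numerically. The sketch below hard-codes one reading of (2.1) and (2.2) — the order-α, type-β forms with the factor 1/(D^{1-α} − 1), an assumption to be checked against the printed formulas — and shows that the ideal (generally non-integer) lengths n_i = log_D(Σ u_j p_j^{αβ} / (u_i p_i^{αβ})) meet (1.3) with equality and make the length coincide with the entropy; the data are illustrative:

```python
import math

def H_ab(p, u, alpha, beta, D=2):
    """Assumed form of (2.1)."""
    num = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    return ((num / den) ** (1 / beta) - 1) / (D ** (1 - alpha) - 1)

def L_ab(p, u, n, alpha, beta, D=2):
    """Assumed form of (2.2); n may be non-integer here."""
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    s = sum((ui / den) ** (1 / alpha) * pi ** beta * D ** (ni * (1 - alpha) / alpha)
            for pi, ui, ni in zip(p, u, n))
    return (s ** (alpha / beta) - 1) / (D ** (1 - alpha) - 1)

p = [0.5, 0.25, 0.25]
u = [3.0, 1.0, 0.5]
D, alpha, beta = 2, 0.7, 1.5          # illustrative parameters

T = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
n_ideal = [math.log(T / (ui * pi ** (alpha * beta)), D) for pi, ui in zip(p, u)]

# The ideal lengths meet Kraft's inequality (1.3) with equality ...
assert abs(sum(D ** -ni for ni in n_ideal) - 1) < 1e-9
# ... and attain equality in (2.3): L = H.
assert abs(L_ab(p, u, n_ideal, alpha, beta) - H_ab(p, u, alpha, beta)) < 1e-9
```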
THEOREM 2.2:- For every code with lengths n_i, i = 1, 2, \ldots, N, of Theorem 2.1, L_{\alpha}^{\beta}(U) can be made to satisfy

H_{\alpha}^{\beta}(P;U) \le L_{\alpha}^{\beta}(U) < D^{\frac{1-\alpha}{\beta}}\, H_{\alpha}^{\beta}(P;U) + \frac{D^{\frac{1-\alpha}{\beta}} - 1}{D^{1-\alpha} - 1}. \qquad (2.7)
Proof:- Choose the code word lengths n_i to be the smallest integers satisfying

\frac{u_i\, p_i^{\alpha\beta}}{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}} \ge D^{-n_i} > \frac{1}{D}\,\frac{u_i\, p_i^{\alpha\beta}}{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}}. \qquad (2.8)

Summing the left-hand inequality of (2.8) over i = 1, 2, \ldots, N shows that these n_i satisfy (1.3), so the lower bound in (2.7) follows from Theorem 2.1. Raising the right-hand inequality of (2.8) to the power \frac{1-\alpha}{\alpha} gives

D^{\frac{n_i(1-\alpha)}{\alpha}} < D^{\frac{1-\alpha}{\alpha}} \left(\frac{\sum_{i=1}^{N} u_i\, p_i^{\alpha\beta}}{u_i\, p_i^{\alpha\beta}}\right)^{\frac{1-\alpha}{\alpha}} \qquad (2.9)

for 0 < \alpha < 1; for \alpha > 1 the inequality in (2.9) is reversed, which the negative factor \frac{1}{D^{1-\alpha}-1} compensates. Multiplying both sides of (2.9) by \left(\frac{u_i}{\sum_{i=1}^{N} u_i\, p_i^{\beta}}\right)^{\frac{1}{\alpha}} p_i^{\beta}, summing over i = 1, 2, \ldots, N and making suitable operations, we get

L_{\alpha}^{\beta}(U) < D^{\frac{1-\alpha}{\beta}}\, H_{\alpha}^{\beta}(P;U) + \frac{D^{\frac{1-\alpha}{\beta}} - 1}{D^{1-\alpha} - 1}. \qquad (2.10)
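The proof suggests a direct numerical check of the two-sided bound (2.7): pick the smallest integer lengths satisfying (2.8) and compare. The sketch below again hard-codes assumed order-α, type-β readings of (2.1) and (2.2) (to be checked against the printed formulas); the data are illustrative:

```python
import math

def H_ab(p, u, alpha, beta, D=2):
    """Assumed form of (2.1)."""
    num = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    return ((num / den) ** (1 / beta) - 1) / (D ** (1 - alpha) - 1)

def L_ab(p, u, n, alpha, beta, D=2):
    """Assumed form of (2.2)."""
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    s = sum((ui / den) ** (1 / alpha) * pi ** beta * D ** (ni * (1 - alpha) / alpha)
            for pi, ui, ni in zip(p, u, n))
    return (s ** (alpha / beta) - 1) / (D ** (1 - alpha) - 1)

p = [0.5, 0.3, 0.2]
u = [2.0, 1.0, 4.0]
D = 2

for alpha in (0.5, 2.0):
    for beta in (1.0, 2.0):
        T = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
        # Smallest integers with D^{-n_i} <= u_i p_i^{alpha*beta} / T, i.e. (2.8):
        n = [math.ceil(math.log(T / (ui * pi ** (alpha * beta)), D))
             for pi, ui in zip(p, u)]
        h = H_ab(p, u, alpha, beta, D)
        l = L_ab(p, u, n, alpha, beta, D)
        upper = (D ** ((1 - alpha) / beta) * h
                 + (D ** ((1 - alpha) / beta) - 1) / (D ** (1 - alpha) - 1))
        assert h - 1e-12 <= l < upper + 1e-12   # the two-sided bound (2.7)
```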
References
[1] M. Belis and S. Guiasu, A quantitative-qualitative measure of information in cybernetic systems, IEEE Trans. Information Theory, IT-14 (1968), 593-594.
[2] P. K. Bhatia, On generalized useful inaccuracy for incomplete probability distribution, Soochow J. Math., 25(2) (1999), 131-135.
[3] A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York, (1958).