Inequality of information entropy, gamma function, beta function, binomial coefficients
We consider the cross entropy of probability density functions \(P(x)\) and \(Q(x)\),
\[
H(P,Q) = -\int P(x)\ln Q(x)\,dx .
\]
Here, \(\ln\) is the natural logarithm.
1) Theorem
Define
\[
\int P(x)\,Q(x)^{s}\,dx
\]
and assume that \(P(x)\,Q(x)^{s}\) is integrable.
If \(s>0\),
\[
H(P,Q)\ \ge\ -\frac{1}{s}\ln\int P(x)\,Q(x)^{s}\,dx ,
\]
and if \(s<0\),
\[
H(P,Q)\ \le\ -\frac{1}{s}\ln\int P(x)\,Q(x)^{s}\,dx .
\]
This is equivalent to
\[
\int P(x)\,Q(x)^{s}\,dx\ \ge\ e^{-sH(P,Q)} \qquad \text{eq(1)}
\]
for \(s<0\) and \(0<s\).
Furthermore, letting \(s\to 0\),
\[
\lim_{s\to 0}\left(-\frac{1}{s}\ln\int P(x)\,Q(x)^{s}\,dx\right)=H(P,Q) .
\]
If \(P(x)=Q(x)\), we define \(H(X)\) as the entropy
\[
H(X)=-\int P(x)\ln P(x)\,dx ,
\]
and the same inequalities hold with \(H(P,Q)\) replaced by \(H(X)\), for \(-1<s<0\) and \(0<s\).
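For a concrete illustration of eq(1), one can take the exponential densities \(P(x)=e^{-x}\) and \(Q(x)=2e^{-2x}\) on \(x>0\) (this example is added only as an illustration and is not used later). Then
\[
H(P,Q)=-\int_0^\infty e^{-x}\ln\!\left(2e^{-2x}\right)dx=2-\ln 2 ,
\qquad
\int_0^\infty P(x)\,Q(x)^{s}\,dx=\frac{2^{s}}{1+2s}\quad\left(s>-\tfrac12\right),
\]
and eq(1) reduces to \(e^{2s}\ge 1+2s\), which indeed holds for all \(s>-\tfrac12\).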
2) Corollaries
Corollary 1.
for \(x,y>0\). Here, \(\psi\) is the digamma function.
In particular, when \(x\) is an integer \(n\),
Here, \(H_n\) is the harmonic number \(H_n=1+\tfrac12+\cdots+\tfrac1n\).
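Recall that at integer arguments the digamma function reduces to harmonic numbers, \(\psi(n+1)=H_n-\gamma\), where \(\gamma\) is the Euler–Mascheroni constant; for example, \(\psi(4)=H_3-\gamma=\tfrac{11}{6}-\gamma\approx1.256\). This is how the harmonic numbers enter the integer case.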
Corollary 2.
for s>-1
Corollary 3.
for s>-1
Proof of Theorem
We apply Jensen's inequality to the convex function \(e^{su}\), with \(u=\ln Q(x)\) and the expectation taken with respect to \(P(x)\); we derive
\[
\int P(x)\,Q(x)^{s}\,dx=\int P(x)\,e^{s\ln Q(x)}\,dx\ \ge\ \exp\!\left(s\int P(x)\ln Q(x)\,dx\right)=e^{-sH(P,Q)} .
\]
We take the logarithm of both sides and divide by \(s\), and we get the result.
Note that the inequality is reversed according to the sign of \(s\).
\(\ln\int P(x)\,Q(x)^{s}\,dx\) equals \(0\) if \(s=0\).
Hence the limit as \(s\to 0\) of \(-\frac{1}{s}\ln\int P(x)\,Q(x)^{s}\,dx\) equals minus the differential of \(\ln\int P(x)\,Q(x)^{s}\,dx\) with respect to \(s\) at \(s=0\).
We differentiate with respect to \(s\) and substitute \(s=0\); the formula equals
\[
-\left.\frac{d}{ds}\ln\int P(x)\,Q(x)^{s}\,dx\right|_{s=0}
=-\int P(x)\ln Q(x)\,dx=H(P,Q) .
\]
From eq(1), with \(Q(x)=P(x)\),
\[
\int P(x)^{1+s}\,dx\ \ge\ e^{-sH(X)} \qquad \text{eq(2)}
\]
for \(s>-1\).
Here, \(H(X)\) is the entropy.
We derive some inequalities by using eq(2).
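Eq(2) can also be checked numerically. The following short script is a sketch added only for illustration; it assumes NumPy and SciPy are available, the helper check_eq2 is defined here solely for this check, and the distributions and parameter values are arbitrary choices.

# Numerical sanity check of eq(2): integral of P(x)^(1+s) dx >= exp(-s H(X)) for s > -1.
# Illustration only: distributions and parameters are arbitrary choices.
import numpy as np
from scipy import integrate, stats

def check_eq2(dist, support, s_values):
    H = float(dist.entropy())  # differential entropy in nats (natural logarithm)
    for s in s_values:
        lhs, _ = integrate.quad(lambda x: dist.pdf(x) ** (1.0 + s), *support)
        rhs = np.exp(-s * H)
        print(f"s={s:+.2f}  int P^(1+s)={lhs:.6f}  exp(-sH)={rhs:.6f}  ok={lhs >= rhs - 1e-9}")

s_values = [-0.5, -0.2, 0.3, 1.0, 2.0]
check_eq2(stats.gamma(a=3.0), (0.0, np.inf), s_values)   # gamma distribution, shape 3
check_eq2(stats.beta(2.0, 5.0), (0.0, 1.0), s_values)    # beta distribution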
Proof of Corollary 1
We apply eq(2) to the gamma distribution.
For \(x>0\), take the gamma density \(P(t)=\dfrac{t^{x-1}e^{-t}}{\Gamma(x)}\) for \(t>0\); its entropy is
\[
H(X)=x+\ln\Gamma(x)+(1-x)\psi(x) .
\]
For \(1+s>0\), we calculate \(\int_0^\infty P(t)^{1+s}\,dt\).
Transforming the integral by the substitution \(u=(1+s)t\),
\[
\int_0^\infty P(t)^{1+s}\,dt=\frac{\Gamma\big((x-1)(1+s)+1\big)}{\Gamma(x)^{1+s}\,(1+s)^{(x-1)(1+s)+1}} .
\]
We substitute this into eq(2),
\[
\frac{\Gamma\big((x-1)(1+s)+1\big)}{\Gamma(x)^{1+s}\,(1+s)^{(x-1)(1+s)+1}}\ \ge\ e^{-s\left(x+\ln\Gamma(x)+(1-x)\psi(x)\right)}
\]
for \(s>-1\).
Here, \(\psi(x)=\dfrac{d}{dx}\ln\Gamma(x)\) is the digamma function.
Putting suitable substitutions for the parameters, we derive the inequality stated in Corollary 1 for \(x,y>0\).
In particular, when \(x\) is an integer \(n\), the digamma value becomes a harmonic number minus \(\gamma\).
Here, \(H_n\) is the harmonic number.
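As a numerical check of the inequality obtained above from the gamma distribution (the values \(x=3\), \(s=1\) are chosen only for illustration), using \(\psi(3)=\tfrac32-\gamma\approx0.923\),
\[
\frac{\Gamma(5)}{\Gamma(3)^{2}\,2^{5}}=\frac{24}{128}=0.1875
\ \ge\
e^{-\left(3+\ln 2-2\psi(3)\right)}\approx 0.158 .
\]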
Next, we apply eq(2) to the Rayleigh distribution \(P(t)=\dfrac{t}{\sigma^{2}}e^{-t^{2}/(2\sigma^{2})}\) for \(t>0\), whose entropy is \(1+\ln\dfrac{\sigma}{\sqrt2}+\dfrac{\gamma}{2}\), where \(\gamma\) is the Euler–Mascheroni constant.
Substitute into \(\int_0^\infty P(t)^{1+s}\,dt\) and transform the integral; integrating from \(0\) to \(\infty\),
\[
\int_0^\infty P(t)^{1+s}\,dt=\frac{\Gamma\!\left(1+\tfrac s2\right)2^{s/2}\,\sigma^{-s}}{(1+s)^{1+s/2}} .
\]
Substituting into eq(2), we can derive
\[
\Gamma\!\left(1+\tfrac s2\right)\ \ge\ (1+s)^{1+\frac s2}\,e^{-s\left(1+\frac{\gamma}{2}\right)} .
\]
We put \(s=2x\), then
\[
\Gamma(1+x)\ \ge\ (1+2x)^{1+x}\,e^{-x(2+\gamma)}
\]
for \(-\tfrac12<x\).
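As a quick check of this bound (values chosen only for illustration): at \(x=1\), \(\Gamma(2)=1\ge 3^{2}e^{-(2+\gamma)}\approx0.68\), and at \(x=3\), \(\Gamma(4)=6\ge 7^{4}e^{-3(2+\gamma)}\approx1.05\).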
Proof of Corollary 2
We apply eq(2) to the binomial distribution \(P(k)=\dbinom{n}{k}p^{k}(1-p)^{n-k}\); for a probability mass function the integral in eq(2) is replaced by a sum,
\[
\sum_{k}P(k)^{1+s}\ \ge\ e^{-sH(X)} .
\]
We substitute the binomial probabilities into this inequality, and obtain Corollary 2
for \(s>-1\).
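The discrete form of eq(2) used here can also be verified numerically. The following sketch is added only as an illustration; it assumes NumPy and SciPy, and the parameters \(n=10\), \(p=0.3\) are arbitrary choices.

# Check of the discrete form of eq(2): sum_k P(k)^(1+s) >= exp(-s H) for s > -1.
# Illustration only: n and p are arbitrary choices.
import numpy as np
from scipy import stats

n, p = 10, 0.3
pmf = stats.binom(n, p).pmf(np.arange(n + 1))
H = -np.sum(pmf * np.log(pmf))  # Shannon entropy in nats
for s in [-0.5, 0.5, 1.0, 3.0]:
    lhs = np.sum(pmf ** (1.0 + s))
    rhs = np.exp(-s * H)
    print(f"s={s:+.1f}  sum P^(1+s)={lhs:.6f}  exp(-sH)={rhs:.6f}  ok={lhs >= rhs}")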
Proof of Corollary 3
We apply eq(2) to the beta distribution \(P(t)=\dfrac{t^{a-1}(1-t)^{b-1}}{B(a,b)}\) on \(0<t<1\), where \(B(a,b)\) is the beta function.
Integrating from \(0\) to \(1\),
\[
\int_0^1 P(t)^{1+s}\,dt=\frac{B\big((a-1)(1+s)+1,\ (b-1)(1+s)+1\big)}{B(a,b)^{1+s}} .
\]
We substitute this into eq(2), which gives the inequality of Corollary 3
for \(s>-1\).
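As a concrete instance (added only as a check), take the beta density with parameters \(2,2\), \(P(t)=6t(1-t)\), and \(s=1\):
\[
\int_0^1 P(t)^{2}\,dt=36\,B(3,3)=\frac{6}{5}=1.2 ,
\qquad
H(X)=\frac{5}{3}-\ln 6\approx-0.125 ,
\]
so eq(2) reads \(1.2\ \ge\ e^{-H(X)}\approx1.13\).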