Entropy inequalities
Definition.
For a random variable $X$ in $\mathbb{R}^d$ with probability density function $f$, the Rényi entropy of order $\alpha$ ($\alpha > 0$, $\alpha \neq 1$) is
$$h_\alpha(X) = \frac{1}{1-\alpha} \log \int_{\mathbb{R}^d} f(x)^\alpha \, dx .$$
In the limit $\alpha \to 1$ this recovers the Shannon (differential) entropy
$$h(X) = -\int_{\mathbb{R}^d} f(x) \log f(x) \, dx ,$$
and the entropy power is
$$N(X) = \frac{1}{2\pi e}\, e^{2h(X)/d} .$$
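As a sanity check on these definitions, the Rényi entropy of a one-dimensional Gaussian $N(0, \sigma^2)$ has the closed form $h_\alpha = \tfrac{1}{2}\log(2\pi\sigma^2) + \tfrac{\log\alpha}{2(\alpha-1)}$, which can be compared against direct numerical integration (an illustrative sketch; the function names are ours):

```python
import numpy as np

def renyi_gaussian_numeric(sigma, alpha, lim=40.0, n=400001):
    """h_alpha = 1/(1 - alpha) * log( integral of f^alpha ), via the trapezoid rule."""
    x = np.linspace(-lim, lim, n)
    f = np.exp(-x**2 / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)
    g = f**alpha
    dx = x[1] - x[0]
    integral = dx * (g.sum() - 0.5 * (g[0] + g[-1]))
    return float(np.log(integral) / (1.0 - alpha))

def renyi_gaussian_closed(sigma, alpha):
    """Closed form for N(0, sigma^2): 0.5*log(2*pi*sigma^2) + log(alpha)/(2*(alpha - 1))."""
    return 0.5 * np.log(2.0 * np.pi * sigma**2) + np.log(alpha) / (2.0 * (alpha - 1.0))

for a in (0.5, 0.9, 2.0, 3.0):
    print(a, renyi_gaussian_numeric(1.3, a), renyi_gaussian_closed(1.3, a))
```

The two values agree to high precision for orders on both sides of $\alpha = 1$.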
We derive some properties below by assuming a condition involving the covariance matrix $\Sigma$ and a constant.
Theorem 1 (joint entropy inequality).
Let $X_1, \dots, X_n$ be random variables in $\mathbb{R}^d$, and let $\lambda_1, \dots, \lambda_d$ be the eigenvalues of the covariance matrix $\Sigma$. If the eigenvalue condition holds for all $k$, then the joint entropy is bounded above by the sum of the marginal entropies. This inequality extends the discrete-case inequality
$$H(X_1, \dots, X_n) \le \sum_{k=1}^{n} H(X_k) .$$
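The discrete-case inequality $H(X_1,\dots,X_n) \le \sum_k H(X_k)$ can be checked directly on a small joint distribution (an illustrative check; the joint pmf below is arbitrary):

```python
import numpy as np

def H(p):
    """Shannon entropy (in nats) of a pmf, ignoring zero entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# an arbitrary correlated joint pmf of (X, Y)
pxy = np.array([[0.30, 0.10],
                [0.05, 0.55]])
px = pxy.sum(axis=1)  # marginal of X
py = pxy.sum(axis=0)  # marginal of Y

print(H(pxy), H(px) + H(py))  # H(X,Y) <= H(X) + H(Y)
```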
Theorem 2 (d-dimensional reverse EPI).
Let $X_1, \dots, X_n$ be uncorrelated random variables in $\mathbb{R}^d$ satisfying the stated condition for all $k$, and let $\lambda_1, \dots, \lambda_d$ be the eigenvalues of the covariance matrix $\Sigma$. For $d = 1$ the reverse inequality holds directly; for general $d$ it holds under an additional condition on the eigenvalues. These are reverse EPIs: they bound the entropy power of the sum from above in terms of the entropy powers of the summands.
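For independent Gaussians the forward EPI $N(X+Y) \ge N(X) + N(Y)$ holds with equality, which makes a convenient numerical reference point between the forward and reverse directions (a sketch using the closed-form Gaussian entropy; names are ours):

```python
import math

def gaussian_entropy(var):
    """Differential entropy of N(0, var) in nats: 0.5 * log(2 pi e var)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

def entropy_power(h, d=1):
    """N(X) = exp(2 h / d) / (2 pi e)."""
    return math.exp(2.0 * h / d) / (2.0 * math.pi * math.e)

v1, v2 = 1.5, 2.5
n1 = entropy_power(gaussian_entropy(v1))
n2 = entropy_power(gaussian_entropy(v2))
# the sum of independent Gaussians is N(0, v1 + v2)
n_sum = entropy_power(gaussian_entropy(v1 + v2))
print(n1, n2, n_sum)  # the entropy power of a Gaussian equals its variance
```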
Theorem 3 (Rényi EPI for order p < 1).
Let $X_1, \dots, X_n$ be independent random variables in $\mathbb{R}^d$ satisfying the stated condition for all $k$, where $\Sigma_k$ is the covariance matrix of $X_k$. Then, for $p < 1$, a Rényi entropy power inequality holds, with an explicit constant depending on $p$ and $d$. This inequality extends the Rényi EPI to orders $p < 1$.
Proposition 1.
This inequality holds in both the discrete and the continuous case, and these results follow easily from the definitions above.
Proposition 2 (entropy upper bound).
For a discrete or continuous $d$-dimensional random variable $X$ with covariance matrix $\Sigma$, the entropy is bounded above; in the continuous case,
$$h(X) \le \frac{1}{2} \log\big( (2\pi e)^d \det \Sigma \big) .$$
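Assuming the bound in Proposition 2 is the Gaussian maximum-entropy bound $h(X) \le \tfrac{1}{2}\log\big((2\pi e)^d \det\Sigma\big)$, it can be spot-checked on the uniform distribution on $[0,1]$, whose differential entropy is $0$ and whose variance is $1/12$:

```python
import math

# uniform on [0, 1]: h(X) = 0, Var(X) = 1/12
h_uniform = 0.0
var_uniform = 1.0 / 12.0
bound = 0.5 * math.log(2.0 * math.pi * math.e * var_uniform)
print(h_uniform, bound)  # h(X) <= Gaussian max-entropy bound
```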
Lemma 1.
For a continuous random variable, the corresponding inequality holds under the stated condition.
Proof of Theorem 1.
Using Proposition 2 and Lemma 1, we obtain eq. (1). By the assumption, we obtain eq. (2). Combining Proposition 1 with eqs. (1) and (2), and then applying Jensen's inequality, we derive the claimed inequality.
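The Jensen step in the proof above uses concavity of the logarithm, $\mathbb{E}[\log Z] \le \log \mathbb{E}[Z]$; a quick numerical check with arbitrary weights and values:

```python
import math

weights = [0.2, 0.3, 0.5]   # a probability vector
values = [1.0, 4.0, 9.0]    # arbitrary positive values

mean_log = sum(w * math.log(v) for w, v in zip(weights, values))
log_mean = math.log(sum(w * v for w, v in zip(weights, values)))
print(mean_log, log_mean)  # E[log Z] <= log E[Z] since log is concave
```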
Proof of Theorem 2.
For the sum $X_1 + \dots + X_n$, we apply Proposition 2. Since the variables are uncorrelated, the covariance matrix of the sum is the sum of the covariance matrices, $\Sigma_{X_1 + \dots + X_n} = \sum_k \Sigma_k$, and we derive the corresponding entropy bound. Applying the assumption, and then combining Proposition 1 with the assumption, we obtain the result.
Proof of Theorem 3.
Lemma 2.
For $p < 1$, an upper bound on the Rényi entropy holds; we can derive this inequality from Proposition 1.3 in [4]. By Proposition 1.3 in [4], generalized Gaussian densities maximise the Rényi entropy. Using this maximizer together with the definition of the beta function, we derive the bound, with a constant expressed in terms of the beta function. For any random variable with covariance matrix $\Sigma$, the corresponding bound holds.
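The beta-function step can be illustrated by the one-dimensional identity $\int_{-\infty}^{\infty} (1+x^2)^{-m}\,dx = B\!\big(\tfrac{1}{2}, m-\tfrac{1}{2}\big)$, the kind of normalization that appears for the heavy-tailed Rényi maximizers (an illustrative check for $m = 2$; the maximizer's exact parameters are not reproduced here):

```python
import math
import numpy as np

def beta(a, b):
    """Beta function via the Gamma identity B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# trapezoidal integration of (1 + x^2)^(-m) over a wide symmetric range
m = 2.0
x = np.linspace(-2000.0, 2000.0, 2_000_001)
g = (1.0 + x**2) ** (-m)
dx = x[1] - x[0]
integral = float(dx * (g.sum() - 0.5 * (g[0] + g[-1])))
print(integral, beta(0.5, m - 0.5))  # both close to pi/2
```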
Lemma 3.
Let $Y$ be the sum of the independent random variables $X_1, \dots, X_n$. This lemma is shown as Theorem 2.14 in [1]. Combining the assumption with Lemma 2 gives the first bound; combining Lemma 3 with Proposition 1 completes the proof.
References.
[1] A. Marsiglietti, V. Kostina, P. Xu. "A lower bound on the differential entropy of log-concave random vectors with applications".
[2] E. Ram, I. Sason. "On Rényi Entropy Power Inequalities".
[3] A. Marsiglietti, V. Kostina. "A lower bound on the differential entropy of log-concave random vectors with applications".
[4] O. Johnson, C. Vignat. "Some results concerning maximum Rényi entropy distributions".