The conjecture about information entropy
The proof may be easy in the discrete case.
Consider two groups of random variables, $X = (x_1, \dots, x_n)$ and $Y = (y_1, \dots, y_m)$.
We assume the variables $Y$ can be represented by the variables $X$ as
$$y_i = f_i(x_1, \dots, x_n), \qquad i = 1, \dots, m.$$
Furthermore, the variables $X$ can be represented by the variables $Y$ as
$$x_j = g_j(y_1, \dots, y_m), \qquad j = 1, \dots, n.$$
Then, the amount of information of $X$ equals the amount of information of $Y$:
$$H(x_1, \dots, x_n) = H(y_1, \dots, y_m).$$
Here, $H$ means the joint entropy.
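In the discrete case this can be checked numerically: an invertible map only relabels the outcomes without changing their probabilities, so the joint entropy is unchanged. A minimal sketch (the particular variables and the map $y_1 = x_1 + x_2$, $y_2 = x_1 - x_2$ are illustrative assumptions, not from the original):

```python
import math
import random
from collections import Counter

def joint_entropy(samples):
    """Empirical joint entropy (in nats) of a list of outcome tuples."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

random.seed(0)
# Two dependent discrete variables: x2 tends to follow x1.
xs = [(x1, (x1 + random.randint(0, 1)) % 4)
      for x1 in (random.randint(0, 3) for _ in range(20000))]

# An invertible map Y = f(X): y1 = x1 + x2, y2 = x1 - x2.
# It is injective, so X is recovered as x1 = (y1 + y2) / 2, x2 = (y1 - y2) / 2.
ys = [(x1 + x2, x1 - x2) for (x1, x2) in xs]

h_x, h_y = joint_entropy(xs), joint_entropy(ys)
print(h_x, h_y)
# A bijection preserves the multiset of probabilities, hence the entropy.
assert abs(h_x - h_y) < 1e-9
```

Since the map is injective on the observed outcomes, the two empirical distributions have identical probability multisets and the entropies agree to floating-point precision.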
In the continuous case, the left and right sides may differ only by a Jacobian term.
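The Jacobian statement can be made precise with the standard change-of-variables formula for differential entropy (a well-known identity, stated here as a sketch): if $Y = f(X)$ for a diffeomorphism $f$ with Jacobian matrix $J_f$, then
$$h(Y) = h(X) + \mathbb{E}\bigl[\log \lvert \det J_f(X) \rvert \bigr],$$
so the two differential entropies coincide exactly when the expected log-Jacobian vanishes.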
We can calculate the prime number density using this conjecture.
We consider the set of natural numbers less than $N$, and regard them as uniformly distributed.
The probability of choosing any particular natural number is
$$\frac{1}{N}.$$
The entropy of the natural numbers less than $N$ is
$$H = \log N.$$
Here, $\log$ means the natural logarithm.
The natural numbers are uniquely represented by prime numbers (the exponents in the prime factorization), and those exponents are in turn determined by the natural number, too.
So we apply the entropy conjecture.
If we know the prime factorization, we get the same information as knowing the natural number itself.
The probability that a natural number is a multiple of a prime $p$ is $1/p$, and these divisibility events are almost independent for different primes.
The joint entropy becomes the sum of the entropies of the exponents of the prime numbers,
$$H \approx \sum_{p \le N} H_p.$$
The exponent of $p$ is nearly geometric with parameter $1/p$, so its entropy is $H_p \approx (\log p)/p$ to leading order.
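This sum can be checked numerically: for uniform $n \le N$, the exact distribution of the exponent of $p$ is $P(a_p = k) = (\lfloor N/p^k \rfloor - \lfloor N/p^{k+1} \rfloor)/N$, and the marginal entropies can be summed over all primes. A sketch (the bound $N = 10^5$ and the tolerance are arbitrary choices):

```python
import math

def primes_upto(n):
    """Simple sieve of Eratosthenes returning all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return [i for i in range(2, n + 1) if sieve[i]]

def exponent_entropy(p, N):
    """Entropy (nats) of the exponent of p in a uniform n in {1, ..., N}."""
    h, pk = 0.0, 1
    while pk <= N:
        count = N // pk - N // (pk * p)  # numbers whose exponent of p is exactly k
        if count:
            q = count / N
            h -= q * math.log(q)
        pk *= p
    return h

N = 100000
h_sum = sum(exponent_entropy(p, N) for p in primes_upto(N))
print(h_sum, math.log(N))
# The exponents determine n, so their joint entropy is exactly log N;
# the sum of marginal entropies exceeds it slightly because the
# exponents are only approximately independent.
assert h_sum >= math.log(N) - 1e-9
assert h_sum < 1.5 * math.log(N)
```

The small excess over $\log N$ quantifies how far the divisibility events are from exact independence.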
We apply the conjecture,
$$\log N \approx \sum_{p \le N} \frac{\log p}{p}.$$
We approximate the right side using the prime number density function $\rho(x)$ as
$$\log N \approx \int_2^N \rho(x) \, \frac{\log x}{x} \, dx.$$
We differentiate this equation with respect to $N$; the result is
$$\frac{1}{N} \approx \rho(N) \, \frac{\log N}{N}.$$
So,
$$\rho(N) \approx \frac{1}{\log N},$$
which agrees with the prime number theorem.
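The resulting density can be compared against an actual prime count: integrating $1/\log x$ up to $N$ should approximate the number of primes below $N$. A quick check (the cutoff $10^6$ is an arbitrary choice):

```python
import math

def count_primes(n):
    """Count primes <= n with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return sum(sieve)

N = 10 ** 6
pi_N = count_primes(N)  # actual count of primes below 10^6 is 78498
li_N = sum(1.0 / math.log(k) for k in range(2, N + 1))  # discretized integral of 1/log x
print(pi_N, li_N)
assert abs(li_N - pi_N) / pi_N < 0.01  # within 1% at this scale
```

At $N = 10^6$ the integrated density and the true count agree to well under one percent, consistent with the heuristic above.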
We can calculate the prime number density of the Gaussian integers in the same way.