Hobby Research

I publish my recreational mathematics here. If this is your first visit, please see the about page.

The conjecture about information entropy

This post states a conjecture about information entropy and applies it to estimate the prime number density.

The proof may be easy in the discrete case.

Consider two groups of random variables, \{X_k\} and \{Y_l\}.

 

We assume the variables X_k can represent the variables Y_l as

Y_l=f_l(X_1,X_2,\cdots).

Furthermore, the variables Y_l can represent the variables X_k as

X_k=g_k(Y_1,Y_2,\cdots).

Then the amount of information of \{X_k\} equals the amount of information of \{Y_l\}:

H(X_1,X_2,\cdots)=H(Y_1,Y_2,\cdots).

Here H denotes the joint entropy.

In the continuous case, the two sides may differ only by a Jacobian term.
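As a quick numerical illustration of the discrete case (not a proof), here is a Python sketch with a made-up pair of variables and an invertible map between the two groups; the empirical joint entropies coincide.

import math
import random
from collections import Counter

random.seed(0)

def joint_entropy(tuples):
    # Empirical joint entropy (natural logarithm) of a list of outcome tuples.
    counts = Counter(tuples)
    n = len(tuples)
    return -sum(c / n * math.log(c / n) for c in counts.values())

# A made-up joint distribution for (X_1, X_2).
samples = [(random.choice([1, 1, 2, 3]), random.choice([1, 2, 2, 3])) for _ in range(100000)]

# An invertible map: Y_1 = X_1 + X_2, Y_2 = X_1, so that X_1 = Y_2 and X_2 = Y_1 - Y_2.
transformed = [(x1 + x2, x1) for (x1, x2) in samples]

print(joint_entropy(samples))      # H(X_1, X_2)
print(joint_entropy(transformed))  # H(Y_1, Y_2), exactly the same number here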

 

We can calculate the prime number density using this conjecture.

We consider the set of natural numbers less than N.

We regard the natural numbers as uniformly distributed.

The probability of choosing any particular natural number is

\frac{1}{N}.

The entropy of the natural numbers less than N is

H_N=\log(N).

Here \log denotes the natural logarithm.
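As a trivial check of this formula in Python (N is chosen arbitrarily):

import math

N = 1000
p = 1.0 / N
# Entropy of the uniform distribution on N outcomes: minus the sum of p*log(p) over all outcomes.
H_N = -sum(p * math.log(p) for _ in range(N))
print(H_N, math.log(N))  # both print log(1000) up to floating-point error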

Every natural number is uniquely represented by its prime factors, and conversely the prime factors represent the natural number.

So we apply the entropy conjecture.

If we know the primes composing a natural number (its prime factorization), we recover the full information of that natural number.

The probability that a natural number is a multiple of p_k is \frac{1}{p_k}, and these divisibility events are almost independent for different primes.
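This can be checked empirically; the following Python sketch (the cutoff N and the prime pairs are chosen arbitrarily) compares the fraction of multiples of p with 1/p, and the fraction of multiples of pq with the product of the two fractions.

N = 10**6

def fraction_of_multiples(d):
    # Fraction of n in {1, ..., N-1} that are multiples of d.
    return ((N - 1) // d) / (N - 1)

for p, q in [(2, 3), (3, 5), (7, 11)]:
    print(p, q,
          fraction_of_multiples(p), 1 / p,                      # close to 1/p
          fraction_of_multiples(p * q),                         # joint probability
          fraction_of_multiples(p) * fraction_of_multiples(q))  # product, close to the joint value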

The entropy of each such divisibility variable is approximately -\frac{1}{p_k}\log\frac{1}{p_k}=\frac{1}{p_k}\log(p_k) (neglecting the smaller contribution of the complementary event), so by near-independence the joint entropy becomes the sum over the primes,

H(p_1,p_2,\cdots)=\sum_k \frac{1}{p_k}\log(p_k).

We apply the conjecture,

\log(N)=\sum_k \frac{1}{p_k}\log(p_k).
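Numerically, the two sides indeed grow together; here is a Python sketch with a simple sieve (the cutoffs are arbitrary). The sum stays a roughly constant amount below \log(N), consistent with the approximate nature of the relation.

import math

def primes_up_to(n):
    # Sieve of Eratosthenes.
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

for N in [10**3, 10**4, 10**5, 10**6]:
    s = sum(math.log(p) / p for p in primes_up_to(N))
    print(N, round(math.log(N), 3), round(s, 3))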

We approximate the right-hand side using the prime number density function p_\pi(x) as

\log(N)\sim \int_2^N \frac{p_\pi(x)}{x}\log(x)\,dx.

Differentiating this equation with respect to N, we obtain

\frac{1}{N}\sim\frac{p_\pi(N)}{N}\log(N).

So, 

p_\pi(x)\sim \frac{1}{\log(x)}
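As a rough numerical comparison (window positions and sizes are arbitrary), the fraction of primes in a short window around x is close to 1/\log(x):

import math

def is_prime(n):
    # Trial division, enough for this small check.
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

width = 10**4
for x in [10**4, 10**5, 10**6]:
    count = sum(1 for n in range(x, x + width) if is_prime(n))
    print(x, round(count / width, 5), round(1 / math.log(x), 5))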

 

We can calculate the prime number density of the Gaussian integers in the same way.