Entropy of a natural number


Let $H(n) = -\sum_{d|n} \frac{d}{\sigma(n)} \log(\frac{d}{\sigma(n)}) = \log(\sigma(n))-\frac{1}{\sigma(n)}\sum_{d|n} d\log(d)$ define an entropy on the natural numbers. I can prove that $H$ is additive: $\gcd(n,m) = 1 \implies H(mn) = H(n)+H(m)$.
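Additivity is easy to check numerically straight from the definition (a minimal sketch; `H` and `divisors` are ad-hoc helper names, not from the question):

```python
from math import log, gcd

def divisors(n):
    """All positive divisors of n (naive trial division)."""
    return [d for d in range(1, n + 1) if n % d == 0]

def H(n):
    """Entropy of the distribution d/sigma(n) over the divisors d of n."""
    sigma = sum(divisors(n))
    return -sum((d / sigma) * log(d / sigma) for d in divisors(n))

# Additivity on coprime arguments: H(mn) = H(m) + H(n)
m, n = 8, 15
assert gcd(m, n) == 1
print(abs(H(m * n) - (H(m) + H(n))) < 1e-12)  # True
```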

Numerical experiments suggest that $\lim_{\alpha \rightarrow \infty} H(p^\alpha)$ always exists, where $p$ is a prime. However, I am not able to prove it, so maybe someone has an idea how.

Edit:

We have $H(p^\alpha) = \log(1+p+p^2+\cdots+p^\alpha)-\frac{p\log(p)(1+2p+3p^2+\cdots +\alpha p^{\alpha-1})}{1+p+p^2+\cdots+p^\alpha}$
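The apparent convergence can be observed by tabulating $H(p^\alpha)$ for growing $\alpha$ (a sketch; `H_prime_power` is a hypothetical helper name):

```python
from math import log

def H_prime_power(p, a):
    """H(p^a) via the divisor formula: the divisors of p^a are p^0, ..., p^a."""
    sigma = sum(p**k for k in range(a + 1))
    return log(sigma) - sum(k * log(p) * p**k for k in range(a + 1)) / sigma

# The values stabilize quickly as alpha grows.
for a in (5, 10, 20, 40):
    print(a, H_prime_power(2, a))
```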


There are 3 answers below.

Best answer

Recall the relation $1 + p + \cdots + p^{\alpha} = \frac{p^{\alpha+1}-1}{p-1}$. Regarding $p$ formally as a variable independent of $\alpha$ and differentiating the logarithm of both sides with respect to $p$, we have

$$ \frac{1 + 2p + \cdots + \alpha p^{\alpha-1}}{1 + p + \cdots + p^{\alpha}} = \frac{(\alpha+1)p^{\alpha}}{p^{\alpha+1}-1} - \frac{1}{p-1}. $$
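This identity is just logarithmic differentiation of the geometric sum, and it can be sanity-checked numerically (a sketch; `lhs` and `rhs` are hypothetical helper names):

```python
from math import isclose

def lhs(p, a):
    """(1 + 2p + ... + a p^{a-1}) / (1 + p + ... + p^a)."""
    return sum(k * p**(k - 1) for k in range(1, a + 1)) / sum(p**k for k in range(a + 1))

def rhs(p, a):
    """(a+1) p^a / (p^{a+1} - 1) - 1/(p - 1)."""
    return (a + 1) * p**a / (p**(a + 1) - 1) - 1 / (p - 1)

print(all(isclose(lhs(p, a), rhs(p, a)) for p in (2, 3, 5) for a in (1, 4, 9)))  # True
```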

So your expression for $H(p^{\alpha})$ can be simplified as

$$ H(p^{\alpha}) = \color{green}{\log\left(\frac{p^{\alpha+1}-1}{p-1}\right)} - \color{blue}{(p \log p) \left( \frac{(\alpha+1)p^{\alpha}}{p^{\alpha+1}-1} - \frac{1}{p-1} \right)}. $$

Since the green term and the blue term each diverge separately, a bit of computation is needed to cancel the divergent parts. To this end, we expand both colored terms as follows:

\begin{align*} \color{green}{\log\left(\frac{p^{\alpha+1}-1}{p-1}\right)} &= \color{green}{(\alpha+1) \log p + \log\left(\frac{1-p^{-(\alpha+1)}}{p-1}\right),} \\ \color{blue}{-(p \log p) \left( \frac{(\alpha+1)p^{\alpha}}{p^{\alpha+1}-1} - \frac{1}{p-1} \right)} &= \color{blue}{-\frac{(\alpha+1)p^{\alpha+1}\log p}{p^{\alpha+1}-1} + \frac{p \log p}{p-1}.} \end{align*}

Plugging this back and grouping terms having factor $(\alpha+1)\log p$, we obtain

\begin{align*} H(p^{\alpha}) = \color{red}{- \frac{(\alpha+1)\log p}{p^{\alpha+1}-1}} + \color{green}{\log\left(\frac{1-p^{-(\alpha+1)}}{p-1}\right)} + \color{blue}{\frac{p \log p}{p-1}}, \end{align*}

where the red term is the result of grouping. Finally, taking the limit as $\alpha \to \infty$, we get

$$ \lim_{\alpha \to \infty} H(p^{\alpha}) = \frac{p\log p}{p-1} - \log(p-1). $$
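As a sanity check, this closed-form limit agrees with a direct computation of $H(p^\alpha)$ for large $\alpha$ (a sketch; `H_pp` and `H_limit` are hypothetical helper names):

```python
from math import log

def H_pp(p, a):
    """H(p^a) computed directly from the divisor distribution."""
    sigma = sum(p**k for k in range(a + 1))
    return log(sigma) - log(p) * sum(k * p**k for k in range(a + 1)) / sigma

def H_limit(p):
    """The claimed limit: p log(p)/(p-1) - log(p-1)."""
    return p * log(p) / (p - 1) - log(p - 1)

for p in (2, 3, 7):
    print(p, abs(H_pp(p, 100) - H_limit(p)) < 1e-12)
```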

Answer

By well-known summation tricks,

$$H(p^{\alpha-1})=\log\frac{p^\alpha-1}{p-1}-p\log p\frac d{dp}\left(\log\frac{p^\alpha-1}{p-1}\right).$$ (For convenience, we use $\alpha-1$ instead of $\alpha$.)

Then $$H(p^{\alpha-1})=\log(p^\alpha-1)-\log(p-1)-p\log p\left(\frac{\alpha p^{\alpha-1}}{p^\alpha-1}-\frac1{p-1}\right)\\ =\alpha \log p+\log\left(1-p^{-\alpha}\right)-\log(p-1)-p\log p\left(\frac\alpha p\frac1{1-p^{-\alpha}}-\frac1{p-1}\right).$$
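One can check this closed form against the direct divisor computation; note that the derivative brings down $\alpha p^{\alpha-1}$, not $\alpha p^{\alpha}$ (a sketch; `H_direct` and `H_closed` are hypothetical helper names):

```python
from math import log, isclose

def H_direct(p, a):
    """H(p^a) straight from the divisor distribution."""
    sigma = sum(p**k for k in range(a + 1))
    return log(sigma) - log(p) * sum(k * p**k for k in range(a + 1)) / sigma

def H_closed(p, a):
    """log(p^a - 1) - log(p-1) - p log(p) (a p^{a-1}/(p^a - 1) - 1/(p-1)),
    i.e. H(p^{a-1}) in this answer's indexing."""
    return (log(p**a - 1) - log(p - 1)
            - p * log(p) * (a * p**(a - 1) / (p**a - 1) - 1 / (p - 1)))

print(all(isclose(H_direct(p, a - 1), H_closed(p, a)) for p in (2, 3, 5) for a in (2, 5, 9)))  # True
```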

For large $\alpha$, using Taylor expansions, this can be written

$$\alpha\log p-p^{-\alpha}+\cdots-\log(p-1)-p\log p\left(\frac\alpha p\left(1+p^{-\alpha}+\cdots\right)-\frac1{p-1}\right).$$

After simplification, this tends to $$-\log(p-1)+\frac{p\log p}{p-1}.$$

Answer

I just want to point out that not only does $H(p^\alpha)$ converge, but the probability distribution implicit in your definition of the entropy converges to a limiting distribution, provided we index each divisor $d=p^k$ by the exponent $\alpha-k$ of its cofactor $n/d$ (i.e. sort the probabilities in descending order).

Then it is easy to see that the limiting distribution is $$P(X=n) = (1 - p^{-1}) p^{-n}, \quad n \ge 0,$$ which is a geometric distribution with parameter $1 - 1/p$.

By continuity of entropy, the entropies converge to the entropy of this distribution. This yields a (very slightly) cleaner expression, since we no longer need to carry the finitary remainder terms, but we still need the infinite series $\sum_n nx^{n-1} = (1-x)^{-2}$. Or one could just look up the known entropy of a geometric distribution on Wikipedia...
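Indeed, the standard entropy formula for a geometric distribution $P(X=n)=(1-q)q^n$ on $n\ge 0$, evaluated at $q=1/p$, reproduces the limit found in the other answers (a sketch; `geom_entropy` and `H_limit` are hypothetical helper names):

```python
from math import log

def geom_entropy(q):
    """Entropy of P(X=n) = (1-q) q^n, n >= 0: -log(1-q) - q*log(q)/(1-q),
    using sum_n n q^{n-1} = (1-q)^{-2}."""
    return -log(1 - q) - q * log(q) / (1 - q)

def H_limit(p):
    """The limit from the other answers: p log(p)/(p-1) - log(p-1)."""
    return p * log(p) / (p - 1) - log(p - 1)

print(all(abs(geom_entropy(1 / p) - H_limit(p)) < 1e-12 for p in (2, 3, 5, 11)))  # True
```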