This actually came about from a different question: "A random natural number $0 \leq n \leq 999$ is chosen and written down in base 10 by person A (including leading zeros, if $n<100$). Person B then writes down a three-digit number $m$ (also including any necessary leading zeroes) and sees how many, if any, of the digits match $n$. What is the associated entropy of the set of matches?"
We can calculate these probabilities fairly easily. There is a .001 chance that $m=n$. There is a .009 chance that the first two digits match but the third does not, a .009 chance that the last two match but the first does not, and a .009 chance that the first and last match but not the middle; in total, there is a .027 chance that exactly two digits match. A similar argument gives a .243 probability that exactly one digit matches and a .729 probability that no digits match. The entropy function in its general form (when measured in bits) is $H=\sum_{i=1}^n p_i\log_2\left(\frac{1}{p_i}\right)$, where $p_i$ is the probability that the $i$-th outcome occurs. Thus, our entropy is
- $\frac{1}{1000}\log_2{(1000)}+\frac{27}{1000}\log_2{\left(\frac{1000}{9}\right)}+\frac{243}{1000}\log_2{\left(\frac{1000}{81}\right)}+\frac{729}{1000}\log_2{\left(\frac{1000}{729}\right)}\approx 1.407$
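As a sanity check, this sum can be evaluated numerically; a quick sketch in Python (the variable names here are my own):

```python
from math import comb, log2

# Probability of one particular pattern with k matching digits:
# (1/10)^k * (9/10)^(3-k); there are C(3, k) such patterns.
b = 3
H = sum(comb(b, k) * 0.1**k * 0.9**(b - k)
        * log2(1 / (0.1**k * 0.9**(b - k)))
        for k in range(b + 1))
print(H)  # ≈ 1.407
```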
One quickly sees that this can be generalized to $b$-digit numbers (i.e., $0 \leq n \leq 10^b-1$) with the following formula:
$$H(X_b)=\sum_{n=0}^b {b \choose n}\left(\frac{9^n}{10^b}\right)\log_2{\left(\frac{10^b}{9^n}\right)},$$
where $n$ counts the non-matching digits.
Here's where things get interesting: I was looking at this through Desmos, and I found the following relation.
$\frac{\log_2{(10^b)}}{H(X_b)} = 7.083068882...$
This seems to hold for all natural $b$; there is some small variance from this value when $b\in\mathbb{R}^+\setminus\mathbb{N}$, but I am more astonished that this constant appears out of the blue. I can't find anything on the OEIS suggesting that this is a known constant. Where did it come from, and what is its exact form?
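For reference, the ratio can be checked numerically for several values of $b$ (a quick sketch; the function name is my own):

```python
from math import comb, log2

def H(b):
    """Entropy (bits) of the match pattern for a b-digit number."""
    return sum(comb(b, n) * (9**n / 10**b) * log2(10**b / 9**n)
               for n in range(b + 1))

for b in (1, 2, 3, 10):
    print(b, log2(10**b) / H(b))  # same value each time, ≈ 7.0830688...
```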
More generally, if you have a coin with probability of heads $p$ and you flip it $b$ times, the entropy of the locations of the heads is $$-\sum_n \binom bn p^n(1-p)^{b-n}\log\left(p^n(1-p)^{b-n}\right)\\=-\log(p)\sum_n n\binom bnp^n(1-p)^{b-n}\\-\log(1-p)\sum_n(b-n)\binom bn p^n(1-p)^{b-n}$$ Then $$n\binom bn=b\binom{b-1}{n-1},\qquad(b-n)\binom bn=b\binom{b-1}{n}.$$ So, by the binomial theorem, you’d get an entropy of $$-bp(p+(1-p))^{b-1}\log p-b(1-p)(p+(1-p))^{b-1}\log(1-p)\\=-b (p\log p+(1-p)\log (1-p)),$$ or $b$ times the entropy of a single toss of the coin.
In your case, $p=\frac1{10},$ so $$H(X_b)=b\left(\frac1{10}\log(10)+\frac{9}{10}\log\left(\frac{10}9\right)\right)\\=b\left(\log(10)-\frac{9}{10}\log9\right)$$
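This identity is easy to confirm numerically for arbitrary $p$ and $b$ (a throwaway check; both function names are my own, and base-2 logs are used, though any base works):

```python
from math import comb, log2

def pattern_entropy(b, p):
    """Direct sum over head-counts n: entropy (bits) of the full
    heads/tails pattern of b independent tosses."""
    return -sum(comb(b, n) * p**n * (1 - p)**(b - n)
                * log2(p**n * (1 - p)**(b - n))
                for n in range(b + 1))

def closed_form(b, p):
    """b times the entropy of a single toss."""
    return -b * (p * log2(p) + (1 - p) * log2(1 - p))

for b, p in [(3, 0.1), (5, 0.3), (8, 0.5)]:
    print(b, p, pattern_entropy(b, p), closed_form(b, p))  # pairs agree
```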
Then $$\frac{\log(10^b)}{H(X_b)}=\frac{\log(10)}{\log 10-\frac9{10}\log 9}=\frac{1}{1-\frac{9}{10}\log_{10}9}$$
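Plugging in numbers confirms that this closed form reproduces the decimal you observed:

```python
from math import log10

# The constant in closed form: 1 / (1 - (9/10) * log10(9)).
c = 1 / (1 - 0.9 * log10(9))
print(c)  # ≈ 7.0830688...
```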
This is a special case of the additivity of entropy: if you have two independent random variables with distributions $(p_i)$ and $(q_j)$, then the entropy of the joint outcome is the sum of the entropies of the individual variables.
$$\begin{align} -\sum_{ij} p_iq_j\log (p_iq_j)&=-\sum_{ij}q_{j}p_i\log p_i-\sum_{ij}p_iq_j\log q_j\\&=-\left(\sum_j q_j\right)\sum_i p_i\log p_i-\left(\sum_i p_i\right)\sum_j q_j\log q_j\\&=-\sum_i p_i\log p_i-\sum_j q_j\log q_j.\end{align}$$
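A quick numerical illustration with two arbitrary distributions (the distributions here are made up for the example):

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(q * log2(q) for q in dist if q > 0)

p = [0.5, 0.25, 0.25]   # example distributions, chosen arbitrarily
q = [0.1, 0.9]
joint = [a * b for a, b in product(p, q)]
print(entropy(joint), entropy(p) + entropy(q))  # equal
```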