Suppose we're trying to count to $N$ using only $\log_2(\log_2(N))$ bits. One way we could try to accomplish this is by keeping a counter $k$ and incrementing it at each step with probability $2^{-k}$. This gives rise to a sequence of random variables $X_0,\ldots,X_N$ where $X_i$ is the value of our counter after $i$ steps. Concretely: $$ X_0 := 0 \qquad\text{and}\qquad X_i := X_{i-1} + I_i $$ where $I_i \sim \text{Bern}\bigl(2^{-X_{i-1}}\bigr)$.
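This process is easy to simulate directly. Here is a minimal Python sketch (the name `morris_counter` is my own label for it, not part of the problem statement):

```python
import random

def morris_counter(n_steps, seed=None):
    """Simulate X_{n_steps}: start the counter at 0 and, at each step,
    increment it with probability 2^(-current value)."""
    rng = random.Random(seed)
    x = 0
    for _ in range(n_steps):
        if rng.random() < 2.0 ** (-x):  # I_i ~ Bern(2^{-X_{i-1}})
            x += 1
    return x
```

For $n$ around $10^6$ the returned value is typically close to $\log_2(10^6) \approx 20$, so storing the counter takes only on the order of $\log_2\log_2 N$ bits.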
What I'd like to show is that for every fixed $\epsilon > 0$, with high probability (i.e. with probability $1 - o(1)$ as $N \to \infty$) we have $$ 2^{(1-\epsilon)X_N} < N < 2^{(1+\epsilon)X_N}. $$
All I have so far is that (using the law of iterated expectation and induction)
$$
\mathbb{E}\bigl[2^{X_i}\bigr] = i + 1 \qquad\text{and}\qquad \text{Var}\bigl[2^{X_i}\bigr] = \frac{i(i-1)}{2}
$$
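These two identities can be checked exactly for small $i$ by propagating the full distribution of $X_i$ in exact rational arithmetic. A quick verification sketch (not part of the proof; the helper name `counter_pmf` is mine):

```python
from fractions import Fraction

def counter_pmf(n):
    """Exact pmf of X_n: returns {k: P[X_n = k]}."""
    dist = {0: Fraction(1)}
    for _ in range(n):
        nxt = {}
        for k, p in dist.items():
            q = Fraction(1, 2 ** k)  # increment probability 2^{-k}
            nxt[k + 1] = nxt.get(k + 1, Fraction(0)) + p * q
            nxt[k] = nxt.get(k, Fraction(0)) + p * (1 - q)
        dist = nxt
    return dist

n = 30
dist = counter_pmf(n)
mean = sum(p * 2 ** k for k, p in dist.items())            # E[2^{X_n}]
var = sum(p * 4 ** k for k, p in dist.items()) - mean ** 2  # Var[2^{X_n}]
# mean equals n + 1 and var equals n(n-1)/2, exactly
```

Because everything is a `Fraction`, the comparison against $i+1$ and $i(i-1)/2$ is exact rather than approximate.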
as well as some numerical evidence: for $\epsilon = 0.1$, a simulation plot [omitted] suggests that $\mathbb{P}\bigl[2^{(1-\epsilon)X_N} < N\bigr]$ tends to $1$, while a second plot [omitted] suggests that $\mathbb{P}\bigl[2^{(1+\epsilon)X_N} < N\bigr]$ tends to $0$, i.e. that the complementary event $N < 2^{(1+\epsilon)X_N}$ also holds with high probability.
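This numerical evidence can be reproduced without the plots. To make large $N$ feasible, the sketch below jumps from one increment to the next using geometric waiting times — at level $k$ the time until the next increment is $\mathrm{Geom}(2^{-k})$ — which has the same distribution as stepping one step at a time. The parameter choices ($N = 10^6$, $5000$ trials) are mine:

```python
import math
import random

def morris_value(n_steps, rng):
    """Counter value after n_steps; at level k, the waiting time to the
    next increment is Geometric(2^{-k}), sampled by inverse transform."""
    k, t = 0, 0
    while True:
        p = 2.0 ** (-k)
        if p == 1.0:
            wait = 1
        else:
            wait = max(1, math.ceil(math.log1p(-rng.random()) / math.log1p(-p)))
        if t + wait > n_steps:
            return k
        t += wait
        k += 1

rng = random.Random(0)
N, eps, trials = 10 ** 6, 0.1, 5000
lo = hi = 0
for _ in range(trials):
    x = morris_value(N, rng)
    lo += 2.0 ** ((1 - eps) * x) < N  # lower-bound event
    hi += N < 2.0 ** ((1 + eps) * x)  # upper-bound event
# lo / trials and hi / trials estimate the two probabilities in question
print(lo / trials, hi / trials)
```

At this $N$ both empirical frequencies come out well above $1/2$, consistent with the claim; the upper-bound event is the slower of the two to approach $1$.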