From Mézard & Montanari$^\color{red}{\star}$
Consider a gambler who bets on a sequence of Bernoulli random variables $X_t \in \left\{ 0, 1 \right\}, t \in \left\{0, 1, 2, \dots \right\}$ with mean $\mathbb{E}X_t = p$. Imagine he knows the distribution of the $X_t$'s and, at time $t$, he bets a fraction $w(1) = p$ of his money on $1$ and a fraction $w(0) = (1-p)$ on $0$. He loses whatever is put on the wrong number, while he doubles whatever has been put on the right one. Define the average doubling rate of his wealth at time $t$ as $$W_t=\frac{1}{t}\log_2\left\{ \prod_{t'=1}^{t} 2w(X_{t'}) \right\}$$ It is easy to prove that the expected doubling rate $\mathbb{E}W_t$ is related to the entropy of $X_t$: $\mathbb{E}W_t = 1 - \mathcal{H}(p)$. In other words, it is easier to make money out of predictable events.
$\color{red}{\star}$ Mézard, Marc; Montanari, Andrea, Information, physics and computation, Oxford Graduate Texts. Oxford: Oxford University Press (ISBN 978-0-19-857083-7/hbk). xiii, 569 p. (2009). ZBL1163.94001.
The definition of 'average doubling rate' seems indecipherable to me. Is there an intuitive way to interpret this quantity?
If the gambler starts with $1$ unit of money, then $\prod_{t'=1}^t 2w(X_{t'})$ is the amount of money he has after $t$ turns.
The base-2 logarithm of this quantity can be interpreted as the number of times his money has doubled, relative to the beginning. E.g. if he ends up with $16$ units of money after turn $t$, that means his money doubled $4$ times relative to the start. Dividing by $t$ gives a notion of "how much doubling per unit of time on average." Finally, taking the expectation applies an average with respect to the randomness of the outcomes.
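As a quick sanity check of this interpretation, one can simulate the betting scheme and compare the empirical average doubling rate with $1 - \mathcal{H}(p)$. A minimal sketch (the helper names are just illustrative, not from the book):

```python
import math
import random

def doubling_rate(p: float, t: int, seed: int = 0) -> float:
    """Simulate t rounds of proportional betting; return W_t = (1/t) log2(final wealth)."""
    rng = random.Random(seed)
    log_wealth = 0.0  # log2 of current wealth, starting from 1 unit
    for _ in range(t):
        x = 1 if rng.random() < p else 0      # outcome of this round
        w = p if x == 1 else 1 - p            # fraction bet on the realized outcome
        log_wealth += math.log2(2 * w)        # wealth is multiplied by 2*w(x)
    return log_wealth / t

def binary_entropy(p: float) -> float:
    """H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.9
print(doubling_rate(p, t=100_000))  # empirical W_t
print(1 - binary_entropy(p))        # predicted expected doubling rate
```

By the law of large numbers, $W_t \to \mathbb{E}\log_2\left(2w(X)\right) = 1 + p\log_2 p + (1-p)\log_2(1-p) = 1 - \mathcal{H}(p)$ as $t \to \infty$, so for large $t$ the two printed numbers should be close; the more predictable the coin (the further $p$ is from $1/2$), the faster the wealth doubles.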