In answering Expected value of the number of bills, I came across a phenomenon the likes of which I don't think I've encountered before, and I'd like to know more about it.
You draw coins, each coin independently being a $1$€ coin or a $2$€ coin with equal probability. Obviously you'd expect to draw as many $2$€ coins as $1$€ coins. In particular, the expectation of $A-B$, where $A$ is the number of $1$€ coins drawn and $B$ is the number of $2$€ coins drawn, is $0$ after any given number of draws.
However, conditional on reaching a total value of $n$ euros, the expectation of $A-B$ tends to $\frac13$ for $n\to\infty$ (in fact, it's positive for all $n\gt2$): The probability to reach $n$ euros with $k$ $2$€ coins and $n-2k$ $1$€ coins is $\binom{n-k}k2^{k-n}$, so the expectation of $B$ is
\begin{eqnarray*} &&\frac{\sum_{k=0}^n\binom{n-k}k2^{k-n}k}{\sum_{k=0}^n\binom{n-k}k2^{k-n}}=\frac{\frac2{27}(3n-1)+O\left(2^{-n}\right)}{\frac23+O\left(2^{-n}\right)}=\frac n3-\frac19+O\left(2^{-n}\right)\;,\\ \end{eqnarray*}
the expectation of $A=n-2B$ is $\frac n3+\frac29+O\left(2^{-n}\right)$, and the expectation of $A-B$ is $\frac13+O\left(2^{-n}\right)$.
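The asymptotics are easy to check numerically with the same weights $\binom{n-k}k2^{k-n}$ (a small Python sketch of my own, not part of the argument; the function name is just illustrative):

```python
from math import comb

def expected_diff(n):
    """Exact E[A - B] conditional on the total ever hitting n euros.

    Reaching n with k 2-euro coins (and n - 2k 1-euro coins) has
    probability comb(n - k, k) * 2**(k - n); there A - B = n - 3k.
    """
    num = den = 0.0
    for k in range(n // 2 + 1):
        p = comb(n - k, k) * 2.0 ** (k - n)
        num += (n - 3 * k) * p
        den += p
    return num / den

for n in (3, 4, 5, 10, 40):
    print(n, expected_diff(n))  # oscillates around 1/3, converges fast
```

The values oscillate around $\frac13$ and converge exponentially, e.g. the exact value at $n=3$ is $\frac35$.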
This is rather counterintuitive (to me): For any given number of coins the expectation is $0$, but for any given value of the coins it's positive. This is the stuff that paradoxes are made of if you're not careful how you talk about it, e.g.: “Someone is playing this game. What value do you expect for $A-B$?” – “$0$.” – “So far they've drawn $137$€. Now what do you expect?” – “$\frac13$.”
The resolution here is (as it often is) that the conditions aren't properly defined – we don't know why and when the person is telling us this amount. If they fixed a number of draws to wait for and then told us the total value at that point, the correct answer would still be $0$; if they fixed a total value to wait for and then told us when it was reached, the correct answer would be $\frac13$, but then the paradox of changing our mind just because we were told some number, no matter which one, wouldn't arise, because it's the stopping protocol that makes the difference.
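The difference the stopping protocol makes can also be seen directly by simulation (again a Python sketch of my own; the names are illustrative). Fixing the number of draws in advance gives a mean near $0$; fixing a target value and conditioning on hitting it exactly gives a mean near $\frac13$:

```python
import random

def protocol_fixed_draws(draws, trials, rng):
    """Fix the number of draws in advance; return the mean of A - B."""
    total = 0
    for _ in range(trials):
        a = sum(rng.random() < 0.5 for _ in range(draws))  # 1-euro coins
        total += 2 * a - draws  # A - B = a - (draws - a)
    return total / trials

def protocol_fixed_value(target, trials, rng):
    """Draw until the running value reaches the target; keep only runs
    that hit it exactly (overshooting runs fail the condition)."""
    diffs = []
    while len(diffs) < trials:
        value = a = b = 0
        while value < target:
            if rng.random() < 0.5:
                value += 1; a += 1  # drew a 1-euro coin
            else:
                value += 2; b += 1  # drew a 2-euro coin
        if value == target:
            diffs.append(a - b)
    return sum(diffs) / len(diffs)

rng = random.Random(0)
mean_fixed_draws = protocol_fixed_draws(100, 100_000, rng)
mean_fixed_value = protocol_fixed_value(100, 100_000, rng)
print(mean_fixed_draws)  # near 0
print(mean_fixed_value)  # near 1/3
```

Discarding the runs that overshoot the target is exactly the conditioning "total value $=n$" used above.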
Still, a certain uneasy sense of paradox remains, even if it temporarily retreats under the glare of careful analysis.
I don't have any concrete questions about this, but I'd be interested to hear about any other cases where such a phenomenon occurs, or names by which it's known, or approaches to deal with it, and perhaps also to ease that lingering sense of paradox.


I am not sure how rigorous my math is, but here is an argument that, conditional on hitting a total of $n$, the expectation of $A-B$ tends to $\frac13$.
The proof is a mixture of steady-state arguments and martingales.
Suppose while you are drawing coins, happily approaching $n$, I am betting on the coins you draw. To minimize confusion, I am betting in US dollars. Every time you draw, I bet $1$ USD, at even odds, that you would draw a $1$€ coin. At any point in time my profit is exactly $P=A-B$ USD.
Here is the stopping rule: my game ends as soon as your total reaches $n$ or beyond (the only possible overshoot is $n+1$, since each coin is worth at most $2$€). The stopping time is bounded by $n$ draws, so Doob's optional stopping theorem applies and we have $E[P] = 0$ when my game stops.
Now, my game can end in one of $3$ ways, and by steady state arguments each is equally likely for large $n$: the probability of ever hitting a given total tends to $\frac23$, and each case below consists of hitting $n-2$ or $n-1$ and then drawing the appropriate coin with probability $\frac12$, giving each case probability about $\frac23\cdot\frac12=\frac13$:
(X) The last step was $n-2 \to n$
(Y) The last step was $n-1 \to n$
(Z) The last step was $n-1 \to n+1$
Curiously, in $2$ of the $3$ cases, I lost $1$ USD on that last bet. By the law of total expectation:
$$ E[P] = \frac13 (E[P \mid X] + E[P \mid Y] + E[P \mid Z])$$
Writing $F_m$ for the event that the running total ever hits $m$ exactly, and accounting explicitly for the last win/loss in each case, this becomes (after multiplying through by $3$):
$$0 = (E[P \mid F_{n-2}] - 1) + (E[P \mid F_{n-1}] + 1) + (E[P \mid F_{n-1}] - 1)$$
So if the limit $\ell = \lim_{m\to\infty} E[P \mid F_m]$ exists, we have:
$$0 = (\ell -1) + (\ell + 1) + (\ell - 1) \implies \ell = \frac13 ~~~~~\square$$
As I mentioned in the very beginning, I am not sure about the rigor of the argument. Critiques, corrections, comments are most welcome.