Proof that you go broke almost surely


Suppose you play a game where you start with capital $K_0 = 1$. In each turn $i = 1,2, \dots, n$ you throw a fair coin independently of the history. If you get "heads", you get back $3/2$ of your capital, if you get "tails" you only get back $1/2$ of your capital.

Prove that $K_n \to 0$ almost surely.

My attempt:
First define $R_i$ by $$R_i(\omega) = \begin{cases} \frac{3}{2} & \omega_i = \text{heads} \\ \frac{1}{2} & \omega_i = \text{tails}\end{cases}$$ and note that $K_n = \prod_{i=1}^nR_i $, $\mathbb E[R_i] = 1 $ and by independence $\mathbb E[K_n] = 1.$ Now the only ways to prove almost sure convergence I know of are the Strong Law of Large Numbers and the Borel-Cantelli Lemma. A way to apply the former is to look at $$\log K_n = \sum \log(R_i).$$ By the Strong Law I get $$\frac{1}{n}\sum_{i=1}^n \log(R_i) \to \log \left(\frac{\sqrt3}{2}\right)$$ but that is not quite what I want. Also I find this pretty counterintuitive.
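For intuition about why $\mathbb E[K_n] = 1$ is compatible with $K_n \to 0$ almost surely, one can simulate the game. The following Python sketch (an illustration only, not a proof) shows that the typical (median) path is already minuscule after a few hundred rounds, even though the expectation is $1$ for every $n$:

```python
import random

def simulate_capital(n, rng):
    """Play n rounds: capital is multiplied by 3/2 on heads, 1/2 on tails."""
    k = 1.0
    for _ in range(n):
        k *= 1.5 if rng.random() < 0.5 else 0.5
    return k

rng = random.Random(0)
n, trials = 200, 10_000
finals = sorted(simulate_capital(n, rng) for _ in range(trials))

median_capital = finals[trials // 2]                  # typical path: essentially zero
fraction_tiny = sum(k < 1e-6 for k in finals) / trials
print(median_capital, fraction_tiny)
```

The expectation $\mathbb E[K_n] = 1$ is carried by rare, very lucky paths with many heads; almost every individual path collapses. This is exactly the gap between convergence of expectations and almost sure convergence.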


Accepted answer

First I will explain the error in your approach via the Strong Law, and the way in which the Strong Law nevertheless suggests the result is true.

However, this approach still requires care, so I then present a rigorous argument via martingales.

Intuition via the strong law

Your intuition to take logarithms, show almost sure convergence of these, and then exponentiate again is reasonable. This implicitly relies on the Continuous Mapping Theorem, which says that for a continuous function $g$,

$$ X_n \xrightarrow{a.s} X \, \Rightarrow \, g(X_n) \xrightarrow{a.s.} g(X).$$

So in this context the sequence $X_n = \sum_{k=1}^n \log R_k$, and $g(x) = \exp(x)$.

The strong law requires us to divide $X_n$ by $n$ to be able to make a statement about the limit. i.e. the strong law states $$ \frac1n \sum_{k=1}^n \log R_k \xrightarrow{a.s.} \log \frac{\sqrt{3}}{2}$$

But now when we apply the continuous mapping theorem this gives $$ \exp \left( \frac1n \sum_{k=1}^n \log R_k \right) \xrightarrow{a.s.}\frac{\sqrt{3}}{2}$$

At this point the rest of my proof becomes heuristic as displayed: taking $n$-th powers on both sides suggests $$ K_n = \exp \left( \sum_{k=1}^n \log R_k \right) \approx \left(\frac{\sqrt{3}}{2}\right)^n \rightarrow 0,$$ since $\sqrt{3} < 2$. As written this is not rigorous (the "limit" on the right depends on $n$), but it can be repaired: since $\frac{1}{n} \log K_n \to \log \frac{\sqrt{3}}{2} < 0$ almost surely, for all sufficiently large $n$ we have $\log K_n \leq \frac{n}{2} \log \frac{\sqrt{3}}{2} \to -\infty$, and hence $K_n \to 0$ almost surely.
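A quick numerical check of the SLLN limit (a simulation sketch, not part of the argument): the running average $\frac1n \log K_n$ settles near $\log(\sqrt{3}/2) \approx -0.1438$, which forces $\log K_n \to -\infty$:

```python
import math
import random

rng = random.Random(1)
n = 100_000
log_k = 0.0
for _ in range(n):
    # log of the round's multiplier: log(3/2) on heads, log(1/2) on tails
    log_k += math.log(1.5 if rng.random() < 0.5 else 0.5)

target = math.log(math.sqrt(3) / 2)   # about -0.1438
avg = log_k / n                       # SLLN: close to target
print(avg, target, log_k)
```

The average per-round log-return is strictly negative, so the cumulative sum $\log K_n$ drifts to $-\infty$ at a linear rate.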

Using Doob's Convergence Theorem

An alternative, rigorous approach is first to show that $K_n$ is a martingale, and then to apply the martingale convergence theorem.

You have effectively already shown that $K_n$ is a martingale since (without getting into the background measure theory):

$$\mathbf{E}[|K_n|] = 1 < \infty$$

\begin{align*} \mathbf{E}[K_n \mid K_1,\ldots, K_{n-1}] & = \mathbf{E}[R_{n}K_{n-1} \mid K_1,\ldots, K_{n-1}]\\ & = K_{n-1}\, \mathbf{E}[R_n] \quad \text{(since $R_n$ is independent of the past)} \\ & = K_{n-1}. \end{align*}

A particularly simple version of the martingale convergence theorem asserts that if $K_{n}$ is a positive martingale, then it converges almost surely to some $K$: $$K_n \xrightarrow{a.s.} K.$$ So we have now established that $K_n$ does converge almost surely to something, it remains to show that $K \equiv 0$.

By analogy with real analysis: if an infinite product of positive terms is known to converge (here, almost surely) to a finite limit, then either the individual factors converge to $1$, or the limit is $0$.

Clearly the $R_n$ do not converge to $1$: by the second Borel–Cantelli lemma, $R_n = 1/2$ infinitely often almost surely. Hence $K = 0$ almost surely.
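The contrapositive used here can be seen on a single simulated path (illustration only): the one-step ratios $K_n/K_{n-1} = R_n$ only ever take the values $1/2$ and $3/2$, never approaching $1$, so the only nonnegative limit available to the product is $0$:

```python
import random

rng = random.Random(2)
n = 2_000
k = 1.0
ratios_seen = set()
for _ in range(n):
    r = 1.5 if rng.random() < 0.5 else 0.5   # one-step ratio K_n / K_{n-1}
    ratios_seen.add(r)
    k *= r

print(ratios_seen, k)   # ratios stay in {0.5, 1.5}; the capital collapses
```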

Another answer

Formal answer. One way to do this (without martingales) is to use the strong law of large numbers and the law of the iterated logarithm in tandem. You are flipping a fair coin $n$ times. Let $H_i$ equal $1$ if the coin comes up heads on flip $i$, and $0$ otherwise. Then the $H_i$ are i.i.d. uniform on $\{0, 1\}$, and $S_n = H_1 + \cdots + H_n$ satisfies:

  1. Strong law of large numbers: $S_n/n \rightarrow 1/2$ almost surely.

  2. Law of the iterated logarithm: $\limsup_{n \to \infty} \frac{S_n - n/2}{\sqrt{2n \log\log n}} = \frac{1}{2}$ and $\liminf_{n \to \infty} \frac{S_n - n/2}{\sqrt{2n \log\log n}} = -\frac{1}{2}$ almost surely. (This is because $H_i - 1/2$ has mean $0$ and standard deviation $1/2$ for all $i \geq 1$, and the standard deviation is the constant appearing in the LIL.)

If we denote your capital after bet $n$ by $K_n$, and your starting capital is $K_0 = 1$, this strategy amounts to betting half your wealth every time. We have the expression $$K_n = K_0 \prod_{i=1}^n \left( \frac{1}{2} + H_i \right) = \frac{3^{S_n}}{2^n},$$

since the denominator is always multiplied by $2$, and the numerator is multiplied by $1$ for every tails and $3$ for every heads. By the LIL, for any $\epsilon > 0$ we almost surely have $S_n \leq n/2 + \left(\tfrac{1}{2} + \epsilon\right)\sqrt{2n \log \log n}$ for all sufficiently large $n$, and therefore $$\limsup_{n \to \infty} K_n = \limsup_{n \to \infty} \frac{3^{S_n}}{2^n} \leq \limsup_{n \to \infty} \frac{3^{n/2 + (\frac12 + \epsilon)\sqrt{2n \log \log n}}}{2^n} = \limsup_{n \to \infty} \left(\frac{\sqrt{3}}{2}\right)^{n} 3^{(\frac12 + \epsilon)\sqrt{2n \log \log n}} = 0,$$
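The closed form $K_n = 3^{S_n}/2^n$ is easy to confirm numerically (a sketch only; the round-by-round product and the closed form agree up to floating-point rounding):

```python
import math
import random

rng = random.Random(3)
n = 60
flips = [rng.random() < 0.5 for _ in range(n)]   # True = heads

# Round-by-round product: multiply by 3/2 on heads, 1/2 on tails.
k = 1.0
for h in flips:
    k *= 1.5 if h else 0.5

# Closed form 3^{S_n} / 2^n, where S_n counts the heads.
s = sum(flips)
closed = 3 ** s / 2 ** n

print(k, closed)
```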

which shows that your wealth converges to zero with probability one.

Informal answer. Even though you are equally likely to win as to lose this bet on any given round, a loss divides your wealth by $2$, while a win only multiplies it by $1.5$. This means it takes $$\log_{1.5}(2) = 1.70951129135\ldots$$ wins to cancel out one loss. But the SLLN shows that wins and losses happen with equal frequency asymptotically. Thus, $$K_n \approx K_0 \left( \frac{1}{2} \right)^{n/2} \left( \frac{3}{2} \right)^{n/2} = \left( \frac{\sqrt{3}}{2} \right)^{n},$$

and the fluctuations in the exponent are only of order $\sqrt{n \log \log n}$ by LIL, which is not enough to cancel out the downward exponential decay in the limit as $n \rightarrow \infty$.
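The arithmetic in the informal answer can be checked directly (a small sketch; the numbers are the ones quoted above):

```python
import math

# Wins needed to undo one loss: solve (3/2)^w * (1/2) = 1, so w = log_{1.5}(2).
w = math.log(2) / math.log(1.5)
undo = (1.5 ** w) * 0.5            # back to the starting wealth

# Typical wealth after n rounds with n/2 wins and n/2 losses.
n = 100
typical = (0.5 ** (n / 2)) * (1.5 ** (n / 2))
benchmark = (math.sqrt(3) / 2) ** n   # the (sqrt(3)/2)^n decay rate

print(w, undo, typical, benchmark)
```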

This also shows that any strategy of the form "on bet $n$, stake a fixed fraction $p \in (0, 1)$ of your wealth" must eventually give you a limiting wealth of $0$: each win multiplies your wealth by $1 + p$ and each loss by $1 - p$, and since $(1 - p)^{1/2}(1 + p)^{1/2} = (1 - p^2)^{1/2} < 1$, your wealth asymptotically decays exponentially to zero.
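This generalization can also be sketched numerically (assuming the same fair-coin, even-money setup as above): the per-round geometric growth factor $\sqrt{(1+p)(1-p)} = \sqrt{1-p^2}$ is below $1$ for every fixed fraction $p \in (0,1)$, and a simulation at $p = 1/2$ (this game) recovers the negative log-growth rate:

```python
import math
import random

def growth_factor(p):
    """Per-round geometric mean of the wealth multiplier when staking a
    fixed fraction p: half the rounds give (1+p), half give (1-p)."""
    return math.sqrt((1 + p) * (1 - p))   # = sqrt(1 - p^2)

factors = {p: growth_factor(p) for p in (0.1, 0.5, 0.9)}

# Empirical check at p = 0.5 (the game above): the average log-multiplier
# per round approaches log sqrt(1 - p^2) < 0.
rng = random.Random(4)
p, n = 0.5, 200_000
log_w = sum(math.log(1 + p if rng.random() < 0.5 else 1 - p) for _ in range(n))
print(factors, log_w / n, math.log(growth_factor(p)))
```

The more aggressively you stake (larger $p$), the smaller the growth factor, and the faster the almost sure decay.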