In my previous question Infinite game of an unfair coin toss, I showed that in a game between Person A and B, where:
Person A has an unfair coin with probability $p \in (\frac{1}{3},\frac{1}{2})$ of heads. Person B starts with a capital of 100€. Each time B tosses heads, his capital doubles; each time B tosses tails, it is halved. After each toss, A pays out or collects the difference.
So let $X_n :=$ B's capital after the $n$-th toss. I showed that $\lim\limits_{n \rightarrow \infty} \mathbb{E}[X_n]=\infty$. Now the follow-up to this is to show that: $$P(\{\omega \in \Omega: \lim\limits_{n \rightarrow \infty} X_n=0 \})=1$$
To show the first statement, I used that: $$X_n=X_{n-1}(1+W_n), \quad W_n:= \left\{ \begin{array}{ll} 1 & \textrm{B tosses heads}\\ -\frac{1}{2} & \, \textrm{otherwise} \\ \end{array} \right. $$
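As a quick sanity check on the two statements (the mean of $X_n$ grows, while the typical path collapses), one can simulate this recursion; the choices $p=0.4$, 200 tosses and 10 000 paths below are arbitrary:

```python
import random

def simulate_capital(p, n_tosses, rng, x0=100.0):
    """One path of X_n = X_{n-1} * (1 + W_n): double on heads, halve on tails."""
    x = x0
    for _ in range(n_tosses):
        if rng.random() < p:   # heads: W_n = 1
            x *= 2.0
        else:                  # tails: W_n = -1/2
            x *= 0.5
    return x

p, n, trials = 0.4, 200, 10_000   # p in (1/3, 1/2); parameters are arbitrary
rng = random.Random(0)
paths = [simulate_capital(p, n, rng) for _ in range(trials)]
mean = sum(paths) / trials            # sample mean: pulled up by rare huge paths
median = sorted(paths)[trials // 2]   # typical path: collapses toward 0
print(f"mean = {mean:.3g}, median = {median:.3g}")
```

The sample mean is dominated by a few astronomically lucky paths, while the median is far below the starting capital, which is exactly the tension the two statements describe.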
For this statement, we were given the hint:
Apply the strong law of large numbers, which will be formulated in the following, to $Z_i:=\log(1+W_i),\ i\in \mathbb{N}$. Let $(Z_i)_{i \in \mathbb{N}}$ be a sequence of pairwise uncorrelated random variables defined on a probability space $(\Omega, \mathcal{A}, P)$ with existing expected values and finite variances $\sup\limits_{i\in\mathbb{N}} V(Z_i) < \infty$. Then: $$P(\{\omega \in \Omega: \lim\limits_{n \rightarrow \infty}\frac{1}{n}\sum_{i=1}^n(Z_i-\mathbb{E}[Z_i])=0\})=1.$$
My attempt: First of all, I tried to verify the requirements of the result. For $i \neq j$ and $k,l \in \{\log(2),-\log(2)\}$: $$P(Z_i=k,Z_j=l)= \left\{ \begin{array}{ll} p(1-p) = P(Z_i=\log(2))P(Z_j=-\log(2)), & k=\log(2),\ l=-\log(2)\\ p^2 = P(Z_i=\log(2))P(Z_j=\log(2)), & k=l=\log(2) \\ (1-p)^2=P(Z_i=-\log(2))P(Z_j=-\log(2)), & k=l=-\log(2)\\ \end{array} \right.$$ (the case $k=-\log(2),\ l=\log(2)$ is symmetric to the first). So the $(Z_i)_{i \in \mathbb{N}}$ are pairwise independent, thus pairwise uncorrelated.
Then, $\mathbb{E}[Z]:=\mathbb{E}[Z_i]=p\log(2)-(1-p)\log(2)=(2p-1)\log(2)$, so the expected value exists; note that $\mathbb{E}[Z]<0$, since $p<\frac{1}{2}$.
$$\sup\limits_{i\in\mathbb{N}} V(Z_i)=\sup\limits_{i\in\mathbb{N}} \sum_{z \in Z_i(\Omega)}(z-\mathbb{E}[Z_i])^2P(Z_i=z)=\sup\limits_{i\in\mathbb{N}} \sum_{z \in \{\log(2),-\log(2)\}}(z-\mathbb{E}[Z_i])^2P(Z_i=z)=\\ (\log(2)-(2p-1)\log(2))^2\,p+(-\log(2)-(2p-1)\log(2))^2(1-p) < \infty,$$ where the supremum can be dropped because the $Z_i$ are identically distributed.
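As a small numerical check of this variance computation (the value of $p$ is an arbitrary choice in the allowed interval), the sum over the two atoms should agree with $\mathbb{E}[Z^2]-\mathbb{E}[Z]^2$, where $Z^2=(\log 2)^2$ always:

```python
import math

p = 0.4                        # any p in (1/3, 1/2)
log2 = math.log(2)
mu = (2 * p - 1) * log2        # E[Z_i] = p*log2 - (1-p)*log2

# Variance via the definition: sum over z in {log2, -log2} of (z - mu)^2 P(Z = z)
var_formula = (log2 - mu) ** 2 * p + (-log2 - mu) ** 2 * (1 - p)

# Variance via E[Z^2] - E[Z]^2; Z^2 equals (log 2)^2 with probability 1
var_moments = log2 ** 2 - mu ** 2

print(var_formula, var_moments)
```

Both expressions simplify to $4p(1-p)(\log 2)^2$, which is clearly finite.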
So now that I can use the statement from the hint, I thought it may be a good idea to rewrite it as:
$$P(\{\omega \in \Omega: \lim\limits_{n \rightarrow \infty}\frac{1}{n}\sum_{i=1}^n(Z_i-\mathbb{E}[Z_i])=0\})=P(\{\omega \in \Omega: \lim\limits_{n \rightarrow \infty}\frac{1}{n}\sum_{i=1}^nZ_i=\mathbb{E}[Z]\})=1.$$
Question: But I don't really see how this could help me to show the initial statement. I think I could somehow use $\log(1+x)<x$, due to the definition of $Z_i$. But other than that, I am really stuck on this problem. I would be glad to get some further hints, or maybe an idea of how to use the provided hint.
The definitions of $X_n,W_n$ and $Z_n$ imply $$ \begin{align} X_n&=X_0\cdot (1+W_1)\cdot (1+W_2)\cdots (1+W_n) \\&= X_0\cdot e^{Z_1+\dots+Z_n}. \end{align} $$ Therefore, $$ \log(X_n/X_0)=Z_1+\dots+Z_n. $$

Let $\mu=\mathbb E[Z_1]$, and recall that $\mu<0$. We can rewrite the above as $$ \frac1n\log(X_n/X_0)=\mu + \color{gray}{\frac1n\sum_{i=1}^n (Z_i-\mu)}. $$ Now, using the Strong Law of Large Numbers, the gray term approaches $0$ as $n\to\infty$, with probability $1$. This means that, for all $\epsilon>0$, we have $$ \frac1n\log(X_n/X_0)<\mu + \epsilon\qquad \text{when $n$ is large enough, with probability 1.} $$

By choosing $\epsilon$ small enough, we can ensure $\mu+\epsilon$ is negative. This means that $$ X_n\le X_0\cdot (e^{\mu+\epsilon})^n\qquad\text{when $n$ is large enough, with probability 1.} $$ Since $\lim_n (e^{\mu+\epsilon})^n=0$, this implies that $\lim_n X_n=0$ with probability $1$.
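This argument can be illustrated numerically: for a single long path, $\frac1n\log(X_n/X_0)=\frac1n\sum_{i=1}^n Z_i$ should be close to $\mu=(2p-1)\log 2<0$. The seed, $p=0.4$ and the path length below are arbitrary choices:

```python
import math
import random

p = 0.4            # heads probability in (1/3, 1/2)
n = 200_000        # number of tosses on one long path
rng = random.Random(1)

# (1/n) log(X_n / X_0) = (1/n) sum of Z_i, where Z_i = +log 2 (heads) or -log 2 (tails)
heads = sum(1 for _ in range(n) if rng.random() < p)
avg_log_growth = (heads - (n - heads)) * math.log(2) / n
mu = (2 * p - 1) * math.log(2)
print(avg_log_growth, mu)  # both negative and close to each other
```

A negative average log-growth rate is exactly what drives $X_n = X_0\,e^{n\cdot\frac1n\log(X_n/X_0)}$ to $0$.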