Let $X_{n}$ be a random variable representing the money we have after the $n$-th toss of a coin, and let $Y_{1}, Y_{2}, \dots$ be i.i.d. random variables with Bernoulli distribution with probability of success (heads) $p \in \left( \frac{1}{3}, \frac{1}{2} \right)$. We play a game in which our money is doubled if we see heads and halved if we see tails. Then: $$ X_{n+1} = 2X_{n} Y_{n+1} + \frac{1}{2} (1- Y_{n+1})X_{n} = \left( \frac{1}{2} + \frac{3}{2} Y_{n+1} \right) X_{n}. $$
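The recursion is easy to simulate. Here is a minimal sketch; the concrete value $p = 0.4$ (any value in $(\frac13, \frac12)$ works) and the name `simulate_game` are illustrative choices, not from the problem statement:

```python
import random

def simulate_game(n_tosses, p=0.4, x0=100.0, seed=0):
    """Run the wealth recursion X_{n+1} = (1/2 + 3/2 * Y_{n+1}) * X_n."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_tosses):
        y = 1 if rng.random() < p else 0  # Y_{n+1} ~ Bernoulli(p), 1 = heads
        path.append((0.5 + 1.5 * y) * path[-1])
    return path
```

Each step multiplies the current wealth by $2$ (heads) or $\frac{1}{2}$ (tails), which is exactly what the recursion says.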
Let $X_{0} = 100$. By independence, $E X_{n+1} = E\left( \frac{1}{2} + \frac{3}{2} Y_{n+1} \right) E X_{n}$, so $EX_{n} = 100 \left( \frac{1}{2} + \frac{3}{2} p \right)^{n}$, and since $p > \frac{1}{3}$ the expected value goes to infinity with $n$. We want to show that, despite this, $$ P\left( \left\{ \omega \in \Omega : \ \lim_{n \to \infty } X_{n} ( \omega ) = 0 \right\} \right) = 1, \quad \text{i.e.} \ \lim_{n \to \infty} X_{n} = 0 \ \text{a.s.}$$
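A quick Monte Carlo experiment makes the paradox visible: the empirical mean is pulled up by a few rare lucky paths, while the typical (median) path collapses toward $0$. The parameters below ($p = 0.4$, 200 tosses, 2000 paths) are illustrative:

```python
import random

def mean_vs_median(n_tosses=200, n_paths=2000, p=0.4, x0=100.0, seed=1):
    """Compare the empirical mean of X_n (driven by rare lucky paths)
    with the median of X_n (the typical outcome, collapsing toward 0)."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        x = x0
        for _ in range(n_tosses):
            y = 1 if rng.random() < p else 0
            x = (0.5 + 1.5 * y) * x
        finals.append(x)
    finals.sort()
    return sum(finals) / n_paths, finals[n_paths // 2]  # (mean, median)
```

With these parameters the median is astronomically small (of order $100 \cdot 2^{n(2p-1)}$) even though the distribution's right tail keeps the mean large.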
Let $W_{n} = \sum_{i=1}^{n} Y_{i}$ be the number of heads seen in the first $n$ tosses, and write $\overline{Y}_{n} = W_{n}/n$. Then $X_{n} = 100 \cdot 2^{ 2 n \overline{Y}_{n} - n }$. Let $ \varepsilon >0$; then we see that $$ X_{n} > \varepsilon \iff 100 \cdot 2^{ 2 n \overline{Y}_{n} - n } > \varepsilon \iff e^{(2n \overline{Y}_{n} -n) \log 2} > e^{ \log \frac{ \varepsilon}{100} } \iff 2n \overline{Y}_{n} -n > \frac{ \log \frac{ \varepsilon}{100}}{\log 2} \iff \overline{Y}_{n} > \frac{ \log \frac{ \varepsilon}{100}}{ 2 n \log 2} + \frac{1}{2}. $$
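The closed form can be sanity-checked numerically against the step-by-step recursion, using $2 n \overline{Y}_{n} - n = 2 W_{n} - n$ (all names and parameter values here are illustrative):

```python
import random

def check_closed_form(n_tosses=60, p=0.4, x0=100.0, seed=2):
    """Verify X_n = x0 * 2^(2 W_n - n) along a simulated path,
    where W_n counts heads among the first n tosses."""
    rng = random.Random(seed)
    x, heads = x0, 0
    for n in range(1, n_tosses + 1):
        y = 1 if rng.random() < p else 0
        x = (0.5 + 1.5 * y) * x   # recursion
        heads += y                # W_n
        closed = x0 * 2.0 ** (2 * heads - n)
        assert abs(x - closed) <= 1e-12 * closed  # agree at every step
    return heads
```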
Then because $p < \frac{1}{2}$, the event $\left\{ \overline{Y}_{n} > \frac{ \log \frac{ \varepsilon}{100}}{ 2 n \log 2} + \frac{1}{2} \right\}$ is contained in $\left\{ \overline{Y}_{n} > \frac{ \log \frac{ \varepsilon}{100}}{ 2 n \log 2} + p \right\}$, so $ P(X_{n} > \varepsilon ) \le P\left( \overline{Y}_{n} > \frac{ \log \frac{ \varepsilon}{100}}{ 2 n \log 2} + p \right)$. I think now I should use some form of Chebyshev's inequality, but I'm sort of stuck there.
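As a numerical hint that a tail bound is plausible, one can compare a Monte Carlo estimate of $P(X_n > \varepsilon)$ with the plain Chebyshev bound $P(\overline{Y}_n - p > t) \le \frac{p(1-p)}{n t^2}$, applied with $t = \frac{\log(\varepsilon/100)}{2 n \log 2} + \frac{1}{2} - p$, which is positive for large $n$ since $p < \frac{1}{2}$. This only bounds each individual $P(X_n > \varepsilon)$; it is not by itself the almost-sure argument, and all parameter values below are illustrative:

```python
import math
import random

def tail_prob_vs_chebyshev(n, eps=1.0, p=0.4, x0=100.0, n_paths=20000, seed=3):
    """Estimate P(X_n > eps) by simulation and compare with the Chebyshev
    bound p(1-p)/(n t^2), t = log(eps/x0)/(2 n log 2) + 1/2 - p."""
    rng = random.Random(seed)
    t = math.log(eps / x0) / (2 * n * math.log(2)) + 0.5 - p
    hits = 0
    for _ in range(n_paths):
        heads = sum(1 for _ in range(n) if rng.random() < p)  # W_n
        if x0 * 2.0 ** (2 * heads - n) > eps:                 # X_n > eps
            hits += 1
    est = hits / n_paths
    bound = p * (1 - p) / (n * t * t) if t > 0 else 1.0
    return est, bound
```

For $n = 200$ the estimated probability sits comfortably below the bound, which is consistent with $P(X_n > \varepsilon) \to 0$.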