Show that $\mathbb E[X_{n}]\xrightarrow{n \to \infty} \infty$ while $X_{n} \xrightarrow{n \to \infty} 0$ a.s.


Say I have a biased coin that shows heads with probability $p \in ]1/3,1/2[$, and I start with a capital of $100$ EUR. Every time heads comes up, my capital is doubled; otherwise I lose half of my capital. Let $X_{n}$ denote my capital after the $n$th flip.

$1.$ Show that $\lim_{n \to \infty}\mathbb E[X_{n}]=\infty$

$2.$ Show that $X_{n}\to 0$ a.s.

My idea on $1.$:

Let $R_{n}$ indicate whether heads $(1)$ or tails $(0)$ is flipped on the $n$th attempt. It follows that

$R_{n}\sim\operatorname{Ber}(p)$.

Note that $X_{0}=100$, and $X_{n+1}=100\prod_{i=1}^{n+1}(\frac{1}{2}+\frac{3}{2}R_{i})$

$\mathbb E[X_{n+1}]=100\,\mathbb E[\prod_{i=1}^{n+1}(\frac{1}{2}+\frac{3}{2}R_{i})]=100\prod_{i=1}^{n+1}\mathbb E[(\frac{1}{2}+\frac{3}{2}R_{i})]$ by independence of the $R_{i}$,

and then, since the $R_{i}$ are identically distributed, by the law of expectation of discrete distributions:

$100\left(\mathbb E\left[\frac{1}{2}+\frac{3}{2}R_{1}\right]\right)^{n+1}=100\left[2P(R_{1}=1)+\frac{1}{2}P(R_{1}=0)\right]^{n+1}=100\left[2p+\frac{1-p}{2}\right]^{n+1}=100\left[\frac{3p+1}{2}\right]^{n+1}\xrightarrow{n\to \infty}\infty$

since $p>\frac{1}{3}$ implies $\frac{3p+1}{2}>1$.
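For part $1.$, the recursion $\mathbb E[X_{n+1}]=\left(2p+\frac{1-p}{2}\right)\mathbb E[X_n]$ gives the closed form $\mathbb E[X_n]=100\left(\frac{3p+1}{2}\right)^{n}$. A quick numerical sketch of this (illustrative values; `expected_capital` is just a hypothetical helper name):

```python
# Each flip multiplies the expected capital by 2p + (1-p)/2 = (3p+1)/2,
# which exceeds 1 exactly when p > 1/3. For p = 0.4 the growth factor is 1.1.
def expected_capital(n: int, p: float, x0: float = 100.0) -> float:
    """Closed-form E[X_n] for the doubling/halving game."""
    return x0 * ((3 * p + 1) / 2) ** n

for n in (0, 10, 50):
    print(n, round(expected_capital(n, p=0.4), 2))
```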

Any tips on $2.$?

Another question: it seems very counterintuitive that $P(X_{n} \to 0)=1$ and $\lim_{n \to \infty}\mathbb E[X_{n}]=\infty$ can hold simultaneously. Is there any intuitive explanation for this behaviour?


Accepted answer:

Let $X_n$ be your wealth after $n$ flips, and let $Q_{n}=X_n/X_{n-1}$. Then $$ X_n=X_0\times Q_1\times Q_2\times\dots \times Q_n. $$ Let $Y_n=\log X_n$. Then $$ Y_n=Y_0+\log Q_1+\log Q_2+\dots +\log Q_n. $$ Note that the $\log Q_i$ form an i.i.d. sequence of random variables. Find its mean, and then apply the strong law of large numbers to conclude that $\sum_{i=1}^n \log Q_i\to-\infty$ almost surely. This shows $Y_n\to-\infty$ a.s., so that $X_n=\exp(Y_n)\to 0$ a.s.
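Filling in the hinted mean (a short computation, using the dynamics from the question): $Q_i=2$ with probability $p$ and $Q_i=\frac{1}{2}$ with probability $1-p$, so
$$\mathbb E[\log Q_1]=p\log 2+(1-p)\log\tfrac{1}{2}=(2p-1)\log 2<0\quad\text{for }p<\tfrac{1}{2},$$
and the strong law gives $\frac{1}{n}\sum_{i=1}^n\log Q_i\to(2p-1)\log 2<0$ a.s., hence $\sum_{i=1}^n\log Q_i\to-\infty$ a.s.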


I agree this is counter-intuitive. What is happening here is that the distribution of $X_n$ is getting progressively more right-skewed; it has a positive tail which has a low probability, but is far enough out to give a high expectation. $X_n$ for large $n$ is like a lottery; most of the time you lose most of your $100$ EUR, but with a small probability you win huge. Unlike a real-life lottery, the balance of probability is such that $E[X_n]$ is large.
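This lottery picture can be checked with a small Monte Carlo sketch (illustrative parameters $p=0.4$, $n=20$; `simulate_capital` is a hypothetical helper, not from the original post): the typical path ends far below the starting capital, while the exact mean $100\cdot 1.1^{20}\approx 673$ is dominated by rare lucky paths.

```python
import random
import statistics

def simulate_capital(n: int, p: float, x0: float = 100.0) -> float:
    """One sample path: heads (prob p) doubles the capital, tails halves it."""
    x = x0
    for _ in range(n):
        x = 2 * x if random.random() < p else x / 2
    return x

random.seed(0)
p, n, trials = 0.4, 20, 100_000
paths = [simulate_capital(n, p) for _ in range(trials)]

print("exact mean   :", 100 * ((3 * p + 1) / 2) ** n)  # ~673
print("sample mean  :", statistics.mean(paths))
print("sample median:", statistics.median(paths))      # a few EUR: the typical outcome
```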

Another answer:

Let $p$ be as above and $q = 1-p.$ Write $R_n \sim p \varepsilon_0 + q \varepsilon_1$ (assumed i.i.d.; note that here $R_n = 0$ corresponds to heads, the opposite convention to the question), $X_0 = 100$ and $X_{n+1}=2X_n\mathbf{1}_{\{R_{n+1}=0\}}+\dfrac{1}{2}X_n\mathbf{1}_{\{R_{n+1}=1\}}.$ Then $E(X_{n+1})=2pE(X_n)+\frac{q}{2}E(X_n) = (2p+\frac{q}{2})E(X_n)$ and therefore $E(X_n) = 100\,(2p+\frac{q}{2})^n = 100\left(\frac{3p+1}{2}\right)^n \to \infty,$ since $p > \frac{1}{3}.$

Now, consider $\mathbf{1}_{\{R_n = 1\}}-\mathbf{1}_{\{R_n=0\}}$ which are random variables with mean $q-p> 0$ and the strong law of large numbers implies that their series diverges almost surely, that is to say, for almost every realisation and for every $K > 0,$ there will be eventually $K$ more $R_n = 1$ than $R_n = 0.$ Bearing this in mind it follows that $\bigcap\limits_{\alpha = \beta}^\infty \{X_\alpha > \delta\}$ is a null event, for whatever $\delta > 0$ may be, and therefore, so are all the following: $$\left\{\lim\cdot\inf X_n >0 \right\} \subset \bigcup_{n=1}^\infty \left\{\lim\cdot\inf X_k > \dfrac{1}{n} \right\} \subset \bigcup_{n=1}^\infty \bigcup_{\beta=2n}^\infty \bigcap_{\alpha = \beta}^\infty \{X_\alpha > \dfrac{1}{n} - \dfrac{1}{\beta}\},$$ hence $X_n \to 0$ almost surely. Q.E.D.