Gambling Game: Martingales


This is a multipart question; if there's a strong preference for breaking this into separate questions I'll do that.

Imagine a game between a gambler and a croupier.

Total capital in the game is $1$.

After the $n$-th hand of the game, the fraction of capital held by the gambler is $X_{n}\in[0,1]$, and the fraction held by the croupier is $1-X_{n}$.

Assume $X_{0}=p\in(0,1)$.

Rules of the game: after $n$ hands, the probability that the gambler wins the $(n+1)$-th hand is $X_{n}$. If he wins, he receives half the capital the croupier held after the $n$-th hand; if he loses, he gives the croupier half of his own capital. Let $$\mathcal{F}_{n}=\sigma\left(X_{i},1\leq i \leq n\right)$$


Questions:

How do I show that $\left(X_{n}\right)_{n\geq 0}$ is an $\left(\mathcal{F}_{n}\right)_{n\geq 0}$-martingale converging to a limit $Z$ both a.s. and in $L^2$?

Also I need to show that: $$\mathbb E[X_{n+1}^{2}]=\frac{\mathbb E[3X_{n}^2+X_n]}{4}$$

and $$\mathbb E[Z^2]=\mathbb E[Z]=p$$

For any $n\geq 0$, let $Y_n=2X_{n+1}-X_{n}$. Find the conditional law of $X_{n+1}$ given $\mathcal{F}_{n}$, and prove that $$\mathbb P(Y_n=0\mid\mathcal{F}_n)=1-X_n$$

$$\mathbb P(Y_n=1\mid\mathcal{F}_n)=X_n$$

And express the law of $Y_n$.

Let $G_n=\{Y_n=1\}$ and $P_n=\{Y_n=0\}$. Prove that $Y_n\rightarrow Z$ a.s., and deduce that $$\mathbb P(\liminf_{n\rightarrow\infty}G_n)=p$$

$$\mathbb P(\liminf_{n\rightarrow\infty}P_n)=1-p$$

Lastly, are the variables $\{Y_n,n\geq 1\}$ independent? I would appreciate any and all solutions or hints.

Best answer:

You've asked for an exhaustive treatment of this martingale; I've attempted to cover it all below.

There is a preliminary point to make before pursuing any properties. Let $A_n$ denote the event that the gambler wins the $n$-th hand. The way in which I interpret your rules is $$ X_{n+1} = \begin{cases} \frac{1}{2}(1 + X_n) & \text{on } A_{n+1},\\ \frac{1}{2} X_n & \text{on } A_{n+1}^c, \end{cases} $$ with $P(A_{n+1} \mid \mathcal{F}_n) = X_n$. This last line is important, and says literally that the gambler wins hand $n+1$ with likelihood $X_n$, when $X_n$ is known.
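To make the dynamics concrete, here is a small simulation sketch of the update rule above (the helper name `play_hands` and the specific parameters are my own choices, not part of the question):

```python
import random

def play_hands(p, n, rng):
    """Simulate n hands, returning [X_0, X_1, ..., X_n] (hypothetical helper)."""
    xs = [p]
    x = p
    for _ in range(n):
        if rng.random() < x:        # gambler wins hand n+1 with probability X_n
            x = 0.5 * (1.0 + x)     # he takes half the croupier's capital
        else:
            x = 0.5 * x             # he hands over half of his own capital
        xs.append(x)
    return xs

path = play_hands(0.3, 50, random.Random(0))
```

Plotting a few such paths shows them drifting toward $0$ or $1$, which foreshadows the behavior of the limit $Z$ below.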

  1. $X_n$ is a martingale. Since $0 \leq X_n \leq 1$, integrability is immediate, and $X_n$ is $\mathcal{F}_n$-measurable by construction, so it remains to check the conditional expectation identity. We compute $$ E[X_{n+1} \mid \mathcal{F}_n] = E[X_{n+1} I_{A_{n+1}} \mid \mathcal{F}_n] + E[X_{n+1} I_{A_{n+1}^c} \mid \mathcal{F}_n] \\ = E[\frac{1}{2}(1 + X_n) I_{A_{n+1}} \mid \mathcal{F}_n] + E[\frac{1}{2}X_n I_{A_{n+1}^c} \mid \mathcal{F}_n] \\ = \frac{1}{2}(1 + X_n) P(A_{n+1} \mid \mathcal{F}_n) + \frac{1}{2} X_n P(A_{n+1}^c \mid \mathcal{F}_n) \\ = \frac{1}{2}(1 + X_n) X_n + \frac{1}{2} X_n(1 - X_n) = X_n $$ where from the second to the third line I used the rule 'taking out what is known'. This is the martingale property, as desired.
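The martingale property can be sanity-checked numerically: taking expectations in $E[X_{n+1}\mid\mathcal{F}_n]=X_n$ gives $E[X_n]=p$ for every $n$. A Monte Carlo sketch (the helper name, seed, and sample size are arbitrary choices of mine):

```python
import random

def run_chain(p, n, rng):
    # hypothetical helper: one realization of X_n after n hands
    x = p
    for _ in range(n):
        x = 0.5 * (1.0 + x) if rng.random() < x else 0.5 * x
    return x

rng = random.Random(42)
p, n, trials = 0.3, 10, 100_000
mean = sum(run_chain(p, n, rng) for _ in range(trials)) / trials
# mean should be close to p = 0.3 up to Monte Carlo error
```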

2. $E[X_{n+1}^2] = \frac{E[3 X_n^2 + X_n]}{4}$.

I'll prove this in a stronger, conditional form: $$ E[X_{n+1}^2 \mid \mathcal{F}_n] = E[X_{n+1}^2 I_{A_{n+1}} \mid \mathcal{F}_n] + E[X_{n+1}^2 I_{A_{n+1}^c} \mid \mathcal{F}_n] \\ = E[\frac{1}{4} (1 + X_n)^2 I_{A_{n+1}} \mid \mathcal{F}_n] + E[\frac{1}{4} X_n^2 I_{A_{n+1}^c} \mid \mathcal{F}_n] \\ = \frac{1}{4} (1 + X_n)^2 X_n + \frac{1}{4} X_n^2(1 - X_n) \\ = \frac{3 X_n^2 + X_n}{4} $$ From the second to the third line, I took out what is known and plugged in $P(A_{n+1} \mid \mathcal{F}_n) = X_n$ in one shot. The result you want now follows by taking expectations.
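Because $X_n$ takes at most $2^n$ values, the identity $E[X_{n+1}^2] = \frac{E[3X_n^2 + X_n]}{4}$ can be verified exactly for small $n$ by propagating the full law of $X_n$. A sketch (the helper name `step_law` is mine):

```python
def step_law(law):
    """Given {value: prob} for X_n, return the exact law of X_{n+1}."""
    nxt = {}
    for x, pr in law.items():
        up, down = 0.5 * (1.0 + x), 0.5 * x
        nxt[up] = nxt.get(up, 0.0) + pr * x            # win: probability X_n
        nxt[down] = nxt.get(down, 0.0) + pr * (1 - x)  # loss: probability 1 - X_n
    return nxt

p = 0.3
law = {p: 1.0}
for _ in range(8):
    e1 = sum(x * pr for x, pr in law.items())           # E[X_n]
    e2 = sum(x * x * pr for x, pr in law.items())       # E[X_n^2]
    law = step_law(law)
    e2_next = sum(x * x * pr for x, pr in law.items())  # E[X_{n+1}^2]
    assert abs(e2_next - (3.0 * e2 + e1) / 4.0) < 1e-12
```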

3. $X_n$ converges almost surely and in $L^2$ to a random variable $Z$ for which $E[Z] = E[Z^2] = p$.

Note that your martingale is nonnegative and bounded above by $1$, hence in particular bounded in $L^2$. Therefore the martingale convergence theorem says that $X_n$ converges almost surely and in $L^2$. This is well-known and general and I won't discuss the theorem any further, except to point you to wikipedia.

Now $X_n$ is also uniformly integrable, being bounded uniformly in $n$ by the constant function $1$. Therefore the bounded convergence theorem gives $$ E[X_n] \rightarrow E[Z] $$ However $X_n$ is a martingale, hence $E[X_n] = p$ for all $n$, and $E[Z] = p$ as desired.

To show that $E[Z^2] = p$ we'll have to be more clever. Iterating the identity proved in item 2 (together with the martingale property $E[X_k \mid \mathcal{F}_m] = X_m$), it is possible to show that for all $m < n$, $$ E[X_n^2 \mid \mathcal{F}_m] = \left(\frac{3}{4}\right)^{n-m} X_m^2 + \frac{1}{4} \left( 1 + \frac{3}{4} + \cdots + \left( \frac{3}{4}\right)^{n-m-1}\right) X_m $$ First let $m = 0$, then take the limit as $n \rightarrow \infty$. We obtain $$ E[Z^2] = \lim_n E[X_n^2] = 0 + \frac{p}{4} \sum_{k = 0}^{\infty} \left(\frac{3}{4} \right)^k = \frac{p}{4} \cdot 4 = p $$ using the $L^2$ convergence from item 3 (or bounded convergence, depending on your mood today).
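Taking expectations in item 2 gives the scalar recursion $E[X_{n+1}^2] = \frac{3}{4}E[X_n^2] + \frac{1}{4}p$, whose unique fixed point is $p$. Iterating it numerically illustrates the convergence $E[X_n^2] \to p$ claimed above (the value $p = 0.3$ is my arbitrary example):

```python
p = 0.3
m = p * p                     # E[X_0^2] = p^2, since X_0 = p is constant
for _ in range(200):
    m = (3.0 * m + p) / 4.0   # E[X_{n+1}^2] = (3 E[X_n^2] + p) / 4
# m is now equal to p up to an error of order (3/4)^200
```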

4. With $Y_n = 2 X_{n+1} - X_n$, find the law of $Y_n$, show that $Y_n \rightarrow Z$ almost surely, and compute $P(\liminf_n G_n) = p, P(\liminf_n P_n) = 1-p$.

You should realize by now that $\{Y_n = 1\} = A_{n+1}$ is precisely the event that the gambler wins the $(n+1)$-th hand. To find the law, recall that $$ P(A_{n+1} \mid \mathcal{F}_n) = X_n \\ P(A_{n+1}) = E[X_n] = p $$ by the martingale property. This is the law of $Y_n$: a biased coin toss landing $1$ with probability $p$.

That $Y_n \rightarrow Z$ almost surely follows immediately from $X_n \rightarrow Z$, since $Y_n = 2X_{n+1} - X_n \rightarrow 2Z - Z = Z$.

Notice now that $Y_n$ is $\{0,1\}$-valued, and so $Z$ is almost surely $\{0,1\}$-valued as well. Realize also that $\liminf_n G_n$ is the event "always winning after a certain time" and $\liminf_n P_n$ is the event "always losing after a certain time", so we see that $$ \liminf_n G_n \subset \{Z = 1\} \\ \liminf_n P_n \subset \{Z = 0\} $$ Now we'll do something cheap: $$ E[Z] = p = 1 \cdot P(Z = 1) + 0 \cdot P(Z = 0) = P(Z = 1) $$ What remains is to argue that $\liminf_n G_n$ has full measure inside $\{Z = 1\}$. To see this, think about what it would mean if $Y_n = 0$ infinitely often on a positive-measure subset of $\{Z = 1\}$: there $Y_n$ could not converge to $1 = Z$, a contradiction. Hence $P(\liminf_n G_n) = P(Z = 1) = p$, and the symmetric argument gives $P(\liminf_n P_n) = P(Z = 0) = 1-p$, which finishes item 4.
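The conclusion $P(Z = 1) = p$ and $P(Z = 0) = 1-p$ can also be seen numerically: after many hands almost every path sits near $0$ or near $1$, and the fraction near $1$ is about $p$. A Monte Carlo sketch (helper name, seed, and sample size are my own choices):

```python
import random

def final_x(p, n, rng):
    # hypothetical helper: the value of X_n after n hands
    x = p
    for _ in range(n):
        x = 0.5 * (1.0 + x) if rng.random() < x else 0.5 * x
    return x

rng = random.Random(7)
p, n, trials = 0.3, 200, 20_000
finals = [final_x(p, n, rng) for _ in range(trials)]
near_one = sum(1 for x in finals if x > 0.5) / trials
# near_one should be close to p = 0.3 up to Monte Carlo error
```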

5. Are the $Y_n$ independent?

No. This is a bit counterintuitive, but that's life. You can compute $E[Y_0Y_1]$ by hand and see that it's not equal to $p^2$: consider $E[Y_0Y_1 \mid \mathcal{F}_1] = Y_0 X_1$ and then take expectations.
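The suggested computation can be carried out in closed form: $E[Y_0Y_1] = E[Y_0 X_1]$, and on $\{Y_0 = 1\}$ (an event of probability $p$) we have $X_1 = \frac{1+p}{2}$, so $E[Y_0Y_1] = \frac{p(1+p)}{2}$, which differs from $p^2$ for every $p \in (0,1)$. A tiny numerical check of this arithmetic with my example value $p = 0.3$:

```python
p = 0.3
# P(Y_0 = 1, Y_1 = 1): Y_0 = 1 with probability p, and on that event
# X_1 = (1 + p)/2, so Y_1 = 1 with conditional probability (1 + p)/2.
joint = p * (1.0 + p) / 2.0      # = 0.195
independent_guess = p * p        # = 0.09, what independence would predict
```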