Gambling Game martingale


State the optional sampling theorem for martingales and bounded stopping times.

You start with a capital of £100 and bet repeatedly on the toss of a coin. On each toss you may bet any whole number of pounds up to your current capital. If you win, you get back twice your bet and if you lose, nothing. Thus, if $X_1, X_2, \dots$ are IID random variables with $$P(X_i= 1)= P(X_i= -1)=\frac{1}{2}$$ and the amount $a_k = a_k(X_1,\dots,X_{k-1})$ that you bet at time $k$ depends only on $X_1, \dots, X_{k-1}$, then your capital at time $n$ is

$$C_n = 100+\sum_{k=1}^{n} 2a_k(X_1,\dots,X_{k-1})X_k.$$

Show that $C_n$ is a martingale with respect to the filtration $\sigma(X_1),\sigma(X_1, X_2 ),...$

Your aim is to achieve a fortune of £1000 before going bust. Let $T$ be the first time that your capital either exceeds £1000 or drops to £0. Show that $T$ is a stopping time with respect to the same filtration.

You may assume that $T$ is finite almost surely. Show that for each $n$, $$EC_{T\wedge n}= 100$$ and deduce that $$EC_T = 100.$$ Show that your probability $p$ of achieving your aim satisfies

$$p\leq\frac{1}{10}$$

and that $p =\frac{1}{10}$ provided that you never bet more than the gap between your current capital and 1000: i.e. that $a_k \leq 1000 - C_k$ for all $k$.

Explain the relevance of this to your choice of gambling strategy.
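For intuition (not part of the question), the conclusion $p = \frac{1}{10}$ under the bounded-bet condition is easy to check by simulation. Below is a minimal Monte Carlo sketch, assuming the "bold play" strategy $a_k = \min(C_k, 1000 - C_k)$ (which satisfies $a_k \leq 1000 - C_k$) and taking the net change per toss to be $\pm$ the stake, as the verbal payoff rule implies:

```python
import random

def play(target=1000, start=100, rng=random):
    """One run of the game under 'bold play': stake min(capital, target - capital).
    A win returns twice the stake, so the net change per toss is +/- the bet."""
    c = start
    while 0 < c < target:
        bet = min(c, target - c)   # never bet more than the gap to the target
        c += bet if rng.random() < 0.5 else -bet
    return c

rng = random.Random(0)
n = 100_000
results = [play(rng=rng) for _ in range(n)]
p_hat = sum(r == 1000 for r in results) / n   # estimate of p
ec_t = sum(results) / n                       # estimate of E(C_T)
print(p_hat, ec_t)
```

Since $C_T \in \{0, 1000\}$ under this strategy, $EC_T = 100$ forces $p = 100/1000 = \frac{1}{10}$, and the two printed estimates should come out near $0.1$ and $100$.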


I have done the first part, as it's just bookwork about the optional sampling theorem. I have not seen how to deal with sigma-algebras in this manner; I tried googling and that didn't help. Any help is very much appreciated.


Are the $X_i$'s supposed to represent win/loss?

Anyway, if $X_1$ is the result of a coin toss, we have $\sigma(X_1) = \{\emptyset, \Omega, \{T\}, \{H\}\}$, a collection of events in the sample space, so $\sigma(X_1) \neq \{\emptyset, \{-1,1\}, \{-1\}, \{1\}\}$.

Recall that $\sigma(X_1) = \{X_1^{-1}(B) \mid B \in \mathscr{B}\}$ is the collection of preimages of Borel sets under $X_1$ (I like to think of it as the set of events that determine the value of $X_1$, together with $\emptyset$ and $\Omega$).

Thus, $\sigma(X_1) = \{\{\text{win}\},\{\text{loss}\},\Omega,\emptyset\}$.

Since $C_1 = 100 +2a_1(X_0)X_1$, the events on which the value of $C_1$ depends are the same as the ones on which the value of $X_1$ depends. I take $X_0$ to be zero or some constant (so that $\sigma(X_0)$ is the trivial sigma-algebra).

Similarly, $C_2 = 100 +2a_1(X_0)X_1 + 2a_2(X_0,X_1)X_2$ (if $X_0$ is constant, you omit it). $C_2$'s values depend on $X_1, X_2$.

To prove this formally, I guess, you could say that:

For $n=1$,

$C_1 = 100 + 2a_1(X_0)(1)$ for a win and $C_1 = 100 + 2a_1(X_0)(-1)$ for a loss.

If a Borel set $B$ contains $100 + 2a_1(X_0)(1)$ and not $100 + 2a_1(X_0)(-1)$, then $C_1^{-1}(B) = \{\text{win}\}$.

If a Borel set $B$ does not contain $100 + 2a_1(X_0)(1)$ but contains $100 + 2a_1(X_0)(-1)$, then $C_1^{-1}(B) = \{\text{loss}\}$.

If $B$ contains both, $C_1^{-1}(B) = \Omega$.

If $B$ contains neither, $C_1^{-1}(B) = \emptyset$.

Thus, $\sigma(C_1) = \{C_1^{-1}(B) \mid B \in \mathscr{B}\} = \{\{\text{win}\},\{\text{loss}\},\Omega,\emptyset\}$.

Thus $\sigma(C_1) = \sigma(X_1)$, and in particular $\sigma(C_1) \subseteq \sigma(X_1)$, i.e. $C_1$ is measurable with respect to $\sigma(X_1)$.

Can you do the proof for $\sigma(C_2)$ using the definition of a sigma-algebra generated by two random variables, then $\sigma(C_3)$, and so on until you can see the pattern for all $\sigma(C_n)$ (or use induction)?
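To see the claim $\sigma(C_1) = \sigma(X_1)$ concretely, you can enumerate generated sigma-algebras on a small finite sample space. A sketch (with a hypothetical fixed first bet $a_1 = 50$, and the $2a_kX_k$ increments from the question): each sigma-algebra is the set of all unions of the atoms $\{f = v\}$.

```python
from itertools import product, combinations

def sigma(f, omega):
    """Sigma-algebra generated by f on a finite sample space:
    partition omega into atoms {f = v}, then take all unions of atoms."""
    parts = {}
    for w in omega:
        parts.setdefault(f(w), set()).add(w)
    atoms = [frozenset(a) for a in parts.values()]
    return {frozenset().union(*c) for r in range(len(atoms) + 1)
            for c in combinations(atoms, r)}

omega = list(product([1, -1], repeat=2))   # two coin tosses, outcomes +/-1

a1 = 50                                     # hypothetical first bet
X1 = lambda w: w[0]
C1 = lambda w: 100 + 2 * a1 * w[0]         # capital after one toss

# sigma(C1) consists of the same four events as sigma(X1):
# emptyset, {win}, {loss}, omega
assert sigma(C1, omega) == sigma(X1, omega)
assert len(sigma(X1, omega)) == 4
```

The same `sigma` helper applied to $(X_1, X_2)$ gives the full power set of the four outcomes, and $\sigma(C_2)$ is a sub-collection of it, matching the pattern described above.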


To prove the martingale property (which by itself is not enough to prove $(C_n)_{n \in \mathbb{N}}$ is a martingale; you also need adaptedness and integrability), we must show that:

$E(C_n\mid X_1, \dots, X_s) = C_s$ for all $s < n$; since this is a discrete-time process, $s \in \mathbb{N}$.

Btw, $E(C_n|X_1, ..., X_s) = E(C_n|\sigma(X_1, ..., X_s))$.

Anyway,

$$E(C_n\mid X_1, \dots, X_s) = E\left(100+\sum_{k=1}^{n} 2a_k(X_1,\dots,X_{k-1})X_k \,\middle|\, X_1, \dots, X_s\right)$$

$$= 100 + E\left(\sum_{k=1}^{n} 2a_k(X_1,\dots,X_{k-1})X_k \,\middle|\, X_1, \dots, X_s\right)$$

Split the sum into $k = 1$ to $s$ and $k = s + 1$ to $n$, and use linearity of conditional expectation to obtain

$$E\left(\sum_{k=1}^{s} 2a_kX_k \,\middle|\, X_1, \dots, X_s\right) + E\left(\sum_{k=s+1}^{n} 2a_kX_k \,\middle|\, X_1, \dots, X_s\right).$$

For the part from $k = 1$ to $s$, we have

$$E\left(\sum_{k=1}^{s} 2a_kX_k \,\middle|\, X_1, \dots, X_s\right) = \sum_{k=1}^{s} 2a_kX_k$$

(yes, without the expected value), since for $k \leq s$ both $a_k = a_k(X_1,\dots,X_{k-1})$ and $X_k$ are $\sigma(X_1, \dots, X_s)$-measurable.

For the part from $k = s + 1$ to $n$, note that $a_k$ depends on $X_1, \dots, X_{k-1}$, which for $k > s+1$ is not $\sigma(X_1, \dots, X_s)$-measurable, so first condition on the larger sigma-algebra (tower property): for each $k > s$,

$$E(2a_kX_k \mid X_1, \dots, X_s) = E\big(2a_k\,E(X_k \mid X_1, \dots, X_{k-1}) \,\big|\, X_1, \dots, X_s\big) = 0,$$

since $X_k$ is independent of $X_1, \dots, X_{k-1}$ and, as you said, $E(X_k) = 0$. Hence the whole second sum is $0$, and $E(C_n \mid X_1, \dots, X_s) = C_s$.

The above holds whether or not $a_k$ is constant.

Usually whenever you prove things are martingales, there's a measurable part and an independent part.
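Not in the original answer, but the identity just derived can be sanity-checked by exact enumeration: fix a prefix $(X_1,\dots,X_s)$, average $C_n$ over all $2^{n-s}$ equally likely continuations, and compare with $C_s$. A sketch with a hypothetical predictable betting strategy:

```python
from itertools import product
from fractions import Fraction

def capital(xs, bet):
    """C_n = 100 + sum_k 2*a_k*X_k, where a_k = bet(X_1..X_{k-1}) is predictable."""
    c = Fraction(100)
    for k, x in enumerate(xs):
        c += 2 * bet(xs[:k]) * x   # the bet at time k+1 sees only the first k tosses
    return c

# An arbitrary, hypothetical predictable strategy: bet 10 after a win, else 5.
bet = lambda past: 10 if past and past[-1] == 1 else 5

n, s = 4, 2
for prefix in product([1, -1], repeat=s):
    c_s = capital(prefix, bet)
    tails = list(product([1, -1], repeat=n - s))
    avg = sum(capital(prefix + t, bet) for t in tails) / len(tails)
    assert avg == c_s   # E(C_n | X_1,...,X_s) = C_s, exactly
```

Using `Fraction` keeps the arithmetic exact, so the conditional-expectation identity holds with `==` rather than approximately; swapping in any other predictable `bet` function leaves the assertion true, matching the remark that the argument works whether or not $a_k$ is constant.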

If you want, you can check out some of my martingale-related questions last year:

Prove $A_t := W_t^3-3t W_t$ a martingale

Prove X is a martingale

https://quant.stackexchange.com/questions/14955/determine-ew-p-w-q-w-r

https://quant.stackexchange.com/questions/14956/show-that-eb-t-mathscrf-s-b-s