Proof Reference For Doob's First Stopping Time Theorem

I am reading through a proof of Doob's First (Bounded) Stopping Time Theorem, and I am not really following it. Can somebody either provide a good reference or prove it here?

For a martingale $X$, and for stopping times $S, T$, with $S \le T \le c$, where $c > 0$ is a uniform bound, we have that $\mathbb{E}[X_T | \mathcal{F}_S] = X_S$.


This is an outline based on the presentation in Durrett's book Probability: Theory and Examples. Note that Durrett doesn't prove this result himself, instead relegating it to the exercises, but I'll try to sketch the main parts.

Here is the outline of the proof. $\newcommand{\E}{\mathbb E}\newcommand{\F}{\mathcal F}$

  0. $X_S,X_T\in L^1$. ($X_T\in L^1$ is needed for $\E[X_T\mid\F_S]$ to be well-defined.)
  1. $\E X_S= \E X_T$.
  2. For $A\in\F_S$ arbitrary, $U = \newcommand{\1}{\mathbb 1}S\1_A + T\1_{A^c}$ is a stopping time.
  3. Using $U$, we will strengthen the conclusion of 1. to $X_S= \E[X_T\mid \F_S]$.

Proof of 0. Since $T\le c$ and $T$ is integer-valued, $T\le \lfloor c\rfloor$; in other words, we may as well assume that $c$ is itself an integer. We can write $X_T$ according to the value of $T$ as $$ X_T = \sum_{i=1}^c X_i\1_{\{T=i\}}.$$ Each variable in the sum is integrable (each $X_i\in L^1$ because $X$ is a martingale), and the sum has finitely many terms since $c<\infty$, so $X_T\in L^1$. The proof for $X_S$ is exactly the same. $\square$
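As a concrete instance of the decomposition (illustrative Python; the simple symmetric random walk and the particular stopping time are my own toy example, not part of the proof), we can check $X_T=\sum_i X_i\1_{\{T=i\}}$ path by path, enumerating every path exactly:

```python
from itertools import product

c = 6  # integer bound, so T takes values in {1, ..., c}

def first_hit(path, level, cap):
    """First n with |X_n| >= level, capped at `cap` -- a bounded stopping time."""
    x = 0
    for n, step in enumerate(path, start=1):
        x += step
        if abs(x) >= level:
            return min(n, cap)
    return cap

def X(path, n):  # X_n = sum of the first n +/-1 steps; X_0 = 0
    return sum(path[:n])

verified = 0
for path in product([-1, 1], repeat=c):
    T = first_hit(path, 3, c)  # a stopping time bounded by c
    # the finite sum  sum_{i=1}^{c} X_i 1{T = i}  picks out X_T
    decomposed = sum(X(path, i) * (1 if T == i else 0) for i in range(1, c + 1))
    if X(path, T) == decomposed:
        verified += 1

print(verified, "of", 2**c, "paths verified")  # prints: 64 of 64 paths verified
```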

Proof of 1. Consider the variable $H_n = \1_{\{S<n\le T\}}$. You can check that $H$ is a previsible process (meaning $H_n$ is $\F_{n-1}$-measurable for every $n$), hence $(H\cdot X)_n$ defined by $$(H\cdot X)_n:=\sum_{m=1}^nH_m(X_m-X_{m-1})$$ is also a martingale. From the definition, you can also see that $(H\cdot X)_n$ telescopes to the simpler expression $$(H\cdot X)_n = X_{n\wedge T}-X_{n\wedge S},$$ where $a\wedge b := \min\{a,b\}$. Since $S\le T\le c$ a.s., taking $n=c$ gives $(H\cdot X)_c = X_T-X_S$, and since $H\cdot X$ is a martingale, $$ \E(X_T-X_S) = \E(H\cdot X)_c = \E(H\cdot X)_0 = 0, $$ so $\E X_T= \E X_S$. $\square$
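To see the telescoping identity and the conclusion $\E X_T = \E X_S$ concretely, here is a sketch (illustrative Python; the walk and the specific stopping times are my own choices, not from the proof) that enumerates every path of a simple symmetric random walk, checks $(H\cdot X)_n = X_{n\wedge T}-X_{n\wedge S}$ for every $n$, and computes $\E(X_T-X_S)$ exactly:

```python
from itertools import product

c = 6  # uniform bound on the stopping times

def first_hit(path, level, cap):
    """First n with |X_n| >= level, capped at `cap` -- a bounded stopping time."""
    x = 0
    for n, step in enumerate(path, start=1):
        x += step
        if abs(x) >= level:
            return min(n, cap)
    return cap

def X(path, n):  # X_0 = 0; X_n = sum of the first n steps (a martingale)
    return sum(path[:n])

total = 0
for path in product([-1, 1], repeat=c):
    S = first_hit(path, 2, 4)  # S <= 4
    T = first_hit(path, 3, 6)  # T <= 6; on every path S <= T
    assert S <= T <= c
    for n in range(c + 1):
        # (H . X)_n with H_m = 1{S < m <= T}
        HX = sum(X(path, m) - X(path, m - 1)
                 for m in range(1, n + 1) if S < m <= T)
        # telescoping identity: (H . X)_n = X_{n ^ T} - X_{n ^ S}
        assert HX == X(path, min(n, T)) - X(path, min(n, S))
    total += X(path, T) - X(path, S)  # = (H . X)_c, since S, T <= c

E_diff = total / 2**c  # exact: every path has probability 2**-c
print(E_diff)  # prints 0.0, i.e. E(X_T - X_S) = 0
```

The choice of $S$ as a capped first-hitting time of level 2 and $T$ of level 3 guarantees $S\le T$ on every path, matching the hypothesis of the theorem.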

Proof of 2. Exercise. (Hint: write $\{U\le n\} = (A\cap\{S\le n\})\cup(A^c\cap\{T\le n\})$ and note that $A\in\F_S$ and $A^c\in\F_S\subseteq\F_T$.)

Proof of 3. By the definition of $\E[X_T\mid \F_S]$, it suffices to show that $\E[X_S;A] = \E[X_T;A]$ for arbitrary $A\in\F_S$, where $\E[X;A]:=\E[X\1_A]$. Let $U$ be the stopping time from 2. Then it is clear that $S\le U\le T$, so applying the result of 1. to the pairs $(S,U)$ and $(U,T)$ gives $$\E X_S = \E X_U = \E X_T.$$ Decomposing each of the three terms as $$ \E X_\ast = \E[X_\ast;A] + \E[X_\ast;A^c], $$ and noting that $\E X_U = \E[X_S;A] + \E[X_T;A^c]$, subtracting the identity for $\E X_T$ gives $\E[X_S;A]=\E[X_T;A]$, which finishes off 3. and the proof of the claim. $\square$
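The whole argument can be checked exactly on a toy example (illustrative Python; the walk, the stopping times, and the event $A=\{X_S>0\}$ are my own choices). Enumerating all paths verifies that the spliced time $U=S\1_A+T\1_{A^c}$ satisfies $\E X_U = \E X_0 = 0$ and that $\E[X_S;A]=\E[X_T;A]$:

```python
from itertools import product

c = 6

def first_hit(path, level, cap):
    """First n with |X_n| >= level, capped at `cap` -- a bounded stopping time."""
    x = 0
    for n, step in enumerate(path, start=1):
        x += step
        if abs(x) >= level:
            return min(n, cap)
    return cap

def X(path, n):  # X_0 = 0; X_n = sum of the first n steps (a martingale)
    return sum(path[:n])

sum_S_A = sum_T_A = sum_U = 0
for path in product([-1, 1], repeat=c):
    S = first_hit(path, 2, 4)
    T = first_hit(path, 3, 6)
    in_A = X(path, S) > 0          # A = {X_S > 0} is F_S-measurable
    U = S if in_A else T           # U = S 1_A + T 1_{A^c}
    sum_U += X(path, U)
    if in_A:
        sum_S_A += X(path, S)
        sum_T_A += X(path, T)

# Exact (integer) totals over all 2**c equally likely paths:
# sum_U = 0 corresponds to E X_U = 0, and sum_S_A = sum_T_A
# corresponds to E[X_S; A] = E[X_T; A].
print(sum_U, sum_S_A, sum_T_A)
```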

Remarks. If $H_n$ is $\F_{n-1}$-measurable for each $n$, then $H$ is sometimes called a predictable sequence instead of a previsible sequence. $H$ is sometimes referred to as a "gambling system": interpret $X_n$ as the net winnings at time $n$ of a (fair) game in which you bet one dollar each round. The process $H\cdot X$ is the $H$-transform of $X$ (or the Doob transform of $X$ by $H$, or a discrete stochastic integral of $X$, or ...), and you can think of $(H\cdot X)_n$ as the net winnings at time $n$ if you bet $H_m$ dollars in round $m$. From this perspective, cooking up stopping times to include in the definition of $H$ is very natural: you can imagine devising your bet for each round based on the information you have gained by the end of the previous round.
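The "you can't beat a fair game" content of the martingale-transform result can also be seen directly (illustrative Python; the doubling-style betting rule is a hypothetical example of mine, not from the text). Any previsible betting rule, however clever, has expected winnings exactly $0$ against a fair coin-flip walk, which we verify by exact enumeration:

```python
from itertools import product

n = 8  # horizon; small enough to enumerate all 2**n paths exactly

def transform(path, bets_rule):
    """Compute (H . X)_n where H_m = bets_rule(path[:m-1]) is F_{m-1}-measurable."""
    winnings = 0
    for m in range(1, len(path) + 1):
        H_m = bets_rule(path[:m - 1])  # the bet may use only the first m-1 flips
        winnings += H_m * path[m - 1]  # X_m - X_{m-1} = path[m-1]
    return winnings

def doubling_bettor(history):
    """Hypothetical rule: double the bet after each loss in the current streak."""
    losses = 0
    for step in reversed(history):
        if step == -1:
            losses += 1
        else:
            break
    return 2 ** min(losses, 5)

total = sum(transform(path, doubling_bettor) for path in product([-1, 1], repeat=n))
E_winnings = total / 2**n
print(E_winnings)  # prints 0.0: the system has zero expected gain
```

For each fixed prefix of flips the bet $H_m$ is determined, and the two continuations $\pm1$ at round $m$ cancel, so the total is $0$ exactly, not just approximately.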