Let $(\Omega, \mathcal{F},(\mathcal{F}_t)_{t=0,1,\ldots,T},P)$ be a filtered probability space. Let $X=(X_t)_{t=0,1,\ldots,T}$ be an adapted process (with respect to $\mathcal{F}_t=\sigma(X_0,\ldots,X_t)$) that is integrable (i.e. $E(|X_t|)<\infty$ for $t=0,1,\ldots,T$). What is the largest stopping time $\tau: \Omega \to \{0,1,\ldots,T\}$ such that the stopped process $X^{\tau}$ is a submartingale?
My ideas
- I know that the stopped process is also adapted
- I know about the Optional Sampling theorem.
- Here "largest" stopping time means: for every other stopping time $\tau'$ for which $X^{\tau'}$ is a submartingale, we have $P(\tau'\leq \tau)=1$, i.e. $\tau'\leq\tau$ almost surely.
I was given a hint elsewhere to look at the Doob decomposition theorem. But looking at the Doob decomposition, how can I derive the largest stopping time?
$\require{cancel}$ Hint: By Doob's decomposition, we know that for all times $t$ we can write $$X_t = M_t + A_t,$$
where $M$ is a martingale and $A$ is integrable, predictable, and satisfies $A_0 = 0$. Now let $\tau$ be any stopping time: by definition, $X^\tau$ is a submartingale if and only if $\mathbb E[X_{(t+1)\wedge\tau}\mid \mathcal F_t]\ge X_{t\wedge\tau}$ for all $t$, that is, if and only if
$$\mathbb E[M_{(t+1)\wedge\tau}\mid \mathcal F_t] + \mathbb E[A_{(t+1)\wedge\tau}\mid \mathcal F_t] = \cancel{M_{t\wedge\tau}}+ \mathbb E[A_{(t+1)\wedge\tau}\mid \mathcal F_t]\ge \cancel{M_{t\wedge\tau}} + A_{t\wedge\tau}, \ \ \forall t\in\{ 0,\ldots,T-1 \},$$
where we have used linearity of conditional expectation and the optional stopping theorem (the stopped process $M^\tau$ is again a martingale, which is what lets us cancel the $M$ terms).
Now, because $A$ is predictable, $A_s$ is $\mathcal F_t$-measurable for all $s\le t+1$ (and hence so is $A_{(t+1)\wedge\tau}$, since $\{\tau=s\}\in\mathcal F_t$ for $s\le t$ and $\{\tau\ge t+1\}\in\mathcal F_t$). This implies that the above necessary and sufficient condition for $X^\tau$ to be a submartingale can equivalently be restated as $$ A_{(t+1)\wedge \tau} \ge A_{t\wedge\tau}, \ \ \forall t\in\{ 0,\ldots,T-1 \}. \tag1 $$
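To see the decomposition concretely, here is a minimal numerical sketch (a hypothetical toy model, not part of the original problem): when $X$ is a sum of independent steps with known means, the predictable part $A$ is simply the running sum of those means, and $M = X - A$ is the martingale part.

```python
import numpy as np

# Hypothetical toy model: X_t = step_1 + ... + step_t, where the steps are
# independent with known means mu_1, ..., mu_T.  Then
# E[X_{t+1} - X_t | F_t] = mu_{t+1}, so the Doob decomposition has
# predictable part A_t = mu_1 + ... + mu_t (here even deterministic)
# and martingale part M_t = X_t - A_t.
rng = np.random.default_rng(0)

mu = np.array([0.5, -0.2, 0.3, -0.4, 0.1])        # step means for t = 1..T
steps = mu + rng.normal(0.0, 1.0, size=mu.shape)  # one sample path of the steps
X = np.concatenate([[0.0], np.cumsum(steps)])     # X_0 = 0
A = np.concatenate([[0.0], np.cumsum(mu)])        # predictable part, A_0 = 0
M = X - A                                         # martingale part
```

Note that $A$ here is nondecreasing exactly on the stretches where the means $\mu_t$ are nonnegative, which is what condition $(1)$ will pick up on.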
Can you find a stopping time which makes this condition hold ? (And can you check that it is indeed the largest such stopping time ?)
Added: OK, let me give a solution based on my hint above and what I wrote in the comments. But before that, here is another (hopefully) "intuitive" explanation of how $\tau$ may be found: Doob's theorem tells us that we can write $X$ as the sum of a martingale $M$ and a predictable process $A$. If we want $X^\tau$ to be a submartingale, that means we want $X^\tau$ to be nondecreasing "in expectation". However, $M$ is "constant" in expectation, so really, the only way to make $X^\tau$ a submartingale is to pick $\boldsymbol\tau$ such that $A^\tau$ is almost surely nondecreasing.
Well, so far it seems like I have just restated $(1)$ in a very imprecise fashion, but think about it: what stopping time (which, at time $t$, may use all the values $A_0,\ldots,A_{t+1}$, since $A$ is predictable) can guarantee that $A^\tau$ is nondecreasing? That would precisely be "the first instant $t$ at which the sequence $(A_t)$ starts to decrease", or equivalently "the last instant $t$ up to which $(A_t)$ is nondecreasing". Yes, it is really that simple! More formally, we can let $\tau$ be defined as
$$\tau(\omega) := \begin{cases}\min\{t\in\{0,\ldots, T-1\} : A_{t+1}(\omega)< A_t(\omega)\} & \text{if this set is nonempty},\\ T & \text{otherwise}.\end{cases}$$ (The case split is purely a technicality, due to the fact that $A_{T+1}$ is not defined.)
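To make the definition concrete, here is a small Python sketch that computes $\tau(\omega)$ from one realized path $(A_0(\omega),\ldots,A_T(\omega))$ of the predictable part (the sample path used below is purely illustrative):

```python
def largest_stopping_time(A):
    """Return the first t in {0, ..., T-1} with A[t+1] < A[t], or T if the
    path A = (A_0, ..., A_T) is nondecreasing.

    Because A is predictable, A[t+1] is F_t-measurable, so deciding whether
    tau = t only uses information available at time t, i.e. this rule
    defines a genuine stopping time.
    """
    T = len(A) - 1
    for t in range(T):
        if A[t + 1] < A[t]:
            return t
    return T

# Illustrative path: A first decreases between t = 2 and t = 3, so tau = 2,
# and the stopped path (A_0, A_1, A_2, A_2, A_2) is nondecreasing.
tau = largest_stopping_time([0.0, 0.3, 0.6, 0.4, 0.9])  # -> 2
```

Stopping at $\tau$ freezes $A^\tau$ just before its first decrease, which is exactly what condition $(1)$ requires.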
If you followed the above explanation, you should have an intuitive idea of why $\tau$ solves our problem, but we still need to check it formally. First, $\tau$ is a stopping time: since $A$ is predictable, the event $\{\tau = t\} = \{A_1\ge A_0,\ldots, A_t\ge A_{t-1}, A_{t+1}< A_t\}$ belongs to $\mathcal F_t$. Second, $(1)$ holds for $\tau$: for $t<\tau$ we have $A_{t+1}\ge A_t$ by the definition of $\tau$, and for $t\ge\tau$ both sides of $(1)$ equal $A_\tau$. Finally, $\tau$ is the largest such stopping time: if $\tau'$ is a stopping time with $P(\tau'>\tau)>0$, then some event $\{\tau=t,\ \tau'>t\}$ with $t\le T-1$ has positive probability, and on that event $A_{(t+1)\wedge\tau'}=A_{t+1}< A_t = A_{t\wedge\tau'}$, so $(1)$ fails for $\tau'$ and $X^{\tau'}$ is not a submartingale.
Hope that helps !