How do the authors conclude ${Y_{S}} =\phi_{N}\left({X_{S}}\right)$ in the proof of the Optional Sampling Theorem?


I'm reading about martingales and stopping times in my lecture notes. The authors first state a proposition and a corollary:

[Images: Proposition 33 and Corollary 66 from the lecture notes]

and then the proposition of interest:

[Image: the proposition of interest and its proof]

The authors say that $\mathbb{E}\left[\phi_{N}\left(X_{0}\right)\right] \le \mathbb{E}\left[\phi_{N}\left({X_{S}}\right)\right]$ is obtained by applying Corollary 66 to the stopping times $0$ and $S$.

My understanding: Let $Y_n = \phi_N (X_n)$. By Proposition 33(ii), $(Y_n)$ is a bounded sub-martingale. By Corollary 66, $(Y_0, Y_S)$ is a sub-martingale w.r.t. $(\mathcal F_0, \mathcal F_S)$. Hence $\mathbb E [Y_0] \le \mathbb E [Y_S]$, or equivalently $\mathbb{E}\left[\phi_{N}\left(X_{0}\right)\right] \le \mathbb E [{Y_{S}}]$.

My question: How can the authors conclude that ${Y_{S}} =\phi_{N}\left({X_{S}}\right)$?

Many thanks!


${Y_{S}} =\phi_{N}\left({X_{S}}\right)$ is indeed correct, by the following pointwise reasoning. Since $S$ is almost surely finite, $\sum_n \mathbf 1_{\{S=n\}} = 1$ a.s., and on the event $\{S=n\}$ we have $Y_S = Y_n$ and $X_S = X_n$. Hence
$$\begin{aligned} Y_S &= Y_S\sum_{n} \mathbf 1_{\{S=n\}}\\ &= \sum_{n} Y_S \mathbf 1_{\{S=n\}}\\ &= \sum_{n} Y_n \mathbf 1_{\{S=n\}} \\&= \sum_n \phi_N(X_n) \mathbf 1_{\{S=n\}} \\ &= \sum_n \phi_N(X_S) \mathbf 1_{\{S=n\}} \\&=\phi_N(X_S) \sum_n \mathbf 1_{\{S=n\}} \\ &= \phi_N (X_S). \end{aligned}$$
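Concretely, the identity just says that applying $\phi_N$ commutes with evaluating the path at the random time $S$: on each sample path, $Y_S$ picks out the value $Y_n = \phi_N(X_n)$ at the realized time $n = S(\omega)$. A minimal simulation sketch of this (the choice of $\phi_N$ as truncation to $[-N, N]$, the symmetric random walk, and the capped hitting time are all illustrative assumptions, not taken from the notes):

```python
import random

random.seed(0)

N = 5
T = 50  # horizon; we cap S at T so it is finite on every simulated path

def phi_N(x):
    # illustrative choice of phi_N: truncation to [-N, N]
    return max(-N, min(N, x))

for trial in range(1000):
    # simple symmetric random walk X_0, ..., X_T
    X = [0]
    for _ in range(T):
        X.append(X[-1] + random.choice([-1, 1]))
    # stopping time: first time the walk hits +3, capped at T
    S = next((n for n, x in enumerate(X) if x == 3), T)
    # Y_n = phi_N(X_n); the stopped value Y_S equals phi_N evaluated at X_S
    Y = [phi_N(x) for x in X]
    assert Y[S] == phi_N(X[S])
```

The assertion holds on every path for the same reason as the sum-over-$\{S=n\}$ argument above: $S$ selects a single index $n$, and at that index $Y_n$ is by definition $\phi_N(X_n)$.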