I am just learning about martingales and discrete stochastic integrals. For completeness, we define a discrete stochastic integral as $$Y_n=\sum_{k=1}^nC_k(X_k-X_{k-1})=(C\bullet X)_n$$ for $(X_n)$ adapted and $C_n$ previsible in this context.
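For concreteness, here is a small numerical sketch of that definition (the function name and array conventions are my own, not from the text):

```python
import numpy as np

def discrete_stochastic_integral(C, X):
    """Y_n = sum_{k=1}^n C_k (X_k - X_{k-1}), returned for n = 0, ..., N.

    X holds the sample path (X_0, ..., X_N); C holds (C_1, ..., C_N).
    """
    X, C = np.asarray(X, dtype=float), np.asarray(C, dtype=float)
    # C_k multiplies the increment X_k - X_{k-1}; Y_0 is the empty sum, 0.
    return np.concatenate(([0.0], np.cumsum(C * np.diff(X))))
```

For example, with the path $X = (0, 1, 3)$ and integrand $C = (2, 1)$ this returns $(0, 2, 4)$.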
An exercise from this section says we can use a discrete stochastic integral to show that if $(X_n)$ is a submartingale and $S, T$ are a.s. bounded stopping times with $S\le T$, then $E(X_S)\le E(X_T)$.
Because of the boundedness of the stopping times, I thought this looked more like an optional stopping problem, but the exercise clearly states that this can be shown using a suitable discrete stochastic integral.
Does someone know how the stochastic integral relates to this problem?
EDIT: The only result I already have for the discrete stochastic integral is if $C_n\ge 0$ then $(C\bullet X)_n$ is a submartingale as well. Perhaps I need to define a discrete stochastic integral and then apply optional stopping to that?
The result for the discrete stochastic integral that you mention is exactly what is required. Furthermore, the required stochastic integral is easy to construct.
Let's take two a.s. bounded stopping times $S \leq T$. There is an implicit filtration with respect to which $(X_n)$ is adapted and a submartingale; I will call it $(\mathcal F_n)$.
First, note that $X_S$ and $X_T$ are well-defined random variables because $S$ and $T$ are stopping times. Since $T$ is a.s. bounded, there is a natural number $M>0$ such that $T \leq M$ a.s., and because of this we know that $|X_T|= \sum_{i=0}^M 1_{\{T = i\}} |X_i|$ a.s. The RHS is integrable by linearity, with integral at most $\sum_{i=0}^M E[|X_i|] < \infty$, since each $X_i$ is part of a submartingale and hence integrable by definition. Therefore $E[|X_T|]< \infty$, so $E[X_T]$ is well-defined and finite.
In a similar fashion, $E[|X_S|] < \infty$.
Now, as in the proof of the optional stopping theorem, we introduce a $t_0 > 0$ and first prove $E[X_{T \wedge t_0}] \geq E[X_{S \wedge t_0}]$, before taking $t_0 \to \infty$ and ensuring things work out.
To do this, we prove that $X_{T \wedge t_0} - X_{S \wedge t_0}$ equals a discrete stochastic integral of $X_n$ against a non-negative previsible process. By the result you quoted, it is then a submartingale, and we can use the fact that the expectations of a submartingale are non-decreasing.
But such a submartingale is easy to create: let $C_n = 1_{\{T \geq n\}} - 1_{\{S \geq n\}}$. Then $\sum_{i=1}^{t_0} C_i (X_{i}-X_{i-1}) = X_{T \wedge t_0} - X_{S \wedge t_0}$ by an easy $\omega$-by-$\omega$ telescoping check. $C_n$ is non-negative a.s. since $S \leq T$, and $C_n$ is previsible since $\{T \geq n\} = \{T \leq n-1\}^c$ and $\{S \geq n\} = \{S \leq n-1\}^c$ are $\mathcal F_{n-1}$-measurable, and $C_n$ is just a linear combination of their indicators.
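To see the $\omega$-by-$\omega$ check in action, here is a quick numerical sanity check on a single arbitrary path (the identity is purely algebraic, so no martingale structure is needed; the fixed integers below stand in for the values $S(\omega) \leq T(\omega)$ at one $\omega$):

```python
import numpy as np

rng = np.random.default_rng(0)
t0 = 10
# An arbitrary path X_0, ..., X_{t0} for one fixed omega.
X = np.concatenate(([0.0], np.cumsum(rng.normal(size=t0))))
S, T = 3, 7                                   # values S(omega) <= T(omega)

# C_n = 1_{T >= n} - 1_{S >= n}, for n = 1, ..., t0.
C = np.array([(T >= n) - (S >= n) for n in range(1, t0 + 1)])
lhs = np.sum(C * np.diff(X))                  # the stochastic integral up to t0
rhs = X[min(T, t0)] - X[min(S, t0)]           # X_{T ∧ t0} - X_{S ∧ t0}
assert np.isclose(lhs, rhs)
```

Here $C_n = 1$ exactly for $S < n \leq T$, so the sum telescopes to $X_{T \wedge t_0} - X_{S \wedge t_0}$.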
Therefore, the conclusion is that $Y_{t_0} = X_{T \wedge t_0} - X_{S \wedge t_0}$ is a well-defined submartingale with $Y_0 = 0$. But for a submartingale it is well known that the sequence $E[Y_n]$ is non-decreasing. Therefore $0 = E[Y_0] \leq E[Y_{t_0}]$, which gives us that $E[X_{T \wedge t_0}] \geq E[X_{S \wedge t_0}]$ for all $t_0$.
Finally, note that as $t_0 \to \infty$, the a.s. boundedness of $S$ and $T$ implies that for $t_0 > M$ we have $T \wedge t_0 = T$ and $S \wedge t_0 = S$ a.s., so the pointwise limits $X_{T \wedge t_0} \to X_T$ and $X_{S \wedge t_0} \to X_S$ hold a.s.
It remains to apply the dominated convergence theorem, which is justified since $T \wedge t_0 \leq M$ a.s. for every $t_0$, so $|X_{T \wedge t_0}| \leq \sum_{i=0}^M |X_i|$ a.s., and the latter has finite expectation. Similarly for $S$. We conclude that $E[X_T] \geq E[X_S]$.
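As a final sanity check, a small Monte Carlo experiment with a concrete submartingale (a random walk with positive drift) and concrete bounded stopping times of my own choosing illustrates the conclusion $E[X_S] \leq E[X_T]$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, M = 100_000, 20
# Submartingale: random walk with drift +0.1 per step, X_0 = 0.
X = np.concatenate(
    [np.zeros((n_paths, 1)),
     np.cumsum(rng.normal(loc=0.1, size=(n_paths, M)), axis=1)],
    axis=1)

# S = first time the path reaches level 1, capped at 10 (a bounded stopping
# time); T = S + 5, capped at M, so that S <= T <= M a.s.
reached = (X >= 1.0).any(axis=1)
S = np.minimum(np.where(reached, np.argmax(X >= 1.0, axis=1), M), 10)
T = np.minimum(S + 5, M)

idx = np.arange(n_paths)
print(X[idx, S].mean(), "<=", X[idx, T].mean())  # empirical E[X_S] <= E[X_T]
```

With 100,000 paths the gap is far larger than the Monte Carlo error, so the inequality shows up clearly in the two printed means.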
In continuous time, more regularity must be imposed on both the filtration and the process itself to obtain a result of this kind, but it does hold.