I read a theorem stating that an adapted, increasing process with (locally) integrable variation is a (local) submartingale (here increasing includes right continuity).
Definition: A process is $\textbf{increasing}$ if each sample path is increasing and right continuous.
Definition: A process has $\textbf{finite variation}$ if it is the difference of two increasing processes.
Definition: Let $X$ be a process with finite variation. We define the variation process $V_X$ by setting $V_X(t,\omega)$ to be the total variation of $X(\cdot, \omega)$ on $[0,t]$.
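Unwinding this definition in the one case that matters below: when $X(\cdot,\omega)$ is increasing, every partition sum $\sum_i |X_{t_{i+1}} - X_{t_i}|$ telescopes, so the variation is just the net increase:
$$V_X(t,\omega) = X(t,\omega) - X(0,\omega).$$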
Definition: A process $X$ is said to be $\textbf{integrable}$ if $\sup_{t\geq 0} \mathbb{E}|X_t|<\infty$.
Note: An increasing, nonnegative process $X$ is integrable if and only if $X_{\infty} := \lim_{t \to \infty} X_t$ exists a.s. and is an integrable random variable.
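This note is, I believe, just monotone convergence written out: for an increasing, nonnegative $X$ the limit $X_\infty$ always exists a.s. in $[0,\infty]$, and
$$\sup_{t \geq 0} \mathbb{E}[X_t] = \lim_{t \to \infty} \mathbb{E}[X_t] = \mathbb{E}\Big[\lim_{t \to \infty} X_t\Big] = \mathbb{E}[X_\infty],$$
so the supremum is finite exactly when $X_\infty$ is integrable.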
Definition: A finite variation process $X$ is said to have $\textbf{integrable variation}$ if $V_X$ is integrable.
Definition: A finite variation process $X$ is said to have $\textbf{locally integrable variation}$ if there exists a sequence of stopping times $(T_n)_{n \in \mathbb{N}}$ such that $T_n \nearrow \infty$ a.s. and, for each $n \in \mathbb{N}$, $X^{T_n}$ has integrable variation.
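For concreteness, $X^{T_n}$ denotes the stopped process $X^{T_n}_t := X_{t \wedge T_n}$. Since a stopped path is constant after time $T_n$, its variation on $[0,t]$ equals the variation of the original path on $[0, t \wedge T_n]$, so the condition in this definition reads
$$\sup_{t \geq 0} \mathbb{E}\big[V_X(t \wedge T_n)\big] < \infty \quad \text{for each } n \in \mathbb{N}.$$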
Put $A$ to be our increasing process with locally integrable variation. So I know the adapted condition is satisfied, and the submartingale inequality $\mathbb{E}[A_t \mid \mathcal{F}_s] \geq A_s$ for $s \leq t$ follows from monotonicity. But I cannot see how locally integrable variation implies that $\mathbb{E}|A_t| < \infty$ for each $t \geq 0$. The proof in the book (Klebaner) is short but mentions (in both the integrable and locally integrable variation cases) that it has to do with the localizing sequence. Since I'm new to localized properties, could someone explain?
From the assumption, we can choose our localizing sequence $(T_n)_{n \in \mathbb{N}}$ such that $T_n \nearrow \infty$ a.s. and, for each $n \in \mathbb{N}$, $\sup_{t \geq 0} \mathbb{E}\big[V_A(t \wedge T_n)\big] < \infty$.
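Here is my guess at how the localization step goes (this assumes $A_0$ is integrable, e.g. $A_0 = 0$): since $A$ is increasing, $V_A(t \wedge T_n) = A_{t \wedge T_n} - A_0$, so
$$\mathbb{E}\,|A_{t \wedge T_n}| \leq \mathbb{E}\,|A_0| + \mathbb{E}\big[V_A(t \wedge T_n)\big] < \infty,$$
which would make each stopped process $A^{T_n}$ an honest submartingale, and hence $A$ a local submartingale. The point would then be that integrability is only demanded of the stopped processes, not of $A_t$ itself. Is this the intended argument?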