I'm working through an example of the application of Doob's inequality in Durrett:
Let $X_m$ be a submartingale, and define $\bar{X}_n = \max\limits_{0 \leq m \leq n} X_m^+$.
Let $\lambda > 0$, and define $A = \{\bar{X}_n \geq \lambda\}$. Then
$\lambda P(A) \leq EX_nI_A \leq EX_n^+$.
The example uses this inequality to prove Kolmogorov's maximal inequality as follows:
Let $S_n = \xi_1 + \ldots + \xi_n$, where the $\xi_m$ are independent with $E\xi_m = 0$ and $\sigma_m^2 = E\xi_m^2 < \infty$. Since $g(x) = x^2$ is convex, $X_n = S_n^2$ is a submartingale, and taking $\lambda = x^2$ we have:
$$P \left( \max\limits_{0 \leq m \leq n} |S_m| > x\right) \leq \frac{\operatorname{var}(S_n)}{x^2}.$$
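(As a sanity check for myself, I verified the inequality numerically. This is just a Monte Carlo sketch under assumptions of my own choosing, not from Durrett: $\xi_m$ i.i.d. standard normal, $n = 50$, $x = 10$, so $\operatorname{var}(S_n) = n$ and the bound is $n/x^2 = 0.5$.)

```python
import random

# Monte Carlo sanity check of Kolmogorov's maximal inequality:
#   P( max_{0<=m<=n} |S_m| > x ) <= Var(S_n) / x^2.
# Assumed setup (mine, not from the text): xi_m i.i.d. N(0,1),
# n = 50, x = 10, so Var(S_n) = n and the bound equals 0.5.
random.seed(0)
n, x, trials = 50, 10.0, 20_000

exceed = 0
for _ in range(trials):
    s, max_abs = 0.0, 0.0
    for _ in range(n):
        s += random.gauss(0.0, 1.0)   # xi_m: mean 0, variance 1
        max_abs = max(max_abs, abs(s))
    if max_abs > x:
        exceed += 1

lhs = exceed / trials   # empirical P(max |S_m| > x)
rhs = n / x**2          # Var(S_n) / x^2 = 0.5
print(lhs, "<=", rhs)   # the empirical frequency respects the bound
```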
I'm not following how this is achieved. It is clear to me that:
$P \left( \max\limits_{0 \leq m \leq n} (S_m^2)^+ > x^2\right) \leq \frac{ES_n^2I_A}{x^2}$
And that $ES_n^2 = \operatorname{var}(S_n)$, but shouldn't the expectation of $S_n^2$ change when restricted to $A$? I think this shows I am not fully appreciating the role of the event $A$ in Doob's inequality, although I can see why it is needed in the proof.
Finally, are we just taking the square root within the probability to get
$$\{ \max\limits_{0 \leq m \leq n} (S_m^2)^+ > x^2 \} = \{ \max\limits_{0 \leq m \leq n} |S_m| > x \}.$$
Thanks for any help!