Martingale inequality related to Kolmogorov's maximal inequality


The problem

This is a homework question from Durrett's Probability: Theory and Examples. Hints would be appreciated.

Let $\xi_i$ be independent with expected value zero and variances $\sigma^2_i < \infty$ (not necessarily all identical). Let $S_m = \xi_1 + \dots + \xi_m$. From Doob's inequality we can obtain Kolmogorov's maximal inequality, $$P\left(\max_{1 \leq m \leq n} |S_m| \geq x\right) \leq x^{-2} \operatorname{var} S_n.$$

I feel like I understand this derivation, which is given in the book.

Now the homework problem. Everything is as defined above, except we additionally impose the constraint $|\xi_i| \leq K\; a.s.$ I am asked to prove the following inequality: $$P\left(\max_{1 \leq m \leq n} |S_m| \leq x\right) \leq \frac{(x+K)^2}{ \operatorname{var} S_n}.$$ This differs from Kolmogorov's maximal inequality in three ways:

  1. The inequality inside the probability is switched
  2. $x$ is replaced by $x+K$, which is the largest value $|S_N|$ can reach if we stop at the first time the partial sums exceed $x$ (since each increment is at most $K$).
  3. The fraction on the right hand side is inverted.
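Both the original inequality and the one to be proved can be checked numerically. Below is a quick Monte Carlo sketch (my own illustration, not from Durrett) using Rademacher increments $\xi_i = \pm 1$, so that $K = 1$ and $\operatorname{var} S_n = n$:

```python
import random

def simulate(n=100, x=5.0, K=1.0, trials=20000, seed=0):
    """Estimate both tail probabilities for a walk with +-K Rademacher steps.

    Each xi_i is +K or -K with equal probability, so E[xi_i] = 0,
    var(xi_i) = K^2, and var(S_n) = n * K^2.
    """
    rng = random.Random(seed)
    exceed = 0   # count of paths with max_{1<=m<=n} |S_m| >= x
    stay = 0     # count of paths with max_{1<=m<=n} |S_m| <= x
    for _ in range(trials):
        s, peak = 0.0, 0.0
        for _ in range(n):
            s += K if rng.random() < 0.5 else -K
            peak = max(peak, abs(s))
        if peak >= x:
            exceed += 1
        if peak <= x:
            stay += 1
    var_sn = n * K * K
    return exceed / trials, stay / trials, var_sn

x, K = 5.0, 1.0
p_exceed, p_stay, var_sn = simulate(x=x, K=K)
# Kolmogorov's maximal inequality: P(max |S_m| >= x) <= var(S_n) / x^2
assert p_exceed <= var_sn / x**2
# The exercise's bound: P(max |S_m| <= x) <= (x + K)^2 / var(S_n)
assert p_stay <= (x + K)**2 / var_sn
```

With these parameters $\operatorname{var} S_n / x^2 = 4$ (a vacuous bound), while $(x+K)^2/\operatorname{var} S_n = 0.36$, so the two inequalities are informative in opposite regimes: the first when $x$ is large relative to $\sqrt{\operatorname{var} S_n}$, the second when it is small.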

I am told to use the fact that for a submartingale $X_m$ and a stopping time $N$ which is bounded by $n \; a.s.$, $$E X_0 \leq E X_N \leq E X_n.$$ There is also a hint to consider the martingale given by $X_m = S_m^2 - \operatorname{var} S_m$.
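As a sanity check on the hint (again my own illustration, with $\pm 1$ increments so that $\operatorname{var} S_m = m$): the process $X_m = S_m^2 - \operatorname{var} S_m$ should have constant expectation $E X_m = E X_0 = 0$, which a quick simulation confirms:

```python
import random

def empirical_mean_X(m, trials=200_000, seed=1):
    """Estimate E[X_m] where X_m = S_m^2 - var(S_m) and var(S_m) = m
    for a walk with +-1 Rademacher steps."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(m))
        total += s * s - m
    return total / trials

for m in (1, 5, 20):
    est = empirical_mean_X(m)
    assert abs(est) < 0.5  # E[X_m] = 0 up to Monte Carlo noise
```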

What I tried

Following the derivation of Doob's inequality from the submartingale inequality given above, I defined the set $$A = \left\{\max_{1 \leq m \leq n} X_m \leq x^2 - \operatorname{var}S_n\right\} \supset \left\{\max_{1 \leq m \leq n} |S_m| \leq x\right\}$$ and the stopping time $$N = \inf\left\{m : X_m \geq x^2 - \operatorname{var}{S_m}\text{ or }m = n\right\},$$ as well as several variations on the theme. In all my attempts, though, I'm unable to get an inverted fraction (relative to Kolmogorov's maximal inequality) on the right-hand side of the inequality.

Has anyone seen this? Any guidance?