Summation : bounded in probability


I am trying to prove that for every $e > 0$ there exists an $M$ such that

$$\sum_{i = 1}^n P(|X_i - Y_i| > M) < e.$$

By Markov's inequality and the fact that $E|X_i - Y_i| \leq c$ (i.e. $O(1)$) uniformly in $i$, we have

$$\sum_{i = 1}^n P(|X_i - Y_i| > M) \leq \sum_{i = 1}^n \frac{E|X_i - Y_i|}{M} \leq cnM^{-1}.$$
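As a sanity check on the Markov bound, here is a small Monte Carlo sketch. The distribution is purely an illustrative assumption: it takes $|X_i - Y_i|$ to be i.i.d. Exponential with mean $c = 1$, so the uniform bound $E|X_i - Y_i| \leq c$ holds, and compares the empirical sum of tail probabilities against $cn/M$:

```python
import random

random.seed(0)

c = 1.0        # uniform bound on E|X_i - Y_i| (assumed for this sketch)
n = 50         # number of summands
M = 200.0      # threshold
trials = 10_000

def tail_prob(threshold, num_trials):
    """Empirical estimate of P(|X_i - Y_i| > threshold),
    modelling |X_i - Y_i| as Exponential with mean c (illustrative choice)."""
    hits = sum(1 for _ in range(num_trials)
               if random.expovariate(1.0 / c) > threshold)
    return hits / num_trials

# Sum the n empirical tail probabilities and compare with the Markov bound.
empirical_sum = sum(tail_prob(M, trials) for _ in range(n))
markov_bound = c * n / M

print(f"empirical sum of tail probs: {empirical_sum:.4f}")
print(f"Markov bound c*n/M:          {markov_bound:.4f}")
```

For these parameters the exponential tails beyond $M = 200$ are negligible, so the empirical sum sits far below the Markov bound of $cn/M = 0.25$; the bound itself is what the argument relies on, and it visibly grows with $n$ for fixed $M$.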

Now:

  • if $e = 1$, I would set $M = 2nc$,
  • if $e = 10$, I would set $M = \frac{2}{10}nc$, and so on.

Generally, given any $e > 0$ I would choose $M = \frac{2}{e}nc$. However, this feels incorrect, especially if $n$ tends to $\infty$. What am I missing?
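To make the arithmetic behind this choice explicit: with $M = \frac{2nc}{e}$ (the factor $2$ turning the bound into a strict inequality), the Markov estimate gives

$$\sum_{i = 1}^n P(|X_i - Y_i| > M) \leq \frac{cn}{M} = \frac{cn}{2nc/e} = \frac{e}{2} < e,$$

so the bound does work for each fixed $n$, but the required $M$ grows linearly in $n$, which is exactly the dependence that feels suspicious.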