Form of Markov Inequality proof


Prove the following lemma: for any nonnegative random variable $X$ and real number $\epsilon > 0$,

$$ \sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon)<\frac{1}{\epsilon} \mathbb{E}(X) \leqslant 1+\sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon) $$

  1. Define $Y$:
    $Y = n\epsilon$ if $n\epsilon < X \leq (n+1)\epsilon$ for $n \geq 1$, and $Y = 0$ if $0 \leq X \leq \epsilon$
  2. Expectation of $Y$ (the second equality is summation by parts):
    $\mathbb{E}(Y) = \epsilon \sum_{n=1}^{\infty} n\, \mathbb{P}(n\epsilon < X \leq (n+1)\epsilon) = \epsilon \sum_{n=1}^{\infty} \mathbb{P}(X > n\epsilon)$
  3. Lower Bound: since $Y \leq X$ pointwise,
    $\mathbb{E}(Y) \leq \mathbb{E}(X) \Rightarrow \sum_{n=1}^{\infty} \mathbb{P}(X > n\epsilon) < \frac{1}{\epsilon} \mathbb{E}(X)$
  4. Upper Bound: since $X \leq Y + \epsilon$ pointwise,
    $\mathbb{E}(X) \leq \mathbb{E}(Y) + \epsilon \Rightarrow \frac{1}{\epsilon} \mathbb{E}(X) \leq 1 + \sum_{n=1}^{\infty} \mathbb{P}(X > n\epsilon)$

Result:
$$ \sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon)<\frac{1}{\epsilon} \mathbb{E}(X) \leqslant 1+\sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon) $$
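As a numerical sanity check of the result (illustrative only, not a proof), one can estimate all three quantities by Monte Carlo for a distribution with known mean, e.g. $X \sim \text{Exponential}(1)$; the seed, sample size, and value of $\epsilon$ below are arbitrary choices.

```python
# Monte Carlo sanity check of the sandwich inequality for X ~ Exponential(1).
# The seed, sample size, and eps are illustrative choices, not part of the proof.
import random

random.seed(0)
eps = 0.7
samples = [random.expovariate(1.0) for _ in range(200_000)]
mean_x = sum(samples) / len(samples)  # estimate of E(X) = 1

# Sum the empirical tail probabilities P(X > n*eps); the series terminates
# once n*eps exceeds the largest sample.
series = 0.0
n = 1
while True:
    p = sum(x > n * eps for x in samples) / len(samples)
    if p == 0.0:
        break
    series += p
    n += 1

assert series < mean_x / eps <= 1 + series
print(series, mean_x / eps, 1 + series)
```

For this distribution the series has the closed form $1/(e^{\epsilon}-1) \approx 0.986$, comfortably below $\mathbb{E}(X)/\epsilon \approx 1.429$ and above it once $1$ is added.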


There are 2 best solutions below

  1. First Inequality: Using the fact that $\mathbb{E}(X) = \int_{0}^{\infty} \mathbb{P}(X > x) \,dx$, we can partition the integral: $$ \mathbb{E}(X) = \sum_{n=1}^{\infty} \int_{(n-1)\epsilon}^{n\epsilon} \mathbb{P}(X > x) \,dx \geq \sum_{n=1}^{\infty} \epsilon\, \mathbb{P}(X > n\epsilon), $$ since $\mathbb{P}(X > x) \geq \mathbb{P}(X > n\epsilon)$ for $x \leq n\epsilon$. Thus: $$ \sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon)<\frac{1}{\epsilon} \mathbb{E}(X) $$

  2. Second Inequality: We can write: $$ \frac{1}{\epsilon} \mathbb{E}(X) = \frac{1}{\epsilon} \left( \int_{0}^{\epsilon} \mathbb{P}(X > x) \,dx + \int_{\epsilon}^{\infty} \mathbb{P}(X > x) \,dx \right) \leq 1 + \sum_{n=1}^{\infty} \mathbb{P}(X > n\epsilon), $$ since $\int_{0}^{\epsilon} \mathbb{P}(X > x)\,dx \leq \epsilon$, and $\int_{\epsilon}^{\infty} \mathbb{P}(X > x)\,dx = \sum_{n=1}^{\infty} \int_{n\epsilon}^{(n+1)\epsilon} \mathbb{P}(X > x)\,dx \leq \epsilon \sum_{n=1}^{\infty} \mathbb{P}(X > n\epsilon)$ because $\mathbb{P}(X > x) \leq \mathbb{P}(X > n\epsilon)$ for $x \geq n\epsilon$. Thus: $$ \frac{1}{\epsilon} \mathbb{E}(X) \leqslant 1+\sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon) $$ Combining the two inequalities: $$\sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon)<\frac{1}{\epsilon} \mathbb{E}(X) \leqslant 1+\sum_{n=1}^{\infty} \mathbb{P}(X>n \epsilon) $$
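As an illustrative closed-form instance (not part of the proof), take $X \sim \text{Exponential}(1)$, so that $\mathbb{P}(X > x) = e^{-x}$ and $\mathbb{E}(X) = 1$; the series is then geometric, $\sum_{n=1}^{\infty} e^{-n\epsilon} = 1/(e^{\epsilon}-1)$, and both inequalities can be checked directly:

```python
# Check the sandwich for X ~ Exponential(1), where the tail series has the
# closed form 1 / (e^eps - 1); the epsilon values are arbitrary samples.
import math

for eps in (0.1, 0.5, 1.0, 3.0):
    series = 1.0 / (math.exp(eps) - 1.0)  # sum_{n>=1} P(X > n*eps)
    middle = 1.0 / eps                    # (1/eps) * E(X), with E(X) = 1
    assert series < middle <= 1 + series, eps
```

In this instance the two inequalities reduce to the elementary bounds $\epsilon < e^{\epsilon} - 1$ and $1 - e^{-\epsilon} \leq \epsilon$.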


(Method of Indicators)

The terms of the series $\sum_{n=1}^\infty 1_{\{X > n\epsilon\}}(\omega)$ are all $1$ for $n\ge 1$ with $n<X(\omega)/\epsilon$, and $0$ for all larger $n$; the series therefore counts those integers, so it equals $\lceil X(\omega)/\epsilon\rceil - 1$ when $X(\omega)>0$, and $0$ when $X(\omega)=0$. In particular $$ \sum_{n=1}^\infty 1_{\{X > n\epsilon\}}(\omega)\le X(\omega)/\epsilon. $$ The deficit is at most $1$ (it equals $1$ exactly when $X(\omega)/\epsilon$ is a positive integer). Consequently, adding in a $1$ to account for this, we also

have $$ X(\omega)/\epsilon \le 1+\sum_{n=1}^\infty 1_{\{X > n\epsilon\}}(\omega). $$ Writing this as a pair of inequalities between random variables: $$ \sum_{n=1}^\infty 1_{\{X > n\epsilon\}}\le X/\epsilon\le 1+\sum_{n=1}^\infty 1_{\{X > n\epsilon\}}.\qquad\qquad (1) $$ But $$ \Bbb E\left[\sum_{n=1}^\infty 1_{\{X >n\epsilon\}}\right] =\sum_{n=1}^\infty \Bbb E[1_{\{X >n\epsilon\}}] =\sum_{n=1}^\infty \Bbb P[X >n\epsilon] $$ (the interchange of expectation and sum is justified by Tonelli's theorem, the terms being nonnegative), so the assertion follows upon taking expectations in (1).
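For a concrete feel for (1), here is a pointwise check with illustrative values: for each fixed outcome value $x$, the indicator series reduces to a finite count of the integers $n \ge 1$ with $n\epsilon < x$.

```python
# Pointwise check of the sandwich (1): for a fixed outcome value x, the
# indicator series is the (finite) count of integers n >= 1 with n*eps < x.
# The values of eps and x are arbitrary illustrative choices.
eps = 0.4
for x in (0.0, 0.3, 0.4, 0.8, 1.0, 2.5, 7.77):
    s = sum(1 for n in range(1, int(x / eps) + 2) if x > n * eps)
    assert s <= x / eps <= 1 + s, x
```

The right-hand inequality is tight exactly when $x/\epsilon$ is a positive integer, as noted above.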