Proving convergence in probability of $X_n = X + Y_n$ to $X$ using Markov's inequality.


The question below is from the book 'Probability course.com'. The book's solution uses Chebyshev's inequality; before reading it, I tried Markov's inequality instead. Is my solution correct?

Question: Let $X$ be a random variable, and $X_n = X + Y_n$, where $E[Y_n] = \frac{1}{n}, V[Y_n] = \frac{\sigma^2}{n}$, where $\sigma > 0$ is a constant. Show that $X_n$ converges in probability to $X$.
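As a sanity check on the claim (separate from any proof), the setup can be simulated. The problem does not specify the distribution of $Y_n$, so the normal distribution below is purely an illustrative assumption; any distribution with mean $1/n$ and variance $\sigma^2/n$ would do.

```python
import numpy as np

# Illustration only: the distribution of Y_n is NOT given in the problem.
# We assume Y_n ~ Normal(1/n, sigma^2/n) just to visualize that
# P(|X_n - X| >= eps) = P(|Y_n| >= eps) shrinks as n grows.
rng = np.random.default_rng(0)
sigma, eps, trials = 2.0, 0.5, 200_000

def tail_prob(n):
    """Monte Carlo estimate of P(|Y_n| >= eps) for Y_n ~ N(1/n, sigma^2/n)."""
    y = rng.normal(loc=1.0 / n, scale=sigma / np.sqrt(n), size=trials)
    return float(np.mean(np.abs(y) >= eps))

for n in (10, 100, 1_000, 10_000):
    print(f"n = {n:>6}:  P(|Y_n| >= {eps}) ~ {tail_prob(n):.4f}")
```

The estimated tail probability drops toward 0 as $n$ increases, consistent with $X_n \xrightarrow{p} X$.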

Answer: By the definition of convergence in probability, I must show that $\lim_{n \rightarrow \infty} P(|X_n - X| \ge \epsilon) = 0, \forall \epsilon > 0$.

$P(|X_n - X| \ge \epsilon) = P(|X + Y_n - X| \ge \epsilon) = P(|Y_n | \ge \epsilon) = P(Y_n \le -\epsilon) + P(Y_n \ge \epsilon) $.

My strategy was to use Markov's inequality on each of the summands of the last equality.

$ P(Y_n \le -\epsilon) = P(- Y_n \ge \epsilon) \le \frac{E[-Y_n]}{\epsilon} = \frac{-E[Y_n]}{\epsilon} = \frac{-1/n}{\epsilon} = \frac{-1}{n\epsilon} \rightarrow 0 \text{ as } n \rightarrow \infty.$

$ P(Y_n \ge \epsilon) \le \frac{E[Y_n]}{\epsilon} = \frac{1/n}{\epsilon} = \frac{1}{n\epsilon} \rightarrow 0 \text{ as } n \rightarrow \infty.$

Since both summands go to 0, we get the final result that $P(Y_n \le -\epsilon) + P(Y_n \ge \epsilon) \rightarrow 0 \text{ as } n \rightarrow \infty$, proving that $X_n$ converges in probability to $X$, as required.

Is this solution correct? If not, where is the error, please?

There is 1 answer below.

To use Markov's inequality, you need a nonnegative random variable. As far as we know, $Y_n$ can take both positive and negative values. And even if $Y_n$ were a.s. nonnegative (or a.s. nonpositive), Markov's inequality would apply to only one of the two tails, so the two inequalities

$ P(Y_n \le -\epsilon) = P(- Y_n \ge \epsilon) \le \frac{E[-Y_n]}{\epsilon} = \frac{-E[Y_n]}{\epsilon} = \frac{-1/n}{\epsilon} = \frac{-1}{n\epsilon} \rightarrow 0 \text{ as } n \rightarrow \infty.$

$ P(Y_n \ge \epsilon) \le \frac{E[Y_n]}{\epsilon} = \frac{1/n}{\epsilon} = \frac{1}{n\epsilon} \rightarrow 0 \text{ as } n \rightarrow \infty.$

cannot both be valid in general. In fact, the first one is already impossible on its face: it bounds a probability above by the negative number $\frac{-1}{n\epsilon}$, which no probability can satisfy unless it is zero. The error is that Markov's inequality was applied to $-Y_n$, which is not a nonnegative random variable (its expectation $-1/n$ is negative).
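For contrast, here is a sketch of the Chebyshev-style route the question says the book takes (the book's actual argument may differ in detail). The trick is to apply Markov's inequality to the *nonnegative* random variable $Y_n^2$, whose second moment is available from the given mean and variance:

$$
P(|Y_n| \ge \epsilon) = P(Y_n^2 \ge \epsilon^2)
\le \frac{E[Y_n^2]}{\epsilon^2}
= \frac{V[Y_n] + (E[Y_n])^2}{\epsilon^2}
= \frac{\sigma^2/n + 1/n^2}{\epsilon^2}
\xrightarrow{n \to \infty} 0.
$$

Since this holds for every $\epsilon > 0$, it gives $X_n \xrightarrow{p} X$ directly, with no need to split the two tails.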