Chebyshev's inequality application and convergence - practical example


Let $W_n$ be a random variable with mean $\mu$ and variance $\frac{b^2}{n^{2p}}$, where $p>0$ and $b$ and $\mu$ are constants.

Show that, for every $\epsilon > 0$, $$ \lim_{n\to\infty} P(|W_n-\mu| \leq \epsilon) = 1 $$

The solution applies Chebyshev's inequality: $$ P(|W_n-\mu| \leq \epsilon) = 1 - P(|W_n-\mu| > \epsilon) $$ $$ \geq 1 - P(|W_n-\mu| \geq \epsilon) $$ $$ = 1 - P\left(|W_n-\mu| \geq \frac{\epsilon n^p}{b} \cdot \frac{b}{n^p}\right) $$ $$ \geq 1 - \frac{1}{(\frac{\epsilon n^p}{b})^2}$$

and then taking the limit $n\to\infty$.
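As a sanity check, the chain of inequalities can be tested numerically. The problem fixes only the mean and variance of $W_n$, not its distribution, so the sketch below assumes, purely for illustration, that $W_n$ is normal with mean $\mu$ and standard deviation $b/n^p$; the values of `mu`, `b`, `p`, and `eps` are arbitrary choices.

```python
import numpy as np

# Hypothetical example: W_n ~ Normal(mu, b / n^p). The problem only fixes the
# mean and variance, so this is one distribution satisfying the assumptions.
rng = np.random.default_rng(0)
mu, b, p, eps = 2.0, 1.0, 0.5, 0.1

probs, bounds = [], []
for n in (10, 1_000, 100_000):
    sd = b / n**p                                # standard deviation of W_n
    w = rng.normal(mu, sd, size=200_000)
    prob = np.mean(np.abs(w - mu) <= eps)        # Monte Carlo estimate
    bound = max(0.0, 1 - (b / (eps * n**p))**2)  # Chebyshev lower bound
    probs.append(prob)
    bounds.append(bound)
    print(f"n={n:>6}  P(|W_n-mu| <= eps) ~ {prob:.4f}  >=  bound {bound:.4f}")
```

For each $n$ the estimated probability stays above the Chebyshev lower bound, and both approach $1$ as $n$ grows.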

I do not understand the change from $>$ to $\geq$ (and the direction of the inequality) in the second line.

Also, where does the factor $\frac{\epsilon n^p}{b} \cdot \frac{b}{n^p}$ in the third line come from?

Does this imply that $W_n$ converges in probability to $\mu$?

$$ \lim_{n\to\infty} P(|W_n-\mu| > \epsilon) = 1 - \lim_{n\to\infty}P(|W_n-\mu| \leq \epsilon) = 1 - 1 = 0$$

This step I do not understand either; could someone explain it in plain English? This is the first time I have worked on convergence exercises.
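The two set relations used in the solution (complementary events, and $\{|X| > \epsilon\} \subseteq \{|X| \geq \epsilon\}$) can be checked numerically on any random variable. A minimal sketch using a standard normal sample, rounded to one decimal so the boundary $|X| = \epsilon$ actually carries positive probability:

```python
import numpy as np

# Check the set relations on a rounded standard normal sample, so that
# some sample values land exactly on the boundary |X| = eps.
rng = np.random.default_rng(1)
x = np.round(rng.normal(0.0, 1.0, size=100_000), 1)
eps = 0.5

p_le = np.mean(np.abs(x) <= eps)   # P(|X| <= eps)
p_gt = np.mean(np.abs(x) >  eps)   # P(|X| >  eps)
p_ge = np.mean(np.abs(x) >= eps)   # P(|X| >= eps)

print(p_le + p_gt)   # complements: sums to 1 (up to float rounding)
print(p_gt <= p_ge)  # {|X| > eps} is a subset of {|X| >= eps}
```

The first relation justifies line one of the solution; the second (an event and any superset of it) justifies the inequality in line two.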

On BEST ANSWER
  • $P(|W_n-\mu| \leq \epsilon) = 1 - P(|W_n-\mu| > \epsilon)$ says that the probability of $W_n$ being within $\epsilon$ of $\mu$ equals one minus the probability that the difference is strictly more than $\epsilon$: the two events are complements of each other.

  • $1 - P(|W_n-\mu| > \epsilon) \ge 1 - P(|W_n-\mu| \geq \epsilon)$ holds because the event $\{|W_n-\mu| > \epsilon\}$ is contained in $\{|W_n-\mu| \geq \epsilon\}$, so $P(|W_n-\mu| > \epsilon) \leq P(|W_n-\mu| \geq \epsilon)$. Subtracting both sides from $1$ reverses the direction of the inequality, which is the sign change you asked about.

  • Writing $\epsilon = \frac{\epsilon n^p}{b} \cdot \frac{b}{n^p}$ puts the standard deviation $\frac{b}{n^p}$ of $W_n$ (note: the standard deviation, not the variance) into the expression, so Chebyshev's inequality $P(|W_n-\mu| \geq k\sigma) \leq \frac{1}{k^2}$ can be applied with $k = \frac{\epsilon n^p}{b}$ and $\sigma = \frac{b}{n^p}$.

  • You can take limits across the probability inequalities provided the limits exist. The probabilities are bounded below by $1 - \frac{1}{(\frac{\epsilon n^p}{b})^2}$, which approaches $1$ as $n$ increases, and bounded above by $1$, so the limit exists and equals $1$. This is exactly the statement that $W_n$ converges in probability to $\mu$.
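The limit step in the last bullet can be spelled out as a squeeze argument. Since $\frac{1}{(\frac{\epsilon n^p}{b})^2} = \frac{b^2}{\epsilon^2 n^{2p}}$, the bounds read

$$ 1 - \frac{b^2}{\epsilon^2 n^{2p}} \;\leq\; P(|W_n-\mu| \leq \epsilon) \;\leq\; 1, \qquad \lim_{n\to\infty} \frac{b^2}{\epsilon^2 n^{2p}} = 0 \text{ because } p > 0, $$

so both bounds tend to $1$ and therefore $\lim_{n\to\infty} P(|W_n-\mu| \leq \epsilon) = 1$.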