Convergence in probability exercise


I am working through MIT's 18.650 Statistics course and am trying to do the problem sets, but I am stuck on the first problem. For any $n\in \mathbb{N}^*$, let $X_n$ be a random variable such that $P[X_n=1/n]=1-1/n^2$ and $P[X_n=n]=1/n^2$. Does $X_n$ converge in probability? In $L^2$? I know the definition of convergence in probability, but I don't know what to substitute for the limit $X$ in the definition. I am really stuck here and would appreciate some help. Thanks.


Fix $\epsilon \in (0,1)$ and let $n > \epsilon^{-1}$. Since $X_n$ takes only the two values $n$ and $\frac{1}{n}$, and $\frac{1}{n} < \epsilon$ by the choice of $n$, the event $\{|X_n| > \epsilon\}$ is precisely the event $\{X_n = n\}$ (note $X_n > 0$, so $|X_n| = X_n$). Hence,

$$P(|X_n| > \epsilon) = P(X_n = n) = \frac{1}{n^2} \to 0,$$ so $X_n \xrightarrow{P} 0$.
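As a quick sanity check (my own illustration, not part of the original answer), one can simulate $X_n$ from its two-point law and estimate the tail probability $P(|X_n| > \epsilon)$ for a fixed $\epsilon$; the estimate should track the exact value $1/n^2$ once $n > 1/\epsilon$:

```python
import random

# Monte Carlo sketch: X_n takes the value n with probability 1/n^2
# and the value 1/n with probability 1 - 1/n^2.

def sample_X(n: int) -> float:
    """Draw one realization of X_n from its two-point distribution."""
    return float(n) if random.random() < 1 / n**2 else 1 / n

def tail_prob(n: int, eps: float = 0.5, trials: int = 100_000) -> float:
    """Estimate P(|X_n| > eps) by simple Monte Carlo."""
    return sum(abs(sample_X(n)) > eps for _ in range(trials)) / trials

random.seed(0)
for n in (5, 20, 100):
    # The estimate should be close to the exact value 1/n^2, which -> 0.
    print(n, tail_prob(n), 1 / n**2)
```

The values $\epsilon = 0.5$ and the trial count are arbitrary choices for the demonstration.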

Next, $$\int |X_n|^2 \, dP = \left(\frac{1}{n}\right)^2 \left(1 - \frac{1}{n^2}\right) + n^2 \cdot \frac{1}{n^2} = \frac{1}{n^2}\left(1 - \frac{1}{n^2}\right) + 1 \to 1 \neq 0,$$ so $X_n$ does not converge to $0$ in $L^2$.
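The second moment can also be computed exactly from the two-point distribution (again, my own numerical check rather than part of the answer); it tends to $1$, not $0$:

```python
# Exact second moment of X_n:
# E[X_n^2] = (1/n)^2 * (1 - 1/n^2) + n^2 * (1/n^2).

def second_moment(n: int) -> float:
    """Compute E[X_n^2] exactly from the distribution of X_n."""
    return (1 / n) ** 2 * (1 - 1 / n**2) + n**2 * (1 / n**2)

for n in (10, 100, 1000):
    print(n, second_moment(n))  # tends to 1, not 0
```

The large outcome $X_n = n$ is rare, but its contribution $n^2 \cdot \frac{1}{n^2} = 1$ to the second moment never vanishes, which is exactly why $L^2$ convergence fails here.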

Since convergence in $L^2$ implies convergence in probability, and limits in probability are unique (up to almost-sure equality), the only candidate $L^2$ limit is $0$. Hence $X_n$ does not converge to anything in $L^2$.