Given the sequence of random variables $\{X_n : n \ge 1\}$ characterized by $P[X_n = 1/n] = 1 - 1/n^2$ and $P[X_n = n] = 1/n^2$, I wanted to prove certain convergence results. It is not difficult to show that $X_n$ does not converge in $L^2$ to any constant. But how does one deal with convergence in probability? I am used to the example where $P[Z_n = 0] = 1 - 1/n^2$ and $P[Z_n = n] = 1/n^2$, in which case one shows that $Z_n \stackrel{P}{\longrightarrow} 0$, but for the example given, my usual setup for proving convergence in probability runs into an issue: when I write $P[X_n > \epsilon]$, the value of this probability depends on whether $n$ is such that $1/n < \epsilon$ or not.
Clearly, one can prove that $Y_n := X_n - 1/n \stackrel{P}{\longrightarrow} 0$. Still, proving or disproving convergence in probability to $0$ for my original sequence $\{X_n : n \ge 1\}$, which I think is the only possible limit, escapes me. Any hint/suggestion welcome.
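As a quick numerical sanity check (a sketch only: the function name, the choice $\epsilon = 0.1$, and the Monte Carlo setup are all my own), one can estimate $P(|X_n| > \epsilon)$ by simulation and see that it is small once $n$ is large:

```python
import random

def estimate_exceed_prob(n, eps=0.1, trials=100_000, seed=0):
    """Monte Carlo estimate of P(|X_n| > eps) for the sequence in the
    question: X_n = 1/n with prob 1 - 1/n^2, X_n = n with prob 1/n^2."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        x = n if rng.random() < 1 / n**2 else 1 / n
        if abs(x) > eps:
            exceed += 1
    return exceed / trials

# For small n (here 1/n = 0.2 > eps), both outcomes exceed eps.
print(estimate_exceed_prob(5))
# For n > 1/eps, only the event {X_n = n} exceeds eps,
# so the estimate should be near 1/n^2.
print(estimate_exceed_prob(100))
```

The simulation only suggests the answer, of course; the proof still needs the case split on whether $1/n < \epsilon$.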
Thank you
Maurice
Recall that in order to show that $X_n \to 0$ in probability, you must show that for every $\epsilon > 0$, we have $P(|X_n| > \epsilon) \to 0$.
Now go back to the definition of sequence convergence: you have to show that for any $\delta > 0$ there exists $N$ such that for all $n \ge N$, we have $P(|X_n| > \epsilon) < \delta$. Can you see how to choose such an $N$? Keep in mind that $N$ is allowed to depend on both $\delta$ and $\epsilon$!
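To make the case split concrete (this just spells out the computation the question already identifies, for the case $1/n < \epsilon$):

```latex
% Fix \epsilon > 0. For every n > 1/\epsilon we have 1/n < \epsilon, so
P\bigl(|X_n| > \epsilon\bigr)
  = P(X_n = n)
  = \frac{1}{n^2}
  \xrightarrow[n \to \infty]{} 0.
% Hence, given \delta > 0, any N > \max\bigl(1/\epsilon,\, 1/\sqrt{\delta}\bigr) works.
```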