In some lecture notes I came across the following:
Lemma If $X_n-X$ converges in mean square to $0$, then $X_n-X$ converges in probability to $0$.
Proof By Chebyshev’s inequality, $Pr(|X_n-X|>\epsilon)\leq\frac{E(X_n-X)^2}{\epsilon^2}$. Mean square convergence to 0 implies $\frac{E(X_n-X)^2}{\epsilon^2}\rightarrow 0$ as $n\rightarrow\infty$. Thus $X_n-X$ converges in probability to $0$.
However, this does not look correct to me: Chebyshev's inequality says $Pr(|Y-EY|>\epsilon)\leq\frac{Var(Y)}{\epsilon^2}$, so applying it here only gives $Pr(|X_n-X-E(X_n-X)|>\epsilon)\leq\frac{Var(X_n-X)}{\epsilon^2}$, since we don't know whether $E(X_n-X)=0$. Am I right? And if this proof does not work, how should we prove the lemma correctly?
Chebyshev’s inequality is actually a straightforward consequence of Markov's inequality $\mathbb P(\vert Z\vert>\varepsilon)\le\frac{\mathbb E[\vert Z\vert]}{\varepsilon}$, applied with $Z=(Y-\mathbb E[Y])^2$. In your case, just apply Markov's inequality with $Z=(X_n-X)^2$ instead; no centering is needed.
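Written out, the chain of steps is the following (using that $|X_n-X|>\varepsilon$ iff $(X_n-X)^2>\varepsilon^2$ for $\varepsilon>0$):
$$\mathbb P(\vert X_n-X\vert>\varepsilon)=\mathbb P\big((X_n-X)^2>\varepsilon^2\big)\le\frac{\mathbb E\big[(X_n-X)^2\big]}{\varepsilon^2}\xrightarrow[n\to\infty]{}0,$$
which is exactly the bound claimed in the lecture notes, so the proof there is correct once the inequality is read as an instance of Markov's inequality rather than of Chebyshev's.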