Conditions for convergence in probability of a sum


A sequence $\{X_n\}$ of random variables converges in probability to the random variable $X$; similarly, $\{Y_n\}$ converges in probability to $Y$. Under which conditions can I say that $\{X_n - Y_n\}$ converges in probability to $X - Y$? Is there a theorem giving sufficient conditions for this? The examples of cases where this fails seem very knife-edge.

BEST ANSWER

No extra conditions are needed, provided $X_n$, $Y_n$, $X$, and $Y$ are all defined on the same probability space. Fix $\varepsilon,\delta>0$. By assumption there exists $N_0$ large enough that $P(|X-X_n|>\varepsilon/2)<\delta/2$ and $P(|Y-Y_n|>\varepsilon/2)<\delta/2$ for all $n>N_0$. By the triangle inequality, $|X-Y-(X_n-Y_n)| = |(X-X_n)-(Y-Y_n)| \leq |X-X_n|+|Y-Y_n|$, so for all $n>N_0$ \begin{align*}P(|X-Y-(X_n-Y_n)|>\varepsilon)&\leq P(|X-X_n|+|Y-Y_n|>\varepsilon)\\ &\leq P(|X-X_n|>\varepsilon/2)+P(|Y-Y_n|>\varepsilon/2)<\delta. \end{align*} (The second inequality holds because if both $|X-X_n|\leq\varepsilon/2$ and $|Y-Y_n|\leq\varepsilon/2$, the sum cannot exceed $\varepsilon$.) Since $\varepsilon,\delta$ were arbitrary, $X_n - Y_n \to X - Y$ in probability.
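The argument can be illustrated numerically with a Monte Carlo sketch. The specific choice $X_n = X + Z_n/n$, $Y_n = Y + W_n/n$ with standard normal noise is a hypothetical example (not from the question), chosen so that $X_n \to X$ and $Y_n \to Y$ in probability; we then estimate $P(|(X_n - Y_n) - (X - Y)| > \varepsilon)$ and watch it shrink with $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 100_000   # number of Monte Carlo samples
eps = 0.1     # the epsilon in the definition of convergence in probability

# Hypothetical example: X_n = X + noise/n and Y_n = Y + noise/n,
# so both sequences converge in probability to X and Y respectively.
X = rng.normal(size=m)
Y = rng.normal(size=m)

for n in (1, 10, 100):
    Xn = X + rng.normal(size=m) / n
    Yn = Y + rng.normal(size=m) / n
    # Empirical estimate of P(|(X_n - Y_n) - (X - Y)| > eps)
    p = np.mean(np.abs((Xn - Yn) - (X - Y)) > eps)
    print(f"n = {n:4d}:  estimated exceedance probability = {p:.4f}")
```

The printed probabilities decrease toward zero as $n$ grows, matching the proof: once both error terms are small with high probability, their difference is too.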