Sum of random variables, one of which converges in probability


Context: I'm trying to understand a proof of Wigner's theorem in An Introduction to Random Matrices by Anderson et al. and I'm having trouble proving an intermediate fact in the proof that the book glosses over.

The general form of my question is as follows. Let $\{X_n\}$ and $\{Y_n\}$ be sequences of non-negative random variables such that $Y_n\rightarrow 0$ in probability as $n\rightarrow\infty$. I would like to show that $$\lim_{n\rightarrow\infty} P(X_n + Y_n > \epsilon) = \lim_{n\rightarrow\infty} P(X_n > \epsilon)$$ for any $\epsilon > 0$. (Note that $X_n$ and $Y_n$ are not assumed to be independent.) This is actually a slightly more general result than what I need, but it seems like it should be true.

I've tried to write the probability on the LHS as an integral so I can use a dominated convergence argument, but I'm having trouble writing the joint law down in a way which is useful for getting a handle on the integral.

After some more thought, it turns out this statement is false. Take $X_n = 1$ and $Y_n = 1/n$, both with probability $1$, and $\epsilon = 1$. Note that $Y_n \rightarrow 0$ in probability, so the hypotheses are satisfied. Then $P(X_n + Y_n > 1) = 1$ for all $n$ but $P(X_n > 1) = 0$ for all $n$, so the two limits do not match.
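For what it's worth, here is a minimal sanity check of the counterexample in Python. Since $X_n$ and $Y_n$ are deterministic (each equals a constant with probability $1$), the probabilities reduce to indicator values; the helper names below are my own, not from the book.

```python
# Counterexample: X_n = 1 and Y_n = 1/n almost surely, epsilon = 1.
# Because the variables are deterministic, P(Z > eps) is just the
# indicator of the event {z > eps} for the constant value z.

def p_sum_exceeds(n, eps=1.0):
    # P(X_n + Y_n > eps) with X_n = 1 and Y_n = 1/n a.s.
    return 1.0 if 1.0 + 1.0 / n > eps else 0.0

def p_x_exceeds(n, eps=1.0):
    # P(X_n > eps) with X_n = 1 a.s.
    return 1.0 if 1.0 > eps else 0.0

for n in (1, 10, 100, 1000):
    assert p_sum_exceeds(n) == 1.0  # LHS probability is 1 for every n
    assert p_x_exceeds(n) == 0.0    # RHS probability is 0 for every n
```

The assertions pass for every $n$, confirming that the left-hand limit is $1$ while the right-hand limit is $0$.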