I'm trying to prove that if $\{X_n\}$ and $\{Y_n\}$ converge in probability to $X$ and $Y$ respectively, then $X_n Y_n$ converges in probability to $X Y$. I'm following the book "Modern Probability Theory" by B. Ramdas (pages 144-145).
At some point he states the following:
Since $\{X_n\}$ converges to $X$, which is bounded in probability, the $X_n$'s are uniformly bounded in probability.
I understand that $X$ is bounded in probability, and I proved it by contradiction using the fact that a random variable is finite outside a set of measure zero.
I don't see why the last part is true. The definition given for uniform boundedness is:
A sequence $\{X_n\}$ of random variables is said to be uniformly bounded in probability if $\forall \epsilon > 0$ $\exists\, a > 0$ such that $P(|X_n| > a) < \epsilon$ $\forall n$.
This proof is important to me because it is the easiest to understand and the most elegant one I've found that uses only elementary concepts.
Given $\epsilon>0$, choose $b>0$ so large that $P(|X|>b)<\epsilon/2$. Next, because $X_n$ converges in probability to $X$, there is $N\in\Bbb N$ such that if $n\ge N$ then $P(|X-X_n|>1)<\epsilon/2$. But $|X_n|\le|X|+|X_n-X|$, so that $\{|X_n|>b+1\}\subset\{|X|>b\}\cup\{|X_n-X|>1\}$, and therefore (taking $a:=b+1$) $$ P(|X_n|>a)\le P(|X|>b)+P(|X_n-X|>1)<{\epsilon\over 2}+{\epsilon\over 2}=\epsilon, $$ provided $n\ge N$. Increasing $a$ if necessary, one can arrange that we also have $P(|X_n|>a)<\epsilon$ for $n=1,2,\ldots, N-1$; this is possible because each of these finitely many $X_n$ is finite almost surely, hence bounded in probability. The increase only makes $P(|X_n|>a)$ smaller for $n\ge N$, so we're done.
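To get a concrete feel for the argument, here is a small Monte Carlo sketch (my own illustration, not from the book). It takes $X$ standard normal and $X_n = X + Z_n/n$ with $Z_n$ standard normal, so $X_n \to X$ in probability, chooses $b$ with $P(|X|>b)<\epsilon/2$, sets $a=b+1$, and checks that the estimated tail probabilities $P(|X_n|>a)$ stay below $\epsilon$ uniformly in $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 100_000  # Monte Carlo sample size
eps = 0.05

# X standard normal; X_n = X + Z_n/n converges to X in probability
X = rng.standard_normal(m)

# Choose b with P(|X| > b) < eps/2: for N(0,1), b = 3 works (tail ~ 0.0027)
b = 3.0
a = b + 1.0  # a = b + 1, as in the proof

tail_probs = []
for n in range(1, 51):
    Xn = X + rng.standard_normal(m) / n
    tail_probs.append(np.mean(np.abs(Xn) > a))

# Uniform bound: every estimated P(|X_n| > a) is below eps
print(max(tail_probs))
```

Of course a simulation proves nothing; it only visualizes how the single threshold $a$ works simultaneously for all $n$, which is exactly the content of uniform boundedness in probability.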