Random variables $X_n\xrightarrow{p}X, Y_n\xrightarrow{d}Y \implies X_nY_n \xrightarrow{d}XY$


Suppose we have $X_n\xrightarrow{p}X$ and $Y_n\xrightarrow{d}Y$. I would like to show that the product $X_nY_n$ converges in distribution to $XY$.

I'm trying to decompose and bound the sequence $\mathbb{P}(X_nY_n<z)$ by sequences converging to $\Bbb P(XY<z)$.

I'm thinking about something similar to $$\mathbb{P}(XY<z-\delta)-\mathbb{P}(|X_n-X|>\delta)+?\leq\mathbb{P}(X_nY_n<z)\leq \mathbb{P}(XY<z+\delta)+\mathbb{P}(|X_n-X|>\delta)+?$$ but I can't see how to go about it exactly. Some hints would be much appreciated.


Set aside the convergence for a moment by taking $X_n=X$ for every $n$ and letting each $Y_n$ have the same distribution as $Y$. The question then reduces to the following: if $Y\overset{\mbox{law}}{=}Y'$, must $XY$ and $XY'$ be equal in distribution? As suggested by NCh, the answer is no: let $X$ take the values $0$ and $1$ with probability $1/2$ each, and set $Y=X$ and $Y'=1-X$. Then $XY=X^2$ has the same law as $X$, while $XY'=X(1-X)=0$ almost surely.
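The counterexample lives on a two-point sample space, so it can be verified mechanically. Here is a minimal Python sketch (the names `X`, `Y`, `Yp` are just illustrative) that computes the laws of $Y$, $Y'$, $XY$, and $XY'$:

```python
# Sample space: two equally likely outcomes w in {0, 1}, with X(w) = w.
omega = [0, 1]

X  = {w: w for w in omega}
Y  = {w: X[w] for w in omega}       # Y  = X
Yp = {w: 1 - X[w] for w in omega}   # Y' = 1 - X

def law(Z):
    """Distribution of Z as a dict: value -> probability."""
    d = {}
    for w in omega:
        d[Z[w]] = d.get(Z[w], 0.0) + 0.5
    return d

# Y and Y' have the same law...
assert law(Y) == law(Yp)                 # both are {0: 0.5, 1: 0.5}

# ...but the products X*Y and X*Y' do not.
XY  = {w: X[w] * Y[w]  for w in omega}   # X*Y  = X^2, same law as X
XYp = {w: X[w] * Yp[w] for w in omega}   # X*Y' = X(1-X) = 0
print(law(XY))   # {0: 0.5, 1: 0.5}
print(law(XYp))  # {0: 1.0}
```

The point is that equality in law of $Y$ and $Y'$ says nothing about their joint distribution with $X$, and the product depends on that joint distribution.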

In the setting of the opening post, the best one can say is that the sequence $\left(X_nY_n\right)_{n\geqslant 1}$ is tight (each factor is tight, since each converges), hence by Prokhorov's theorem it admits a subsequence converging in distribution. When $X$ is almost surely constant, the conclusion does hold: this is Slutsky's theorem.