Showing convergence of joint random variables


Let $X_n\rightarrow X, Y_n \rightarrow c$ in distribution where $c$ is a constant. I want to show that $(X_n, Y_n) \rightarrow(X,c)$ in distribution.

My attempt:

By Skorohod's Theorem, there exist random variables $X_n',X',Y_n'$ on a common probability space such that $X_n'\rightarrow X', Y_n'\rightarrow c$ a.s. and $X_n'=^dX_n, X'=^dX, Y_n'=^dY_n$.

Since $(X_n',Y_n')\rightarrow (X',c)$ almost surely and $(X,c)=^d(X',c)$, we get the conclusion.

Is this proof legitimate? Also, is there another good proof of this result?

Thank you in advance. Any comment would be appreciated!


BEST ANSWER

You have \begin{align*} \| ( X_{n} , Y_{n} ) - ( X_{n} , c ) \| = | Y_{n} - c | \xrightarrow[]{\mathbb{P}} 0 , \end{align*} since convergence in distribution to a constant implies convergence in probability to that constant. Further, \begin{align*} \int f(X_{n} , c) \, d \mathbb{P} \xrightarrow[]{n \to \infty} \int f(X , c) \, d \mathbb{P} \qquad \text{ for all } f \text{ bounded and continuous on } \mathbb{R}^{2}, \end{align*} since $ f(\cdot, c) $ is bounded and continuous on $ \mathbb{R} $. This implies $ ( X_{n} , c ) \stackrel{d}{\to} ( X , c ) $. Now, the result follows from Slutsky's theorem.
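The key step above, that convergence in distribution to a constant implies convergence in probability, can be checked numerically. Here is a minimal Monte Carlo sketch; the specific choice $Y_n = c + Z/n$ with $Z$ standard normal is an assumption made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
c, eps, N = 2.0, 0.1, 100_000

for n in (1, 10, 100):
    # Y_n = c + Z/n with Z ~ N(0,1): converges to the constant c in distribution
    y = c + rng.normal(size=N) / n
    # empirical P(|Y_n - c| > eps) should shrink toward 0 as n grows
    print(n, np.mean(np.abs(y - c) > eps))
```

The printed exceedance probabilities decrease toward zero, which is exactly convergence in probability to $c$.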

ANOTHER ANSWER

It is enough to show that $\int e^{itX_n +is Y_n} dP \to (\int e^{itX} dP)e^{isc}$ for all $t,s \in \mathbb R$. Consider $\int \{e^{itX_n +is Y_n}-e^{itX}e^{isc}\} dP $. Let us show that $|\int (e^{itX_n+is Y_n}-e^{itX_n}e^{isc}) dP| \to 0 $ and $|\int (e^{itX_n} e^{is c}-e^{itX}e^{isc}) dP| \to 0 $. Since $|e^{itX_n}| \leq 1$ and $Y_n \to c$ in probability, the first statement follows. [You can use the version of the Dominated Convergence Theorem where the hypothesis is convergence in measure, rather than a.e. convergence.] The second statement follows from the convergence in distribution of $\{X_n\}$ to $X$.
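This characteristic-function argument can also be illustrated numerically. A minimal Monte Carlo sketch, where the specific choices $X_n \sim N(0, 1 + 1/n)$ and $Y_n = c + Z/n$ are assumptions made purely for illustration (for $X \sim N(0,1)$ the limit characteristic function is $e^{-t^2/2}e^{isc}$):

```python
import numpy as np

rng = np.random.default_rng(0)
c, t, s = 2.0, 0.7, -1.3
N = 200_000  # Monte Carlo sample size

def joint_cf(n):
    """Empirical E[exp(i(t X_n + s Y_n))] for one illustrative choice of X_n, Y_n."""
    x = rng.normal(0.0, np.sqrt(1.0 + 1.0 / n), N)  # X_n ~ N(0, 1 + 1/n) -> N(0,1)
    y = c + rng.normal(0.0, 1.0, N) / n             # Y_n -> c in probability
    return np.mean(np.exp(1j * (t * x + s * y)))

# Limit: characteristic function of X ~ N(0,1) at t, times e^{isc}
limit = np.exp(-t ** 2 / 2) * np.exp(1j * s * c)

for n in (1, 10, 100):
    print(n, abs(joint_cf(n) - limit))
```

The gap $|\int e^{itX_n + isY_n}\,dP - (\int e^{itX}\,dP)e^{isc}|$ shrinks as $n$ grows, matching the claim.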