Let $X,Y,X_n,Y_n: \Omega \rightarrow \mathbb{R}, n \ge 1$ be random variables.
- If for all $n\ge1$, $X_n$ and $Y_n$ are independent and if $(X_n,Y_n) \xrightarrow[]{d}(X,Y)$, then $X$ and $Y$ are independent.
I found that if two random variables are independent, then for all $(\xi_1,\xi_2) \in\mathbb{R}^2$ $$\mathbb{E}\big(e^{i\langle (\xi_1,\xi_2), (X_n,Y_n)\rangle}\big)=\mathbb{E}\big(e^{i\xi_1 X_n}\big)\cdot\mathbb{E}\big(e^{i\xi_2 Y_n}\big),$$ and if both factors converge, then $$\lim_{n\rightarrow \infty}\mathbb{E}\big(e^{i\langle (\xi_1,\xi_2), (X_n,Y_n)\rangle}\big)=\lim_{n\rightarrow \infty}\mathbb{E}\big(e^{i\xi_1 X_n}\big)\cdot\mathbb{E}\big(e^{i\xi_2 Y_n}\big)=\mathbb{E}\big(e^{i\langle (\xi_1,\xi_2), (X,Y)\rangle}\big).$$ But since it isn't given that $X_n$ and $Y_n$ converge weakly, how do I proceed?
It feels like I am missing something very trivial. I am trying to read *Brownian Motion: An Introduction to Stochastic Processes* by Schilling, and it feels like there is a lot I need to know before tackling this book. I have had a basic course in stochastic processes; can someone recommend a book to bridge the gap?
$(X_n,Y_n)$ converging to $(X,Y)$ in distribution does imply that $X_n$ converges to $X$ in distribution and $Y_n$ converges to $Y$ in distribution: the coordinate projections are continuous, so this follows from the continuous mapping theorem. Equivalently, set $\xi_2=0$ in the joint characteristic function: $$\mathbb{E}\big(e^{i\xi_1 X_n}\big)=\mathbb{E}\big(e^{i\langle (\xi_1,0),(X_n,Y_n)\rangle}\big)\xrightarrow[n\to\infty]{}\mathbb{E}\big(e^{i\langle (\xi_1,0),(X,Y)\rangle}\big)=\mathbb{E}\big(e^{i\xi_1 X}\big),$$ and likewise with $\xi_1=0$ for $Y_n$. So your argument using characteristic functions is correct, and it is also the cleanest way to prove the result: the joint characteristic function of $(X,Y)$ factors into the product of the marginal characteristic functions, which is equivalent to independence.
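As a numerical sanity check (not part of the proof), here is a sketch with a hypothetical concrete example: take $X_n, Y_n$ independent $N(0, 1+1/n)$, so that $(X_n,Y_n)\xrightarrow{d}(X,Y)$ with $X,Y$ independent standard normals. The empirical joint characteristic function should then factor into the product of the empirical marginals, up to Monte Carlo error, and should be close to the exact limit $e^{-(\xi_1^2+\xi_2^2)/2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200_000
n = 1000  # index in the sequence; variance 1 + 1/n is close to 1

def empirical_cf(samples, xi):
    # Empirical characteristic function: average of exp(i <xi, sample>)
    return np.mean(np.exp(1j * samples @ xi))

# Independent X_n ~ N(0, 1 + 1/n) and Y_n ~ N(0, 1 + 1/n)
xn = rng.normal(0.0, np.sqrt(1 + 1 / n), size=n_samples)
yn = rng.normal(0.0, np.sqrt(1 + 1 / n), size=n_samples)

xi1, xi2 = 0.7, -1.3

joint = empirical_cf(np.column_stack([xn, yn]), np.array([xi1, xi2]))
product = (empirical_cf(xn[:, None], np.array([xi1]))
           * empirical_cf(yn[:, None], np.array([xi2])))

# Exact characteristic function of the limit (X, Y) iid N(0, 1)
limit = np.exp(-0.5 * (xi1**2 + xi2**2))

print(abs(joint - product))  # factorization: small, of Monte Carlo order
print(abs(joint - limit))    # convergence to the limit: small
```

The two printed numbers are both of the order of the Monte Carlo error $\approx 1/\sqrt{200{,}000}$, illustrating the factorization for each fixed $n$ and the convergence of the joint characteristic function.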