Some properties of convergence in distribution for two-dimensional random variables


I have this exercise where I want to check my solutions:

Let $X_n = (X_{n,1}, X_{n,2}), X = (X_1, X_2)$ be two-dimensional random vectors with $X_n \overset{d}{\rightarrow} X$. Show:

(a) It holds that $X_{n,i}\overset{d}{\rightarrow} X_i$ for $i \in \{1, 2\}$.

(b) It holds that $aX_{n,1} + bX_{n,2} \overset{d}{\rightarrow} aX_1 + bX_2$ for $a, b \in \mathbb{R}$, and $X_{n,1}X_{n,2} \overset{d}{\rightarrow} X_1X_2$.

(c) Provide an example demonstrating that $X_{n,i}\overset{d}{\rightarrow} X_i$ for $i \in \{1, 2\}$ is, in general, not sufficient to imply the conclusions in (b).

To prove (a), we use the Continuous Mapping Theorem: if $X_n \overset{d}{\rightarrow} X$ and $g$ is continuous, then $g(X_n) \overset{d}{\rightarrow} g(X)$. Since the projections $p_i: \mathbb{R}^2 \rightarrow \mathbb{R}$ defined by $p_i(x_1, x_2) = x_i$ are continuous, we obtain $X_{n,i} = p_i(X_n) \overset{d}{\rightarrow} p_i(X) = X_i$ for $i \in \{1, 2\}$.
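As a quick sanity check of (a), here is a small simulation sketch (my own hypothetical example, not from the exercise): take $X_n = \big((1 + 1/n)Z,\ Z\big)$ and $X = (Z, Z)$ with $Z \sim N(0,1)$, so $X_n \overset{d}{\rightarrow} X$, and compare the empirical CDFs of the first components at a few points.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)  # Z ~ N(0, 1)

n = 1000
xn1 = (1 + 1 / n) * z  # first component of X_n (hypothetical example)
x1 = z                 # first component of the limit X

def ecdf(samples, t):
    """Empirical CDF of the sample at point t."""
    return (samples <= t).mean()

# For large n, the distribution of X_{n,1} is close to that of X_1,
# so their empirical CDFs should agree up to simulation noise.
for t in (-1.0, 0.0, 1.0):
    assert abs(ecdf(xn1, t) - ecdf(x1, t)) < 0.01
```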

For (b), we consider the linear combination $Y_n = aX_{n,1} + bX_{n,2}$ with $a, b \in \mathbb{R}$. The map $(x_1, x_2) \mapsto ax_1 + bx_2$ is continuous, so the Continuous Mapping Theorem applied to $X_n \overset{d}{\rightarrow} X$ gives $Y_n \overset{d}{\rightarrow} aX_1 + bX_2$.
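The linear-combination claim can also be checked numerically. Continuing with the hypothetical example $X_n = (Z + 1/n,\ Z) \overset{d}{\rightarrow} X = (Z, Z)$, $Z \sim N(0,1)$, pick $a = 1$, $b = 2$ and compare empirical CDFs of $aX_{n,1} + bX_{n,2}$ and $aX_1 + bX_2 = 3Z$:

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)  # Z ~ N(0, 1)

n, a, b = 1000, 1.0, 2.0
xn1, xn2 = z + 1 / n, z   # components of X_n (hypothetical example)
yn = a * xn1 + b * xn2    # a X_{n,1} + b X_{n,2}
y = a * z + b * z         # a X_1 + b X_2 = 3Z

def ecdf(samples, t):
    """Empirical CDF of the sample at point t."""
    return (samples <= t).mean()

# The empirical CDFs of Y_n and the limit should nearly coincide.
for t in (-2.0, 0.0, 2.0):
    assert abs(ecdf(yn, t) - ecdf(y, t)) < 0.01
```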

For the product $Z_n = X_{n,1}X_{n,2}$, we can apply the Continuous Mapping Theorem again since the multiplication function $f(x_1, x_2) = x_1x_2$ is continuous. Therefore, we have $Z_n = X_{n,1}X_{n,2} \overset{d}{\rightarrow} X_1X_2$.
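The product case admits the same kind of check. With the same hypothetical sequence $X_n = (Z + 1/n,\ Z)$, the product $X_{n,1}X_{n,2} = Z^2 + Z/n$ should be distributed approximately like $X_1 X_2 = Z^2$ for large $n$:

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(100_000)  # Z ~ N(0, 1)

n = 1000
xn1, xn2 = z + 1 / n, z  # components of X_n (hypothetical example)
prod_n = xn1 * xn2       # X_{n,1} X_{n,2} = Z^2 + Z/n
prod = z * z             # X_1 X_2 = Z^2

def ecdf(samples, t):
    """Empirical CDF of the sample at point t."""
    return (samples <= t).mean()

# Empirical CDFs of the product and its limit should agree closely.
for t in (0.5, 1.0, 2.0):
    assert abs(ecdf(prod_n, t) - ecdf(prod, t)) < 0.01
```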

So far, have I made any mistakes?

For (c) I really don't know how to proceed. Can someone help me?