Hello, I have some confusion about the independence of multivariate Gaussian distributions. Before getting there, I have studied the following results:
1) Let $X_1, \dots , X_m$ be $m$ random variables taking values in $\mathbb{R}^{d_1} , \dots , \mathbb{R}^{d_m}$ respectively. They are called independent if for every (measurable) $A_1 \subset \mathbb{R}^{d_1}, \dots , A_m \subset \mathbb{R}^{d_m}$ we have
$$P(X_1 \in A_1 , \dots , X_m \in A_m) = P(X_1 \in A_1) \dots P(X_m \in A_m)$$
Also, if $\phi_1 : \mathbb{R}^{d_1} \rightarrow \mathbb{R} , \dots, \phi_m:\mathbb{R}^{d_m} \rightarrow \mathbb{R}$ satisfy some regularity conditions (measurability is enough), then $\phi_1(X_1), \dots , \phi_m(X_m)$ are independent.
Now, let $X_1, X_2, X_3$ be independent $\mathcal{N}(0,1)$ random variables and consider
$$U = 2X_1 - X_2 - X_3 \qquad V = X_1 + X_2 + X_3, \qquad W = X_1 -3X_2 +2X_3$$
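In matrix form, writing $X = (X_1, X_2, X_3)^\top$, this is
$$\begin{pmatrix} U \\ V \\ W \end{pmatrix} = \begin{pmatrix} 2 & -1 & -1 \\ 1 & 1 & 1 \\ 1 & -3 & 2 \end{pmatrix} \begin{pmatrix} X_1 \\ X_2 \\ X_3 \end{pmatrix} = AX.$$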
What can I say about the independence of the random variables $U,V,W$?
The result just stated would lead me to think that $U,V,W$ are all independent (since the $\phi$'s here are simple linear transformations)...
Of course $X = (X_1, X_2, X_3) \sim \mathcal{N}(0,I)$, where $I$ is the $3 \times 3$ identity matrix. Then $U, V, W$ are jointly Gaussian, being linear transformations of a jointly Gaussian vector, so to check their independence it is sufficient to check that they are uncorrelated. But it turns out that $Cov(U,W) \neq 0$, so $U$ and $W$ are not independent. Why does the initial statement fail here? Thanks
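To be explicit about the covariance claim: since $Cov(X) = I$, we have $Cov(a^\top X, b^\top X) = a^\top b$ for any coefficient vectors $a,b \in \mathbb{R}^3$, so
$$Cov(U,V) = 2 - 1 - 1 = 0, \qquad Cov(V,W) = 1 - 3 + 2 = 0, \qquad Cov(U,W) = 2 + 3 - 2 = 3 \neq 0.$$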
EDIT:
I guess we would have here $d_1 = d_2 = d_3 = 3$, with $\phi_1 : \mathbb{R}^3 \rightarrow \mathbb{R}$ given by $(X_1, X_2, X_3) \mapsto 2X_1 - X_2 - X_3$, $\phi_2 : \mathbb{R}^3 \rightarrow \mathbb{R}$ given by $(X_1, X_2, X_3) \mapsto X_1 + X_2 + X_3$, and finally $\phi_3 : \mathbb{R}^3 \rightarrow \mathbb{R}$ given by $(X_1, X_2, X_3) \mapsto X_1 - 3X_2 + 2X_3$.
So the initial result would seem to tell me that $\phi_1(X), \phi_2(X), \phi_3(X)$ are independent, but $U = \phi_1(X)$ and $W = \phi_3(X)$ are not independent... why?
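As a quick numerical sanity check (a sketch in Python/NumPy, just for illustration; the variable names and the simulation are not part of the argument), the covariance matrix of $(U,V,W)$ can be computed as $AA^\top$ and compared with a Monte Carlo estimate:

```python
import numpy as np

# Coefficient matrix A, so that (U, V, W)^T = A (X1, X2, X3)^T
A = np.array([[2.0, -1.0, -1.0],   # U = 2*X1 -   X2 -   X3
              [1.0,  1.0,  1.0],   # V =   X1 +   X2 +   X3
              [1.0, -3.0,  2.0]])  # W =   X1 - 3*X2 + 2*X3

# Since Cov(X) = I, the covariance matrix of (U, V, W) is A @ A.T
print("A A^T =\n", A @ A.T)        # the (U, W) off-diagonal entry is 3, not 0

# Monte Carlo check: simulate iid N(0, 1) samples and estimate the covariance
rng = np.random.default_rng(0)
X = rng.standard_normal((500_000, 3))   # each row is one draw of (X1, X2, X3)
UVW = X @ A.T                           # each row is the corresponding (U, V, W)
print("empirical Cov =\n", np.cov(UVW, rowvar=False))
```

Both give (approximately) $Cov(U,V)=Cov(V,W)=0$ but $Cov(U,W)=3$, consistent with what I computed above.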
In order to understand the concept of independence better, I would like to consider the following example:
suppose that $X_1,X_2,X_3$ are iid $N(\theta;1)$.
$V=X_1+X_2+X_3$ is a complete and sufficient statistic (CSS) for $\theta$, while
$U\sim N(0;6)$ and $W\sim N(0;14)$.
This means that $U$ and $W$ are ancillary for $\theta$ and thus, calling upon Basu's theorem, the pairs $U,V$ and $V,W$ are independent.
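To check the ancillarity claim explicitly, the distributions of $U$ and $W$ do not depend on $\theta$, since
$$E[U] = 2\theta - \theta - \theta = 0, \quad Var(U) = 4 + 1 + 1 = 6, \qquad E[W] = \theta - 3\theta + 2\theta = 0, \quad Var(W) = 1 + 9 + 4 = 14,$$
whereas $V \sim N(3\theta;3)$ does depend on $\theta$.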