I am quite confused about the concept of independence of random variables.
Everything seemed fine until random vectors were introduced. In our lecture notes it says:
$\vec{X}=(X_1,...,X_n)$ is a Gaussian random vector iff all linear combinations of its entries are Gaussian.
So far so good. We also know that when the random variables $X$ and $Y$ are both Gaussian and independent, $X+Y$ is also normal. From that one might conjecture that for a random vector to be Gaussian, all its entries must be Gaussian random variables AND independent, or else some linear combination might not be Gaussian. So $X_i \perp X_j$ would have to hold (at least pairwise).
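As a quick numerical sanity check of the claim that the sum of two independent Gaussians is again Gaussian (a sketch, assuming `numpy` is available): the empirical variance of $X+Y$ should be the sum of the variances, and the standardized third and fourth moments should match those of a normal distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.normal(0.0, 1.0, n)  # X ~ N(0, 1)
Y = rng.normal(0.0, 1.0, n)  # Y ~ N(0, 1), independent of X
Z = X + Y                    # should be N(0, 2)

var_Z = Z.var()                                 # expect ~2
skew_Z = ((Z - Z.mean())**3).mean() / Z.std()**3  # expect ~0
kurt_Z = ((Z - Z.mean())**4).mean() / Z.std()**4  # expect ~3
```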
It is also clear to me that Cov$(X,Y)=0$ does not imply that $X$ and $Y$ are independent.
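A standard counterexample makes this concrete (a sketch, assuming `numpy`): let $X \sim N(0,1)$ and $Y = SX$, where $S = \pm 1$ is an independent fair sign flip. Then $Y \sim N(0,1)$, Cov$(X,Y) = E[S]\,E[X^2] = 0$, yet $X$ and $Y$ are clearly dependent ($|Y| = |X|$), and $X+Y = (1+S)X$ is not Gaussian since it equals $0$ with probability $1/2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.normal(0.0, 1.0, n)
S = rng.choice([-1.0, 1.0], n)  # Rademacher sign, independent of X
Y = S * X                       # marginally N(0, 1)

cov_XY = np.cov(X, Y)[0, 1]        # expect ~0
frac_zero = np.mean(X + Y == 0.0)  # a genuine Gaussian has no atom at 0
```

Note that $(X, Y)$ here is *not* a Gaussian random vector in the sense of the definition above, even though both marginals are Gaussian: the linear combination $X+Y$ fails to be Gaussian.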
Later in the lecture, however, we had the following proposition:

Let $\vec{X}=(X_1,...,X_n)$ be a random vector. The components of $\vec{X}$ are independent iff its covariance matrix $\Sigma$ is diagonal.

That is confusing to me, because if this were true for any random vector, it should also hold for 2-dimensional ones $(X,Y)$, which would then imply that Cov$(X,Y)=0$ is equivalent to $X\perp Y$, which is not true.
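The objection above can be exhibited concretely (a sketch, assuming `numpy`): take $X \sim \text{Uniform}(-1,1)$ and $Y = X^2$. By symmetry Cov$(X,Y) = E[X^3] = 0$, so the covariance matrix of $(X,Y)$ is diagonal, yet $Y$ is a deterministic function of $X$, so the components are certainly not independent.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
X = rng.uniform(-1.0, 1.0, n)
Y = X**2  # deterministic function of X, hence dependent

cov_XY = np.cov(X, Y)[0, 1]     # expect ~0: diagonal covariance matrix
# The dependence shows up in higher moments, e.g. Cov(X^2, Y) = Var(X^2) > 0:
cov_X2Y = np.cov(X**2, Y)[0, 1]
```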
I have also read that there is a distinction between mutually independent and pairwise independent random variables; however, since we did not make that distinction in class, I did not search further in that direction.