Is it right to say that vectors $A_1 X$ and $A_2 X$ are independent iff $A_1A_2^\top=0$?


I am new to multivariate normals, so this may seem trivial. Let $X$ be a vector of independent, identically distributed normal random variables. I think that $A_1 X$ and $A_2 X$ are independent iff $A_1A_2^\top=0$, for the following reason:

Since $X$'s elements are iid, we can write $X \sim \mathcal{N}[\mu, \sigma^2 I]$, and so

$$ \begin{align} \mathrm{Cov}(A_1 X, A_2 X) &= \mathrm{E}[(A_1 X-A_1 \mu)(A_2 X-A_2 \mu)^\top] \\ &= A_1\mathrm{E}[(X-\mu)(X-\mu)^\top]A_2^\top \\ &=A_1A_2^\top\sigma^2, \end{align} $$

which equals $0$ if and only if $A_1A_2^\top=0$, which (I believe) holds if and only if $A_1 X$ and $A_2 X$ are independent.
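The covariance identity above is easy to check numerically. Below is a small sketch with hypothetical matrices $A_1, A_2$ chosen so that $A_1A_2^\top=0$; the empirical cross-covariance of $A_1X$ and $A_2X$ should then match $\sigma^2 A_1A_2^\top = 0$.

```python
import numpy as np

# Hypothetical example: rows of A1 and A2 are orthogonal, so A1 @ A2.T = 0
# and the formula Cov(A1 X, A2 X) = sigma^2 * A1 @ A2.T predicts zero.
rng = np.random.default_rng(0)
sigma, mu, n = 2.0, 1.0, 3            # X has n iid N(mu, sigma^2) entries
A1 = np.array([[1.0, 1.0, 0.0]])      # 1 x 3
A2 = np.array([[1.0, -1.0, 0.0]])     # 1 x 3, and A1 @ A2.T == 0

X = mu + sigma * rng.standard_normal((n, 200_000))  # columns are samples of X
Y1, Y2 = A1 @ X, A2 @ X
sample_cov = np.cov(Y1[0], Y2[0])[0, 1]    # empirical Cov(A1 X, A2 X)
theory_cov = (sigma**2 * A1 @ A2.T)[0, 0]  # = 0 by the derivation above
```

With 200,000 samples the empirical covariance is within sampling error of the theoretical value.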

Is this proof ok?


2 Answers

Best Answer

The calculation itself is correct. But so far, it only shows that $A_1A_2^\top=0$ is equivalent to $A_1X$ and $A_2X$ being uncorrelated which, in general, is weaker than being independent. In order to deduce the desired property, you still need the argument that $A_1X$ and $A_2X$ again have a normal distribution and that normally distributed random variables are independent if and only if they are uncorrelated.
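The distinction the answer draws can be made concrete in a simulation. For jointly Gaussian $A_1X$ and $A_2X$ with $A_1A_2^\top=0$, zero correlation forces full independence, so even *nonlinear* functions of the two should be uncorrelated. The sketch below (with hypothetical row vectors $A_1, A_2$) checks this using squares:

```python
import numpy as np

# Y1 = A1 X and Y2 = A2 X are jointly Gaussian; since A1 @ A2 = 0 they are
# uncorrelated, hence independent, so E[Y1^2 Y2^2] should factor as
# E[Y1^2] E[Y2^2] -- a stronger statement than Cov(Y1, Y2) = 0.
rng = np.random.default_rng(1)
A1 = np.array([1.0, 1.0]) / np.sqrt(2)
A2 = np.array([1.0, -1.0]) / np.sqrt(2)   # A1 @ A2 == 0

X = rng.standard_normal((2, 500_000))     # X ~ N(0, I), iid entries
Y1, Y2 = A1 @ X, A2 @ X
lin = np.mean(Y1 * Y2)                    # ~ Cov(Y1, Y2) = 0
nonlin = np.mean(Y1**2 * Y2**2) - np.mean(Y1**2) * np.mean(Y2**2)
# both lin and nonlin should be near 0, reflecting independence
```

For merely uncorrelated (not jointly Gaussian) variables, the nonlinear check can fail, as the second answer's counterexample shows.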

Another answer

In the general case, you have

$X$ and $Y$ are independent $\implies$ $\mathrm{Cov}(X,Y)=0$

but the reverse is not true in general.

Let's take $X$ a non-degenerate random variable with $E(X)=0$ and $0<E(X^2)<\infty$. Let $Y$ be a random variable, independent of $X$, taking the values $1$ and $-1$ each with probability $1/2$. Then we consider $Z=XY$. Clearly $Z$ and $X$ are not independent: $|Z|=|X|$, so knowing $X$ determines $Z$ up to sign.

But, since $E(X)=0$: $$\mathrm{Cov}(Z,X) = E(ZX) - E(Z)E(X) = E(X^2Y)$$ As $X$ and $Y$ are independent, $$E(X^2Y) = E(X^2)E(Y) = 0,$$ so $\mathrm{Cov}(Z,X)=0$ even though $Z$ and $X$ are dependent.
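This counterexample is also easy to simulate. The sketch below takes $X \sim \mathcal{N}(0,1)$ (one concrete choice satisfying the assumptions) and exposes the hidden dependence through the squares $Z^2$ and $X^2$:

```python
import numpy as np

# Counterexample: X ~ N(0,1), Y = +/-1 with prob 1/2 independent of X,
# Z = X*Y. Then Cov(Z, X) = 0, yet |Z| = |X|, so Z and X are dependent.
rng = np.random.default_rng(2)
n = 500_000
X = rng.standard_normal(n)
Y = rng.choice([-1.0, 1.0], size=n)
Z = X * Y

cov_zx = np.mean(Z * X) - Z.mean() * X.mean()  # ~ 0: uncorrelated
# The dependence shows up in the squares: Cov(Z^2, X^2) = Var(X^2) = 2 > 0
cov_z2x2 = np.mean(Z**2 * X**2) - np.mean(Z**2) * np.mean(X**2)
```

Note that $Z$ is itself standard normal here, so $(Z, X)$ is a pair of normal, uncorrelated, yet dependent variables; this does not contradict the accepted answer because $(Z, X)$ is not *jointly* normal.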