I am new to multivariate normals, so this may seem trivial. Let $X$ be a vector of independent, identically distributed normal random variables. I think that $A_1 X$ and $A_2 X$ are independent iff $A_1A_2^\top=0$, for the following reason:
Since $X$'s elements are iid, we can write $X \sim \mathcal{N}[\mu, \sigma^2 I]$, and so
$$ \begin{align} \mathrm{Cov}(A_1 X, A_2 X) &= \mathrm{E}[(A_1 X-A_1 \mu)(A_2 X-A_2 \mu)^\top] \\ &= A_1\mathrm{E}[(X-\mu)(X-\mu)^\top]A_2^\top \\ &= A_1(\sigma^2 I)A_2^\top \\ &= \sigma^2 A_1A_2^\top, \end{align} $$
which equals $0$ if and only if $A_1A_2^\top=0$; hence $A_1 X$ and $A_2 X$ are independent if and only if $A_1A_2^\top=0$.
Is this proof ok?
The calculation itself is correct. But so far, it only shows that $A_1A_2^\top=0$ is equivalent to $A_1X$ and $A_2X$ being *uncorrelated*, which in general is weaker than being independent. To deduce the desired property, you still need the argument that the stacked vector $\begin{pmatrix} A_1X \\ A_2X \end{pmatrix}$ is *jointly* normal (it is a linear transformation of the Gaussian vector $X$) and that jointly normal random vectors are independent if and only if they are uncorrelated. Note that marginal normality of $A_1X$ and $A_2X$ alone would not suffice: there are uncorrelated, marginally normal pairs that are not independent.
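As a quick sanity check of the covariance identity, here is a small NumPy sketch (the particular matrices $A_1$, $A_2$ are just illustrative choices with $A_1 A_2^\top = 0$): the empirical cross-covariance of $A_1X$ and $A_2X$ should be close to $\sigma^2 A_1A_2^\top = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative matrices with orthogonal rows, so A1 @ A2.T == 0:
# A1 extracts the first coordinate of X, A2 the second.
A1 = np.array([[1.0, 0.0, 0.0]])
A2 = np.array([[0.0, 1.0, 0.0]])
assert np.allclose(A1 @ A2.T, 0.0)

# X ~ N(mu, sigma^2 I) with iid components (arbitrary mu, sigma)
mu, sigma, n = 2.0, 1.5, 200_000
X = rng.normal(mu, sigma, size=(n, 3))

Y1 = X @ A1.T  # samples of A1 X, shape (n, 1)
Y2 = X @ A2.T  # samples of A2 X, shape (n, 1)

# Empirical cross-covariance; should be near sigma^2 * A1 @ A2.T = 0
cross_cov = ((Y1 - Y1.mean(0)).T @ (Y2 - Y2.mean(0))) / (n - 1)
print(cross_cov)
```

With $n = 200{,}000$ samples the printed value is close to zero, consistent with $\sigma^2 A_1A_2^\top = 0$; of course, the simulation only checks uncorrelatedness, which is exactly why the joint-normality step above is needed to conclude independence.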