I'm taking a probability class and my prof used the following theorem IIRC.
Let $g\sim\mathcal{N}(\mu,\Sigma)$ where $\Sigma$ is diagonal (I don't know if this condition is necessary), and let $u,v$ be fixed vectors with $\langle u,v\rangle=0$. Then $\langle g,u\rangle$ and $\langle g,v\rangle$ are independent.
Is this correct? If so, how can I prove it? I believe the following is a special case of the theorem: are the random variables $X + Y$ and $X - Y$ independent if $X, Y$ are independent and normally distributed? I tried to use the same technique to prove the theorem but got stuck.
Elaborating on Minus One-Twelfth's comment:
The pair $(g^\top u, g^\top v)$ is jointly normal. (Why?)
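To answer the "(Why?)": stack the two inner products into one vector; it is then a linear image of $g$, and any linear (or affine) image of a multivariate normal vector is again multivariate normal. In symbols:

```latex
% Stack u^T and v^T as the rows of a 2-by-n matrix A, so that
% (g^T u, g^T v)^T = A g is a linear transformation of the Gaussian g.
\begin{pmatrix} g^\top u \\ g^\top v \end{pmatrix}
  = \underbrace{\begin{pmatrix} u^\top \\ v^\top \end{pmatrix}}_{A}\, g
  \sim \mathcal{N}\!\left(A\mu,\; A\Sigma A^\top\right).
```

Note the off-diagonal entry of $A\Sigma A^\top$ is exactly $u^\top \Sigma v$, matching the covariance computed below.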
Thus independence is equivalent to $\text{Cov}(g^\top u, g^\top v) = 0$. The covariance is $$\begin{align}\text{Cov}(g^\top u, g^\top v) &= E[(g^\top u - E[g^\top u])(g^\top v - E[g^\top v])] \\ &= E[((g - \mu)^\top u)((g - \mu)^\top v)] \\ &= E[u^\top (g - \mu)(g-\mu)^\top v] \\ &= u^\top E[(g-\mu)(g-\mu)^\top] v \\ &= u^\top \Sigma v. \end{align}$$
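Specializing this to the $X+Y$, $X-Y$ case from the question: take $g=(X,Y)^\top$ with $X,Y$ independent, $\operatorname{Var}(X)=\sigma_1^2$, $\operatorname{Var}(Y)=\sigma_2^2$, and $u=(1,1)^\top$, $v=(1,-1)^\top$ (so $u^\top v = 0$):

```latex
\operatorname{Cov}(X+Y,\, X-Y)
  = u^\top \Sigma v
  = \begin{pmatrix} 1 & 1 \end{pmatrix}
    \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{pmatrix}
    \begin{pmatrix} 1 \\ -1 \end{pmatrix}
  = \sigma_1^2 - \sigma_2^2.
```

So $X+Y$ and $X-Y$ are independent exactly when $\sigma_1^2 = \sigma_2^2$, even though $u \perp v$ always holds.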
If $\Sigma = \sigma^2 I$ is a multiple of the identity matrix, then $u^\top \Sigma v = \sigma^2 u^\top v$, so independence is equivalent to $u^\top v = 0$. (The value of $\mu$ plays no role: it cancels in the covariance, as the computation above shows.) For a general diagonal $\Sigma$, however, orthogonality of $u$ and $v$ is not enough; independence is equivalent to $u^\top \Sigma v = 0$.
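A quick Monte Carlo sanity check of the formula $\operatorname{Cov}(g^\top u, g^\top v) = u^\top \Sigma v$ (the specific dimensions, vectors, and variances below are illustrative choices, not from the post):

```python
# Empirically verify Cov(g^T u, g^T v) = u^T Sigma v for two choices of Sigma.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200_000

u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 0.0])      # u^T v = 0, i.e. u is orthogonal to v

mu = np.array([2.0, -1.0, 0.5])     # nonzero mean: the covariance is unaffected

# Case 1: Sigma = sigma^2 I, so u^T Sigma v = sigma^2 u^T v = 0.
Sigma_iso = 4.0 * np.eye(3)
g = rng.multivariate_normal(mu, Sigma_iso, size=n_samples)
cov_iso = np.cov(g @ u, g @ v)[0, 1]
print(f"isotropic Sigma: empirical cov = {cov_iso:.4f}, predicted = 0")

# Case 2: diagonal Sigma with unequal variances, so
# u^T Sigma v = 1 - 9 = -8 != 0: the projections are correlated
# (hence dependent) even though u is orthogonal to v.
Sigma_diag = np.diag([1.0, 9.0, 2.0])
g = rng.multivariate_normal(mu, Sigma_diag, size=n_samples)
cov_diag = np.cov(g @ u, g @ v)[0, 1]
print(f"diagonal Sigma:  empirical cov = {cov_diag:.4f}, "
      f"predicted = {u @ Sigma_diag @ v}")
```

The first covariance comes out near $0$ and the second near $-8$, matching $u^\top \Sigma v$ in each case.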