Independence of Multivariate Normal Distribution


$\mathbf {Problem:}$ Let $X \sim N_n(\mu, \Sigma)$. For column vectors $a$ and $b$ of length $n$, show that $a^T X$ and $b^T X$ are independent if and only if $a^T \Sigma b = 0$, where $N_n(\mu, \Sigma)$ denotes the $n$-dimensional multivariate normal distribution.

$\mathbf {My Attempt:}$ We know $a^TX \sim N(a^T\mu, a^T\Sigma a)$ and $b^TX \sim N(b^T\mu, b^T\Sigma b)$ (both univariate). To show independence, I need to show that the joint density of $Y=a^TX$ and $Z=b^TX$ factors as

$$ f_{Y,Z}(y,z) = f_{Y}(y) \cdot f_Z(z).$$

But how do I prove the above claim? Also, I don't see at which step I have to use $a^T \Sigma b = 0$. What about the converse? Any help would be much appreciated.

$\mathbf{Accepted\ Answer:}$

$\newcommand{\Cov}{\operatorname{Cov}}\newcommand{\E}{\mathbb E}$ An outline using MGFs.

Notice that $$M_X(v):=\E[\exp(v^TX)]=\exp\left(v^T\mu+\frac 1 2 v^T\Sigma v \right)$$ Moreover, for $s,t\in\mathbb R$: $$M_{X}(sa+tb)=\exp\left((sa+tb)^T\mu+\frac 1 2 (sa+tb)^T\Sigma (sa+tb) \right)$$ On the other hand: $$M_X(sa+tb) =\E[\exp((sa+tb)^TX)] = \E[\exp(s\,a^TX+t\,b^TX)]=M_{a^TX,\,b^TX}(s,t),$$ the joint MGF of $(a^TX, b^TX)$. A theorem about MGFs says two random variables are independent iff their joint MGF factors, i.e. $M_{Y,Z}(s,t)=M_Y(s)\,M_Z(t)$ for all $s,t$ in a neighborhood of the origin. So ask yourself: when do we have $$M_{a^TX,\,b^TX}(s,t)\stackrel{?}{=}M_{a^TX}(s)\,M_{b^TX}(t)$$ Expanding the quadratic form, $$(sa+tb)^T\Sigma(sa+tb)=s^2\,a^T\Sigma a+2st\,a^T\Sigma b+t^2\,b^T\Sigma b,$$ so the factorization holds for all $s,t$ iff the cross term $st\,a^T\Sigma b$ vanishes, i.e. iff $a^T\Sigma b=0$. This gives the iff statement about the independence of $a^TX$ and $b^TX$.
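As a numerical sanity check (not part of the proof; the specific $\mu$, $\Sigma$, $a$, $b$ below are arbitrary illustrative choices), one can simulate $X \sim N_2(\mu, \Sigma)$, pick $b$ so that $a^T\Sigma b = 0$, and verify that the sample covariance of $a^TX$ and $b^TX$ is close to zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary mean and covariance for illustration.
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

a = np.array([1.0, 0.0])
# a^T Sigma = (2.0, 0.5), so b = (0.5, -2.0) gives a^T Sigma b = 0.
b = np.array([0.5, -2.0])
assert abs(a @ Sigma @ b) < 1e-12

# Draw many samples of X ~ N_2(mu, Sigma) and project onto a and b.
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ a
Z = X @ b

# Zero covariance for jointly normal variables means independence,
# so the sample covariance of Y and Z should be near 0.
print(np.cov(Y, Z)[0, 1])
```

With 200,000 samples the Monte Carlo error of the sample covariance is on the order of $\sqrt{\operatorname{Var}(Y)\operatorname{Var}(Z)/n} \approx 0.006$ here, so the printed value should be small.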