I have to prove the following Theorem:
Let $X = (X_1, X_2, \dots, X_n)^T$ be a random vector with joint density
$f_X(x) = k \exp\{-\frac{1}{2}(x-\mu)^T D (x-\mu)\}$
where $\mu \in \mathbb{R}^n$ and $D$ is symmetric and positive definite (positive semidefinite is not enough: $D$ must be invertible for $D^{-1}$, and hence the claimed distribution, to exist).
Then $X \sim N_n(\mu, D^{-1})$.
We have been told that a square root of a diagonalizable matrix $D$ is $PD_1^{\frac{1}{2}}P^{-1}$, where $D = PD_1P^{-1}$ with $D_1$ diagonal.
However, the proof of this theorem (I believe) requires a function $f(D)$ such that $\{f(D)\}^T\{f(D)\} = D$. This would hold if $f(D) = PD_1^{\frac{1}{2}}P^{-1}$ were symmetric, but I see no reason to believe that it is. Is it possible to find a $P$ such that $PD_1^{\frac{1}{2}}P^{-1}$ is indeed symmetric, or is there some other function $f(D)$ that makes this possible?
Provided this is possible, we can write:
$Y = \mu + f(D)^{-1}Z$
where $Z$ is a vector of iid $N(0, 1)$ random variables. The change-of-variables formula then gives:
$f_Y(y) = \frac{|f(D)|}{(2\pi)^{n/2}} \exp\{-\frac{1}{2}(y-\mu)^T f(D)^T f(D) (y - \mu)\} = \frac{|f(D)|}{(2\pi)^{n/2}} \exp\{-\frac{1}{2}(y-\mu)^T D (y - \mu)\}$
which matches $f_X$ (with $k = |f(D)|/(2\pi)^{n/2}$) and completes the proof?
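As a numerical sanity check of this change of variables (a sketch using NumPy; the particular $D$, $\mu$, and sample size here are arbitrary choices, not part of the problem), we can build the symmetric square root $f(D)$, simulate $Y = \mu + f(D)^{-1}Z$, and compare the empirical covariance of $Y$ with $D^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2x2 example: D symmetric and positive definite.
D = np.array([[2.0, 0.5],
              [0.5, 1.0]])
mu = np.array([1.0, -2.0])

# Symmetric square root f(D) via the spectral decomposition D = P D1 P^T.
w, P = np.linalg.eigh(D)
f_D = P @ np.diag(np.sqrt(w)) @ P.T

# Y = mu + f(D)^{-1} Z, with Z a matrix whose columns are iid N(0, I) vectors.
Z = rng.standard_normal((2, 200_000))
Y = mu[:, None] + np.linalg.inv(f_D) @ Z

# The empirical covariance of Y should be close to D^{-1}.
emp_cov = np.cov(Y)
print(np.round(emp_cov, 2))
print(np.round(np.linalg.inv(D), 2))
```

The two printed matrices should agree to roughly two decimal places at this sample size.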
For a real symmetric matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal, and within each eigenspace the eigenvectors can be orthonormalized (Gram–Schmidt). By the spectral theorem, then, $P$ can be chosen to be an orthogonal matrix, i.e., $P^T P = I$, so $P^{-1} = P^T$. Then $P D_1^{1/2} P^{-1} = P D_1^{1/2} P^T$, which is its own transpose since $D_1^{1/2}$ is diagonal; hence it is symmetric, as you wanted.
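A quick numerical illustration (a sketch; the matrix below is an arbitrary symmetric positive-definite example): `np.linalg.eigh` returns orthonormal eigenvectors for a symmetric input, so the resulting $P D_1^{1/2} P^T$ is symmetric and squares back to $D$.

```python
import numpy as np

# Hypothetical example: build a random symmetric positive-definite D.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
D = A @ A.T + 4 * np.eye(4)          # symmetric, positive definite

# Spectral decomposition D = P D1 P^T; eigh gives an orthogonal P.
eigvals, P = np.linalg.eigh(D)
sqrt_D = P @ np.diag(np.sqrt(eigvals)) @ P.T   # f(D) = P D1^{1/2} P^T

# f(D) is symmetric, so f(D)^T f(D) = f(D)^2 = D, as required.
print(np.allclose(sqrt_D, sqrt_D.T))       # True: symmetric
print(np.allclose(sqrt_D.T @ sqrt_D, D))   # True: squares back to D
```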