My supervisor has given me the following, which I'm trying to understand:
Let $X$ be a random vector with an associated probability density:
\begin{equation} X \sim \exp (-\alpha XL X^{\top}), \end{equation}
where $L$ is a symmetric positive semi-definite matrix (a graph Laplacian) and $\alpha$ is a scalar. I now need to show that the covariance matrix of $X$ is $K \sim (\alpha L)^{-1}$.

I'm not sure exactly how to do this for vectors. The covariance matrix is defined as:
\begin{equation} K_{XX} = E[XX^\top] - \mu_X \mu_X^\top. \end{equation}
The mean $\mu_X$ should be zero as far as I can see, since it amounts to integrating an odd function over a domain symmetric around $0$. Hence we get:
\begin{align*} K_{XX} &= E[XX^\top] \\ &\sim \int_{-\infty}^\infty XX^\top \exp (-\alpha XL X^{\top})\, dX. \end{align*}
At this point I'm not exactly sure how to proceed. If $x$ were a scalar variable, I could continue like this:
\begin{align*} K_{xx} &= E[x^2] \\ &\sim \int_{-\infty}^\infty x^2 \exp (-\alpha L x^2)\, dx \\ &\sim (\alpha L )^{-3/2}. \end{align*}
But this doesn't seem to give the right solution, and now I'm wondering whether that is because the matrix case behaves differently, or because my supervisor has made a mistake in the initial probability density, and it should instead have been:
\begin{equation} X \sim \exp (-\alpha^{2/3} XL^{2/3} X^{\top}). \end{equation}
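For what it's worth, here is a small numerical sanity check of the claimed $K \sim (\alpha L)^{-1}$, written as a sketch with my own choices: a 3-node path-graph Laplacian regularized by `eps * I` so it is invertible (a raw graph Laplacian is singular), a specific `alpha`, and the identification of the density $\exp(-\alpha X L X^\top)$ as a zero-mean Gaussian with precision matrix $2\alpha L$:

```python
import numpy as np

# Laplacian of a 3-node path graph, plus eps*I so it is positive
# definite (a graph Laplacian itself has a zero eigenvalue).
alpha, eps = 0.7, 0.5
L = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]]) + eps * np.eye(3)

# exp(-alpha * x L x^T) = exp(-1/2 * x (2 alpha L) x^T), i.e. a
# zero-mean Gaussian with precision 2*alpha*L and covariance
# (2*alpha*L)^{-1} -- proportional to (alpha*L)^{-1}.
K_theory = np.linalg.inv(2 * alpha * L)

# Draw samples and compare the empirical covariance to the theory.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(3), K_theory, size=200_000)
K_empirical = samples.T @ samples / len(samples)

print(np.max(np.abs(K_empirical - K_theory)))  # small (sampling error)
```

The empirical covariance matches $(2\alpha L)^{-1}$ up to sampling noise, so at least numerically the supervisor's $(\alpha L)^{-1}$ holds up to a constant factor.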