Given a multivariate normal distribution $X \sim N(\beta, \Sigma)$, where $\Sigma$ is positive definite, how do I find the joint distribution of $y_i=-\exp(-x_i)$? The univariate case is straightforward using the change-of-variables formula, but I'm not sure whether I'm applying it correctly in the multivariate case.
The case I need this for is the following: I have a multivariate normal prior over two random variables, and have updated my beliefs following a number of draws. The distribution of $X$ is my posterior distribution. I now want to find the distribution of my utility $u(x)$, where $u(x)=-\exp(-x)$.
As an example, consider the bivariate case:
$$ \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = g\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}= \begin{pmatrix} -\exp(-x_1) \\ -\exp(-x_2) \end{pmatrix} $$
Each component of $g$ maps $(-\infty, \infty)$ onto $(-\infty,0)$ and is strictly increasing, so the change-of-variables formula should apply.
The inverse function is $g^{-1}(y_i)=-\log(-y_i)$, and the Jacobian matrix of this inverse is $$J=\begin{pmatrix}-1/y_1 & 0 \\ 0 & -1/y_2\end{pmatrix}=\begin{pmatrix}\exp(x_1) & 0 \\ 0 & \exp(x_2)\end{pmatrix}$$
Would the resulting density simply be $$p_Y(y)=\frac{\exp \left(-\tfrac12(x-\beta)^\top\Sigma^{-1}(x-\beta)\right)}{\sqrt{(2\pi)^2|\Sigma|}}\times |\det J|\,,$$ with $x_i=-\log(-y_i)$ substituted on the right-hand side?
Or am I going wrong somewhere?
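As a numerical sanity check of the conjectured density (the parameters $\beta$, $\Sigma$ and the grid bounds below are illustrative choices, not part of the question), one can verify that it integrates to 1 over $(-\infty,0)^2$:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import trapezoid

# Illustrative parameters; any mean and positive-definite covariance work
beta = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 1.0]])
phi = multivariate_normal(mean=beta, cov=Sigma)

def pdf_y(y1, y2):
    """Candidate joint density of (Y1, Y2) with Y_i = -exp(-X_i), y_i < 0:
    phi(-log(-y1), -log(-y2)) * |det J|, where |det J| = 1/(y1*y2) > 0."""
    x = np.stack([-np.log(-y1), -np.log(-y2)], axis=-1)
    return phi.pdf(x) / (y1 * y2)

# Grid over (-inf, 0)^2: image of x in [-4, 4] under y = -exp(-x),
# which covers all but ~1e-4 of the probability mass
g = -np.exp(-np.linspace(-4.0, 4.0, 1500))  # increasing from -e^4 toward 0-
Y1, Y2 = np.meshgrid(g, g, indexing="ij")
P = pdf_y(Y1, Y2)
total = trapezoid(trapezoid(P, g, axis=1), g)
print(total)  # should be very close to 1
```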
The joint distribution function of $Y_i=-\exp(-X_i)$ is $$\tag{1} \mathbb P\Big\{Y_1\le y_1,\dots,Y_n\le y_n\Big\}=\mathbb P\Big\{X_1\le -\log(- y_1),\dots,X_n\le -\log(-y_n)\Big\}\,. $$

Write $\Phi(x_1,\dots,x_n)$ for the CDF of the $n$-dimensional normal distribution $N(\beta,\Sigma)$. The RHS of (1) is then $$ \Phi\Big(-\log(-y_1),\dots,-\log(-y_n)\Big)\,. $$

The PDF you are looking for is then obtained by differentiating with respect to each $y_i$. By the chain rule this is $$ \varphi\Big(-\log(-y_1),\dots,-\log(-y_n)\Big)\frac{(-1)^n}{y_1\cdots y_n}\,,\quad y_i<0\,, $$ where $\varphi(x_1,\dots,x_n)$ is the PDF of $N(\beta,\Sigma)$.
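A quick simulation supports identity (1) on a single marginal (a standard normal marginal is used here purely for illustration): sample $X$, transform to $Y=-\exp(-X)$, and compare the empirical distribution of $Y$ against the CDF $\Phi(-\log(-y))$ with a Kolmogorov–Smirnov test. If the transformation is right, the statistic should be on the order of $1/\sqrt{n}$.

```python
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=100_000)  # X ~ N(0, 1), illustrative marginal
y = -np.exp(-x)                                   # Y = -exp(-X), supported on (-inf, 0)

# CDF implied by (1): P(Y <= y) = Phi(-log(-y)) for y < 0
res = kstest(y, lambda t: norm.cdf(-np.log(-t)))
print(res.statistic)  # small (~1/sqrt(n)) if the transformation is correct
```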