I use MATLAB's fminunc function to find the minimum of a negative log-likelihood function $f(\overrightarrow{\theta})$ parametrized by three parameters, say $\overrightarrow{\theta}=(\beta,\alpha,w)$. Before inference I transform $\beta$ with an exponential transformation and the other two parameters with a sigmoid transformation $1/(1+\exp(-x))$. I apply these transformations to ensure my parameters satisfy certain constraints ($\beta>0$ and $0\leq\alpha,w\leq1$).
After inference I apply the transformations again to obtain $\hat{\beta},\hat{\alpha},\hat{w}$.
My question is: how do I calculate the variance of each inferred parameter?
I know that without the transformations I could use the inverse of the Hessian (which fminunc can estimate), but with the transformations the only approach I have found is the delta method. As I understand it, though, the delta method applies a single transformation to the whole random vector (e.g. given a random vector $\overrightarrow{x}$, apply a transformation $g(\overrightarrow{x})$), not a different transformation to each component.
Thanks a lot!
Assume the transformation $g$ is differentiable and write $$\hat x= g(x).$$ Let $\Sigma$ be the variance-covariance matrix of the untransformed parameters, i.e. the inverse of the Hessian returned by fminunc. If you additionally assume the parameter estimates are independent, $\Sigma$ is diagonal: $$\Sigma = \begin{bmatrix} \sigma_{x_1}^2 & 0 &0 \\ 0& \sigma_{x_2}^2&0\\0&0&\sigma_{x_3}^2\end{bmatrix}$$
Then calculate the Jacobian of $g$: $$J = \begin{bmatrix} \frac{\partial g_1(x)}{\partial x_1} & \frac{\partial g_1(x)}{\partial x_2} & \frac{\partial g_1(x)}{\partial x_3} \\ \frac{\partial g_2(x)}{\partial x_1} & \frac{\partial g_2(x)}{\partial x_2} & \frac{\partial g_2(x)}{\partial x_3} \\\frac{\partial g_3(x)}{\partial x_1} & \frac{\partial g_3(x)}{\partial x_2} & \frac{\partial g_3(x)}{\partial x_3}\end{bmatrix}$$
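In your case each component is transformed separately ($\beta$ via $e^x$, $\alpha$ and $w$ via the sigmoid $\sigma(x)=1/(1+e^{-x})$), so the off-diagonal partial derivatives vanish and the Jacobian reduces to a diagonal matrix. Using $\frac{d}{dx}e^{x}=e^{x}$ and $\sigma'(x)=\sigma(x)\bigl(1-\sigma(x)\bigr)$:

$$J = \begin{bmatrix} e^{x_1} & 0 & 0 \\ 0 & \sigma(x_2)\bigl(1-\sigma(x_2)\bigr) & 0 \\ 0 & 0 & \sigma(x_3)\bigl(1-\sigma(x_3)\bigr) \end{bmatrix}$$

where $x_1,x_2,x_3$ are the unconstrained parameters at the optimum.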
And then you can calculate the variance-covariance matrix of the transformed parameters with: $$\hat\Sigma = J\Sigma J^T$$
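As a minimal sketch of the whole calculation (in Python/NumPy rather than MATLAB, and with a made-up Hessian `H` and optimum `theta` standing in for the values fminunc would return), the delta method for your elementwise transforms looks like this:

```python
import numpy as np

# Illustrative stand-ins for fminunc's outputs at the unconstrained optimum:
# a positive-definite Hessian H and the estimated unconstrained parameters.
H = np.array([[40.0,  2.0,  1.0],
              [ 2.0, 30.0,  0.5],
              [ 1.0,  0.5, 25.0]])
theta = np.array([-0.2, 0.3, 1.1])

# Covariance of the unconstrained parameters: inverse of the Hessian.
Sigma = np.linalg.inv(H)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Elementwise transforms: beta = exp(theta_1), alpha = sigmoid(theta_2),
# w = sigmoid(theta_3), so the Jacobian is diagonal with the derivatives
# exp(x) and sigmoid(x) * (1 - sigmoid(x)).
deriv = np.array([
    np.exp(theta[0]),
    sigmoid(theta[1]) * (1 - sigmoid(theta[1])),
    sigmoid(theta[2]) * (1 - sigmoid(theta[2])),
])
J = np.diag(deriv)

# Delta method: covariance of the transformed parameters.
Sigma_hat = J @ Sigma @ J.T

# Standard errors of beta_hat, alpha_hat, w_hat.
std_errors = np.sqrt(np.diag(Sigma_hat))
```

Because the Jacobian is diagonal, each transformed variance is simply the untransformed variance scaled by the squared derivative, $\hat\sigma_i^2 = g_i'(x_i)^2\,\sigma_{x_i}^2$, plus the corresponding scaling of any off-diagonal covariances.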