Uncertainty analysis in maximum likelihood estimation under constraint


I'm not from a statistical background, so please excuse my somewhat inaccurate (or even erroneous) phrasing; I'll try to phrase the problem as I understand it.

Maximum likelihood estimation (MLE) infers the most probable model parameters $X$ from a given set of data $D$. In my case I am using the least-squares method, where the MLE reduces to minimizing a quadratic loss function $L = \sum_i (D_i - f_i(X))^2$. I am also interested in the uncertainty of my estimate, which can then be calculated from the second derivatives $\frac{\partial^2 L}{\partial X_i \partial X_j}$, i.e. the Hessian matrix $H$.
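To make my unconstrained setup concrete, here is a minimal sketch of what I mean (the model, data, and noise level are all illustrative, not my actual problem): a straight-line fit, with the Hessian of $L$ computed by finite differences and turned into a covariance estimate under a Gaussian-noise assumption.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative model (not my actual one): f_i(X) = X[0]*t_i + X[1],
# i.e. a straight-line fit to noisy data.
t = np.linspace(0.0, 1.0, 20)
rng = np.random.default_rng(0)
sigma = 0.1  # assumed known noise level, used in the covariance formula below
D = 2.0 * t + 0.5 + rng.normal(scale=sigma, size=t.size)

def loss(X):
    """Quadratic loss L = sum_i (D_i - f_i(X))^2."""
    return np.sum((D - (X[0] * t + X[1])) ** 2)

res = minimize(loss, x0=np.zeros(2))  # unconstrained least-squares MLE

def hessian(f, X, eps=1e-5):
    """Central finite-difference Hessian of f at X."""
    n = X.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(X + ei + ej) - f(X + ei - ej)
                       - f(X - ei + ej) + f(X - ei - ej)) / (4 * eps**2)
    return H

H = hessian(loss, res.x)
# For Gaussian noise of variance sigma^2, the negative log-likelihood is
# L / (2 sigma^2) + const, so cov(X) ≈ (inverse Fisher) ≈ 2 sigma^2 * H^{-1}.
cov = 2 * sigma**2 * np.linalg.inv(H)
```

The factor $2\sigma^2$ comes from the fact that $L$ itself is $2\sigma^2$ times the negative log-likelihood, so the Hessian of $L$ overstates the Fisher information by exactly that factor.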

Now, I also have some constraints on my model parameters $X$, say $g(X) \geq 0$. And here come my questions:

  1. In this case, is the MLE by least squares just a constrained optimization of $L$ subject to $g(X) \geq 0$?
  2. If so, would it be correct to calculate the uncertainty of the constrained estimation from the bordered Hessian $H'$, which comes from the second derivative of the Lagrangian $\mathcal{L} = L + \lambda g(X)$? If so, how exactly?
  3. Finally, if 1 and 2 are true, does it mean that if my constraints are linear in $X$, then the Hessian $H$ coincides with the (Hessian block of the) bordered Hessian $H'$, and the uncertainty is not affected by the constraints?
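To make questions 1–3 concrete, here is a sketch of the constrained case I have in mind (again with an illustrative straight-line model and a made-up linear constraint on the slope, chosen so that it is active at the solution). It solves the constrained fit with `scipy.optimize.minimize` and assembles the bordered Hessian $\begin{pmatrix} 0 & \nabla g^T \\ \nabla g & H \end{pmatrix}$; since $g$ is linear, its second derivative vanishes and the Hessian block of the Lagrangian $\mathcal{L} = L + \lambda g(X)$ is just $H$.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: straight line f_i(X) = X[0]*t_i + X[1] plus noise.
t = np.linspace(0.0, 1.0, 20)
rng = np.random.default_rng(0)
D = 2.0 * t + 0.5 + rng.normal(scale=0.1, size=t.size)

def loss(X):
    return np.sum((D - (X[0] * t + X[1])) ** 2)

# Illustrative linear constraint g(X) = X[0] - 2.5 >= 0, deliberately
# chosen to be active (the unconstrained slope estimate is near 2.0).
def g(X):
    return X[0] - 2.5

res = minimize(loss, x0=np.array([3.0, 0.0]), method="SLSQP",
               constraints=[{"type": "ineq", "fun": g}])

def hessian(f, X, eps=1e-5):
    """Central finite-difference Hessian of f at X."""
    n = X.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(X + ei + ej) - f(X + ei - ej)
                       - f(X - ei + ej) + f(X - ei - ej)) / (4 * eps**2)
    return H

# g is linear, so the Hessian of the Lagrangian L + lambda*g equals the
# Hessian H of L; the bordered Hessian only appends the gradient of g.
H = hessian(loss, res.x)
grad_g = np.array([1.0, 0.0])  # dg/dX for this linear constraint
H_bordered = np.block([
    [np.zeros((1, 1)), grad_g[None, :]],
    [grad_g[:, None],  H],
])
```

In this run the constraint binds, so the solver returns a slope sitting on the boundary $X_0 = 2.5$ rather than the unconstrained minimizer.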

Also, I would appreciate it if anyone could provide me with some references in this direction. The books I've read mention MLE and the least-squares method, but they do not discuss the connection to optimization, and they do not cover the constrained case.

Thanks in advance.