In information geometry, the determinant of the Fisher information matrix is a natural volume form on a statistical manifold, so it has a nice geometrical interpretation.
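To be concrete, the volume form I have in mind is the Riemannian volume element induced by the Fisher metric,

$$ I_{ij}(\theta) = \mathbb{E}_\theta\!\left[\frac{\partial \log p(X\mid\theta)}{\partial \theta_i}\,\frac{\partial \log p(X\mid\theta)}{\partial \theta_j}\right], \qquad dV(\theta) = \sqrt{\det I(\theta)}\;d\theta_1 \cdots d\theta_n. $$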
But what does it mean in statistics? Does it measure anything meaningful? (For example, I would guess that if it is zero, the parameters are locally redundant, i.e., the model is not identifiable there. Does this go any further?)
Also, is there any closed form to compute it?
Thanks.
Update: I posted a similar question on stats.se.
If the log-likelihood is approximately quadratic in the parameter (equivalently, the estimator is approximately normally distributed), then the Fisher information is the reciprocal of the variance of the estimator; loosely speaking, the lower the variance of the estimator, the more "information" the data provide about the parameter. The square root of the reciprocal of the Fisher information is therefore the standard error of the estimator (again assuming an approximately quadratic log-likelihood).
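To make the variance statement concrete, here is a small simulation sketch (my own illustration, not part of the original answer): for an exponential distribution with rate $\lambda$ the per-observation Fisher information is $1/\lambda^2$, so the variance of the MLE from $n$ observations should be close to $\lambda^2/n$.

```python
import numpy as np

# Illustrative sketch (my own example): for an exponential distribution with
# rate lam, the per-observation Fisher information is I(lam) = 1 / lam**2, so
# the MLE  lam_hat = 1 / mean(x)  from n observations should have variance
# roughly 1 / (n * I(lam)) = lam**2 / n.

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 500, 10_000

x = rng.exponential(scale=1 / lam, size=(reps, n))
mles = 1 / x.mean(axis=1)                      # MLE of the rate, one per replicate

print("empirical variance of the MLE:", mles.var())
print("asymptotic value lam**2 / n  :", lam**2 / n)
```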
However, if the log-likelihood is not quadratic, then the reciprocal of the Fisher information no longer represents the variance of the estimator. In that case you either need to reparameterize (transform the estimator) so that the log-likelihood becomes approximately quadratic, or use computational methods to determine the error.
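As a sketch of the computational route (a hypothetical example I made up, with my own choice of model and parameterization): maximize the log-likelihood numerically and use the approximate inverse Hessian at the optimum as a variance estimate for the parameters.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Hypothetical illustration: fit a gamma model by maximizing the log-likelihood
# numerically, then use the approximate inverse Hessian of the negative
# log-likelihood at the optimum as an estimate of the estimator's variance.

rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=2.0, size=200)      # toy data

def negloglik(theta):
    # Gamma(k, s) log-density: (k-1) log x - x/s - k log s - log Gamma(k),
    # with both parameters kept positive via a log-parameterization.
    k, s = np.exp(theta)
    return -np.sum((k - 1) * np.log(x) - x / s - k * np.log(s) - gammaln(k))

fit = minimize(negloglik, x0=np.zeros(2), method="BFGS")

# BFGS maintains a (rough) approximation to the inverse Hessian of the negative
# log-likelihood; its diagonal gives approximate variances of the estimates
# (here on the log scale), and their square roots are standard errors.
se = np.sqrt(np.diag(fit.hess_inv))
print("MLE of (log k, log s): ", fit.x)
print("approx. standard errors:", se)
```

For more accuracy one would replace the BFGS inverse-Hessian approximation with an explicit (e.g., finite-difference) Hessian of the log-likelihood at the optimum, but the idea is the same.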