It is well known that for data described by a linear model plus Gaussian noise, the weighted sum of squared residuals (i.e., $-2 \ln \mathcal{L}$ up to an additive constant) is, at the maximum likelihood solution, $\chi^2$-distributed with $n - p$ degrees of freedom, where $n$ is the number of independent data points and $p$ the number of fitted parameters. In particular, the expected value of $-2 \ln \mathcal{L}$ is $n - p$.

Is there a way to write an approximate expression for the expected value of $-2 \ln \mathcal{L}$ when the model is nonlinear in the parameters? In other words, can one write an approximate expression for an effective/generalised number of degrees of freedom?

I was thinking of expanding the residuals in powers of $\hat{\boldsymbol{\theta}} - \boldsymbol{\theta}_{\mathrm{true}}$, where $\hat{\boldsymbol{\theta}}$ is the maximum likelihood estimate of the parameters $\boldsymbol{\theta}$, along the lines of Cox & Snell (1968) (https://www.jstor.org/stable/2984505). They give an expression for the expected value of the square of each residual (their Eq. 26), from which the expected value of $-2 \ln \mathcal{L}$ follows trivially. However, to the order they consider, their formula for additive Gaussian noise reduces to just $\langle -2 \ln \mathcal{L} \rangle = n - p$. I assume going to the next order would work? It starts to get messy, though; does anyone know whether that has already been worked out?
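For context, here is a minimal Monte Carlo sketch of the quantity I am asking about: it repeatedly fits a toy nonlinear model (an exponential decay; the model, noise level, and all names are my own illustrative choices, not from any reference) and estimates $\langle -2 \ln \mathcal{L} \rangle$ at the MLE, which to the order discussed above should be close to $n - p$:

```python
# Monte Carlo estimate of <chi^2_min> = <-2 ln L> (up to an additive
# constant) at the MLE for a mildly nonlinear model, compared to n - p.
# Toy model: y = a * exp(-b * x) + Gaussian noise of known sigma.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n, p = 20, 2                      # data points, fitted parameters
sigma = 0.05                      # known Gaussian noise level
x = np.linspace(0.0, 2.0, n)
a_true, b_true = 1.0, 1.3

def residuals(theta, y):
    """Standardised residuals, so sum(res**2) = chi^2."""
    a, b = theta
    return (a * np.exp(-b * x) - y) / sigma

chi2_min = []
for _ in range(2000):
    y = a_true * np.exp(-b_true * x) + sigma * rng.standard_normal(n)
    fit = least_squares(residuals, x0=[a_true, b_true], args=(y,))
    chi2_min.append(np.sum(fit.fun**2))  # chi^2 at the MLE

mean_chi2 = np.mean(chi2_min)
print(mean_chi2, n - p)  # close to n - p when the nonlinearity is mild
```

With this noise level the nonlinearity is weak, so the simulated mean lands very near $n - p$; the question is how to characterise the deviation analytically when the nonlinearity is not weak.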
Thank you.