How to find the MLE of the parameters of an inverse Gaussian distribution?

The pdf is $\Big( \frac{\lambda}{2\pi x^3} \Big)^{1/2} \exp \Big( -\frac{\lambda (x-\mu)^2}{2 \mu^2 x} \Big)$.

I believe $\hat\mu = \bar{X}$, but I can't seem to find $\hat\lambda$.

Consider the simpler case of $X_1,...,X_n \sim \text{IID InvN}(\mu, \lambda)$, which gives you the sampling density:

$$f(\mathbf{x} | \mu, \lambda) = \prod_{i=1}^n \Big( \frac{\lambda}{2 \pi x_i^3} \Big)^{1/2} \exp \Big( - \sum_{i=1}^n \frac{\lambda (x_i - \mu)^2}{2 \mu^2 x_i} \Big) \quad \quad \text{for all } \mathbf{x} > \mathbf{0}.$$

The log-likelihood function is:

$$\ell_\mathbf{x}(\mu, \lambda) = \text{const} + \frac{n}{2} \ln(\lambda) - \sum_{i=1}^n \frac{\lambda (x_i - \mu)^2}{2 \mu^2 x_i}.$$
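If you want to work with this function numerically, here is a minimal numpy sketch (the name `invgauss_loglik` is my own, not from any library) that evaluates the log-likelihood with the constant term included:

```python
import numpy as np

def invgauss_loglik(x, mu, lam):
    """Inverse Gaussian log-likelihood of the sample x, constant included."""
    x = np.asarray(x, dtype=float)
    return (0.5 * np.sum(np.log(lam / (2.0 * np.pi * x**3)))
            - np.sum(lam * (x - mu)**2 / (2.0 * mu**2 * x)))
```

For a single observation $x = 1$ with $\mu = \lambda = 1$ the exponent vanishes, so the value reduces to $-\tfrac{1}{2}\ln(2\pi)$.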

The partial derivatives are:

$$\begin{equation} \begin{aligned} \frac{\partial \ell_\mathbf{x}}{\partial \mu}(\mu, \lambda) &= \frac{n \lambda}{\mu^3} (\bar{x} - \mu), \\[10pt] \frac{\partial \ell_\mathbf{x}}{\partial \lambda}(\mu, \lambda) &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2 \mu^2} \sum_{i=1}^n \frac{(x_i - \mu)^2}{x_i}. \\[10pt] \end{aligned} \end{equation}$$
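These analytic partials are easy to cross-check against central finite differences; a self-contained sketch (the log-likelihood is redefined here so the snippet stands alone, and the test point $(\mu, \lambda) = (1.5, 2.5)$ is arbitrary):

```python
import numpy as np

def loglik(x, mu, lam):
    """Inverse Gaussian log-likelihood (constant included)."""
    return (0.5 * np.sum(np.log(lam / (2.0 * np.pi * x**3)))
            - np.sum(lam * (x - mu)**2 / (2.0 * mu**2 * x)))

x = np.array([0.5, 1.2, 2.0, 3.7])   # arbitrary positive sample
mu, lam, h = 1.5, 2.5, 1e-6
n, xbar = len(x), x.mean()

# analytic partial derivatives from the formulas above
d_mu_analytic = n * lam / mu**3 * (xbar - mu)
d_lam_analytic = n / (2 * lam) - np.sum((x - mu)**2 / x) / (2 * mu**2)

# central-difference approximations
d_mu_numeric = (loglik(x, mu + h, lam) - loglik(x, mu - h, lam)) / (2 * h)
d_lam_numeric = (loglik(x, mu, lam + h) - loglik(x, mu, lam - h)) / (2 * h)
```

The two pairs should agree to several decimal places.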

Setting the first of these partial derivatives to zero yields the MLE for the mean, $\hat{\mu} = \bar{x}$. Substituting this into the second gives the profiled partial derivative:

$$\begin{equation} \begin{aligned} \frac{\partial \ell_\mathbf{x}}{\partial \lambda}(\hat{\mu}, \lambda) &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2 \bar{x}^2} \sum_{i=1}^n \frac{(x_i - \bar{x})^2}{x_i} \\[6pt] &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2 \bar{x}^2} \sum_{i=1}^n \Big( x_i - 2 \bar{x} + \frac{\bar{x}^2}{x_i} \Big) \\[6pt] &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2 \bar{x}^2} \Big( n \bar{x} - 2 n \bar{x} + \bar{x}^2 \sum_{i=1}^n \frac{1}{x_i} \Big) \\[6pt] &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2 \bar{x}^2} \Big( \bar{x}^2 \sum_{i=1}^n \frac{1}{x_i} - n \bar{x} \Big) \\[6pt] &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{1}{2} \Big( \sum_{i=1}^n \frac{1}{x_i} - \frac{n}{\bar{x}} \Big) \\[6pt] &= \frac{n}{2} \cdot \frac{1}{\lambda} - \frac{n}{2} \Big( \frac{1}{\tilde{x}} - \frac{1}{\bar{x}} \Big) \\[6pt] &= \frac{n}{2} \Big[ \frac{1}{\lambda} - \Big( \frac{1}{\tilde{x}} - \frac{1}{\bar{x}} \Big) \Big], \\[6pt] \end{aligned} \end{equation}$$

where $\tilde{x} \equiv n / \sum_{i=1}^n x_i^{-1}$ is the sample harmonic mean of $\mathbf{x}$. Setting this partial derivative to zero gives the estimator:

$$\frac{1}{\hat{\lambda}} = \frac{1}{\tilde{x}} - \frac{1}{\bar{x}}.$$
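These closed-form estimators are easy to verify by simulation. A sketch using numpy, whose `wald` sampler is the inverse Gaussian under the (mean, scale) $= (\mu, \lambda)$ parameterisation (the helper name `invgauss_mle` is my own):

```python
import numpy as np

def invgauss_mle(x):
    """Closed-form MLEs: mu_hat is the arithmetic mean; 1/lambda_hat is the
    gap between the reciprocal harmonic and arithmetic means."""
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()                          # arithmetic mean (bar x)
    harm = len(x) / np.sum(1.0 / x)            # harmonic mean (tilde x)
    lam_hat = 1.0 / (1.0 / harm - 1.0 / mu_hat)
    return mu_hat, lam_hat

rng = np.random.default_rng(0)
sample = rng.wald(mean=2.0, scale=3.0, size=100_000)
mu_hat, lam_hat = invgauss_mle(sample)  # should land close to (2, 3)
```

With a sample this large, both estimates should sit within a few percent of the true parameters.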

Since $\tilde{x} \leqslant \bar{x}$ by the AM-HM inequality, the right-hand side is non-negative, so $\hat{\lambda} > 0$ for any non-degenerate sample. We confirm below$^\dagger$ that these critical points occur at a local maximum of the function. With a bit more work it can be shown that they are the global maximising values, and thus the MLEs.


$^\dagger$ Checking the second-order conditions: at the critical point above, we have the Hessian matrix:

$$\nabla^2 \ell_\mathbf{x}(\hat{\mu},\hat{\lambda}) = \begin{bmatrix} - \frac{n \hat{\lambda}}{\bar{x}^3} & 0 \\ 0 & - \frac{n}{2 \hat{\lambda}^2} \end{bmatrix},$$

which is negative definite. Hence, the critical point above is a local maximum of the log-likelihood function.
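As a numerical sanity check of these second-order conditions, the sketch below (numpy only; all names are my own) approximates the Hessian at the closed-form estimates by central differences and confirms it is negative definite:

```python
import numpy as np

def loglik(x, mu, lam):
    """Inverse Gaussian log-likelihood (constant included)."""
    return (0.5 * np.sum(np.log(lam / (2.0 * np.pi * x**3)))
            - np.sum(lam * (x - mu)**2 / (2.0 * mu**2 * x)))

rng = np.random.default_rng(1)
x = rng.wald(mean=2.0, scale=3.0, size=5_000)  # numpy's Wald = inverse Gaussian

# closed-form MLEs derived above
mu_hat = x.mean()
lam_hat = 1.0 / (np.mean(1.0 / x) - 1.0 / mu_hat)

# central-difference Hessian at the critical point
theta0 = np.array([mu_hat, lam_hat])
h = 1e-3
H = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        ei = np.eye(2)[i] * h
        ej = np.eye(2)[j] * h
        H[i, j] = (loglik(x, *(theta0 + ei + ej)) - loglik(x, *(theta0 + ei - ej))
                   - loglik(x, *(theta0 - ei + ej)) + loglik(x, *(theta0 - ei - ej))) / (4 * h * h)
```

The diagonal entries should agree with $-n\hat{\lambda}/\bar{x}^3$ and $-n/(2\hat{\lambda}^2)$, and the off-diagonal entries should be essentially zero.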