MLE and method of moments estimator (example)


Let $X_1,X_2,\dots ,X_n$ be a random sample from the Gamma distribution

$$ f(x;\theta)=\theta^2 x e^{-\theta x},\quad x>0$$

To find the maximum likelihood estimator, we define the likelihood function as
$$ L(\theta;x_i)=\prod_{i=1}^nf(x_i;\theta)$$
and demand
$$\frac{\partial \ln[L(\theta;x_i)]}{\partial\theta}=0.$$
For this example,
$$L(\theta;x_i)=\theta^{2n}\cdot \prod_{i=1}^n x_i\cdot e^{-\theta \sum_{i=1}^nx_i}$$
$$\ln[L(\theta;x_i)]=2n\ln(\theta)+n\ln\bigg[\prod_{i=1}^nx_i\bigg]-\theta \sum_{i=1}^nx_i$$
so
$$\frac{2n}{\theta}-\sum_{i=1}^n x_i=0\iff \hat{\theta}=\frac{2}{\bar{X}}.$$
To find the estimator of $\theta$ using the method of moments, we equate the first sample moment with the first population moment,
$$\bar{X}=E(X)=\mu,$$
where
$$E(X)=\int_0^\infty x f(x)\,dx=\theta^2\int_0^\infty x^2 e^{-\theta x}\,dx=\theta^2\cdot\frac{2}{\theta^3}=\frac{2}{\theta}$$
$$\Rightarrow \tilde{\theta}=\frac{2}{\bar{X}}$$

Is there some kind of relation between the two? Why are the expressions of $\hat{\theta}$ and $\tilde{\theta}$ so similar?
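The density above is a Gamma distribution with shape $2$ and rate $\theta$ (scale $1/\theta$ in NumPy's parametrization). A quick numerical sketch, with an arbitrarily chosen true rate $\theta=1.5$, shows that the shared formula $2/\bar{X}$ recovers $\theta$ on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.5            # true rate, chosen arbitrarily for this check
n = 100_000

# f(x; theta) = theta^2 x e^{-theta x} is Gamma(shape=2, rate=theta),
# i.e. scale = 1/theta in NumPy's parametrization.
x = rng.gamma(shape=2.0, scale=1.0 / theta, size=n)

theta_mle = 2.0 / x.mean()   # MLE: 2 / sample mean
theta_mom = 2.0 / x.mean()   # method of moments gives the identical formula
print(theta_mle, theta_mom)  # both close to 1.5
```

Since both estimators reduce to the same expression, they agree exactly on every sample, not just asymptotically.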

Best answer:

For this example

$$L(\theta;x_i)=\theta^{2n}\cdot \prod_{i=1}^n x_i\cdot e^{-\theta \sum_{i=1}^nx_i}$$

Actually, this expression is correct, since $\prod_{i=1}^n e^{-\theta x_i}=e^{-\theta \sum_{i=1}^n x_i}$; the real mistake comes one step later. To see it, keep the product form. We have $f(x)=\theta^2 x e^{-\theta x}$, and we take the product over every $x_i$:

$$ L(\theta;x_i)=\prod_{i=1}^n \theta^2 x_i\cdot e^{-\theta x_i}=\theta^{2n}\cdot \prod_{i=1}^n x_i\cdot e^{-\theta x_i}$$

Note that in this form no sigma sign is involved yet: the exponential still sits inside the product. The sum appears only once we combine the exponentials or take the logarithm.

The actual mistake is at the next step, taking the logarithm: the term $n\ln\big[\prod_{i=1}^n x_i\big]$ in the question carries a spurious factor of $n$ (which happens not to affect the MLE, since the term is constant in $\theta$). It is right that $\theta^{2n}$ becomes the summand $2n\cdot \ln(\theta)$. Now we calculate $\ln\left(\prod\limits_{i=1}^n x_i\cdot \large{e^{-\theta x_i}}\right)$

First we use the logarithm rule $\ln(a\cdot b)=\ln(a)+\ln(b)$ to eliminate the product sign.

$$= \sum_{i=1}^n \ln \left( x_i\cdot \large{e^{-\theta x_i}} \right)$$

We use the same rule again for a further simplification.

$$= \sum_{i=1}^n \ln \left( x_i \right) + \sum_{i=1}^n \ln\left( \large{e^{-\theta x_i}} \right)$$

$$= \sum_{i=1}^n \ln \left( x_i \right) -\theta \sum_{i=1}^n x_i$$

Together with the summand $2n\cdot \ln (\theta)$ we have

$$\ln \left(L(\theta;x_i)\right)=2n\cdot \ln (\theta) +\sum_{i=1}^n \ln \left( x_i \right) -\theta \sum_{i=1}^n x_i$$

Setting the derivative with respect to $\theta$ equal to zero gives

$$\frac{2n}{\theta}-\sum_{i=1}^n x_i=0$$

For the rest, no logarithm rules are required: solving $\frac{2n}{\theta}=\sum_{i=1}^n x_i$ for $\theta$ yields $\hat{\theta}=\frac{2}{\bar{X}}$.
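As a sanity check on this derivation (a sketch; the true rate $0.7$, the sample size, and the grid bounds are arbitrary choices), maximizing $\ln L(\theta)=2n\ln(\theta)+\sum_i \ln(x_i)-\theta\sum_i x_i$ numerically over a grid lands on the closed-form solution $2/\bar{X}$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.gamma(shape=2.0, scale=1.0 / 0.7, size=5_000)  # true rate 0.7

def loglik(theta):
    # 2n ln(theta) + sum ln(x_i) - theta sum x_i, from the derivation above
    return 2 * len(x) * np.log(theta) + np.log(x).sum() - theta * x.sum()

grid = np.linspace(0.1, 2.0, 10_000)
theta_grid = grid[np.argmax(loglik(grid))]  # numerical maximizer
theta_closed = 2.0 / x.mean()               # closed-form MLE
print(theta_grid, theta_closed)             # agree to grid resolution
```

The grid maximizer matches $2/\bar{X}$ up to the grid spacing, which is consistent with $2n/\theta-\sum_i x_i=0$ having its unique root there.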