Problem using MLE on a gamma-distributed variable


I am making some kind of systematic error while working with maximum likelihood estimation. Could someone please point it out to me?

In my last assignment, I tried to find the MLE of $\theta$ for a gamma-distributed variable.

As follows:

The density function is

$$f(y)=\frac{1}{\theta^2}\,y\,e^{-y/\theta}$$

This can be separated into:

$$f(y)=\frac{1}{\theta^2}\cdot \frac{y}{1}\cdot e^{-y/\theta}$$

The joint probability of the $n$ outcomes can be written as

$$p(y)=\left(\frac{1}{\theta^2}\cdot \frac{y_1}{1}\cdot e^{-y_1/\theta}\right)\cdot\left(\frac{1}{\theta^2}\cdot \frac{y_2}{1}\cdot e^{-y_2/\theta}\right)\cdots\left(\frac{1}{\theta^2}\cdot \frac{y_n}{1}\cdot e^{-y_n/\theta}\right)$$

$$p(y)=\left(\frac{1}{\theta^2}\right)^n\cdot\prod \limits_{i=1}^n \left(\frac{y_i}{1}\cdot e^{-y_i/\theta}\right)$$

We convert this expression to logarithms:

$$p(y)=\ln\left(\frac{n}{\theta^2}\right) + \ln\sum {y_i} + \ln\left(\frac{\sum -y_i}{\theta}\right)$$

We now take the partial derivative with respect to $\theta$:

$$p'(y)= \frac{1}{\frac{n}{\theta^2}}\cdot\frac{-2n}{\theta^3}+\frac{1}{\frac{\sum -y_i}{\theta}}\cdot\frac{\sum y_i}{\theta^2}$$

$$p'(y)=\frac{-2}{\theta}+\frac{1}{\theta}$$

Setting the derivative to $0$, we get

$$\frac{-2}{\theta}=\frac{1}{\theta}$$

....which seems pretty disconcerting, if $\theta$ isn't approaching infinity.

The correct answer is supposed to be $\hat\theta =\frac{\bar y}{2}$

Where did I go astray?

Thankful for input/Magnus

Best answer:

Your logarithm of the expression is not right. You can use that $y_i=e^{\ln(y_i)}$:

$$p(y)=\left( \frac{1}{\theta ^2} \right)^n\cdot \prod \limits_{i=1}^n e^{\ln(y_i)-\frac{y_i}{\theta}}$$

Taking the logarithm gives

$$\ln(p)=n \cdot \ln \left( \frac{1}{\theta ^2} \right)+\sum \limits_{i=1}^n \ln(y_i)-\frac{1}{\theta}\sum \limits_{i=1}^n y_i$$


Here you have to be careful. It is $\ln\left[\left( \frac{1}{\theta ^2} \right)^n\right]=n \cdot \ln \left( \frac{1}{\theta ^2} \right)$, not $\ln \left( \frac{n}{\theta ^2} \right)$


$\ln(p)=-n \cdot \ln \left( \theta ^2 \right)+\sum \limits_{i=1}^n \ln(y_i)-\frac{1}{\theta}\sum \limits_{i=1}^n y_i$

$\ln(p)=-2n \cdot \ln \left( \theta \right)+\sum \limits_{i=1}^n \ln(y_i)-\frac{1}{\theta}\sum \limits_{i=1}^n y_i$
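
Before differentiating, a quick numerical cross-check of this log-likelihood can be reassuring. Here is a minimal sketch (not part of the original answer): it simulates data from a gamma distribution with shape $2$ and scale $\theta$, maximizes the log-likelihood above numerically, and compares the result with $\bar y/2$. The seed, sample size, and function name are my own choices for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta_true = 3.0
# Simulated sample from Gamma(shape=2, scale=theta_true),
# matching the density f(y) = (y / theta^2) * exp(-y / theta)
y = rng.gamma(shape=2.0, scale=theta_true, size=10_000)
n = len(y)

def neg_log_likelihood(theta):
    # ln p = -2n ln(theta) + sum(ln y_i) - (1/theta) sum(y_i), negated for minimization
    return -(-2 * n * np.log(theta) + np.sum(np.log(y)) - np.sum(y) / theta)

res = minimize_scalar(neg_log_likelihood, bounds=(0.01, 100.0), method="bounded")
print(res.x)         # numerical maximizer of the log-likelihood
print(y.mean() / 2)  # closed-form candidate: theta_hat = y-bar / 2
```

Both printed values should agree to several decimal places, which supports the algebra that follows.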

Setting the derivative with respect to $\theta$ equal to zero gives

$-2\frac{n}{\theta}+\frac{1}{\theta ^2} \sum \limits_{i=1}^n y_i=0$

Multiplying the equation by $\theta^2$

$-2n\theta+\sum \limits_{i=1}^n y_i=0$

I think you can take it from here.
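
For completeness, solving that last equation for $\theta$ gives exactly the estimator quoted in the question:

$$\hat\theta=\frac{1}{2n}\sum \limits_{i=1}^n y_i=\frac{\bar y}{2}$$

This is also consistent with the distribution itself: the density here is that of a gamma distribution with shape $2$ and scale $\theta$, so $E[Y]=2\theta$, and an estimate of $\theta$ near $\bar y/2$ is what one would expect.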