I am trying to calculate the MLE of both parameters of the Gamma distribution. Let $X$ be $\Gamma(\gamma, \alpha)$-distributed.
Then the density function is given by $f(x) = \frac{\alpha^\gamma}{\Gamma(\gamma)} x^{\gamma -1} e^{-\alpha x}$ (shape $\gamma$, rate $\alpha$).
The Likelihood function is:
$L(x_1, \dots,x_n) = \prod_{i=1}^{n} f(x_i) = \prod_{i=1}^{n} \frac{\alpha^\gamma}{\Gamma(\gamma)} x_i^{\gamma -1} e^{-\alpha x_i} = \left(\frac{\alpha^\gamma}{\Gamma(\gamma)}\right)^n \times \left(\prod_{i=1}^n x_i\right)^{\gamma -1} \times e^{-\alpha \sum_{i=1}^n x_i}$
The Log Likelihood function is:
$\log L(x_1,\dots,x_n) = n\gamma\log(\alpha) - n\log(\Gamma( \gamma)) + (\gamma-1) \sum_{i=1}^n \log(x_i) - \alpha \sum_{i=1} ^n x_i$
For $\alpha$ I calculate the estimator:
$\frac{\partial \log L}{\partial \alpha} = \frac{n \gamma}{\alpha} - \sum_{i=1} ^n x_i \overset{!}{=} 0 $
$\Rightarrow \hat{\alpha} = \frac{n \gamma}{\sum_{i=1} ^n x_i} = \frac{\gamma}{\bar{x}}$
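As a quick numerical sanity check of this estimator (a sketch, assuming the true $\gamma$ is known; the parameter values and sample size are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
gamma_true, alpha_true = 2.0, 3.0  # arbitrary "true" shape and rate

# NumPy's gamma sampler takes shape and scale = 1/rate
x = rng.gamma(shape=gamma_true, scale=1.0 / alpha_true, size=100_000)

# alpha_hat = n * gamma / sum(x) = gamma / mean(x)
alpha_hat = gamma_true / np.mean(x)
print(alpha_hat)  # close to alpha_true = 3
```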
Is this the correct estimator for $\alpha$ ?
For $\gamma$ I get so far:
$\frac{\partial \log L}{\partial \gamma} = n \log(\alpha) - \frac{n \Gamma^{\prime}(\gamma)}{\Gamma(\gamma)} + \sum_{i=1}^n \log(x_i) \overset{!}{=} 0$
$\Rightarrow \frac{\Gamma^{\prime}(\gamma)}{\Gamma(\gamma)} = \log(\alpha) + \frac{1}{n}\sum_{i=1}^n \log(x_i) $
Now I do not know how to calculate the estimator for $\gamma$. I know that $\Gamma(\gamma) = (\gamma -1)!$ holds for integer $\gamma$, but I am not sure how this fact could help me.
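For what it's worth, the equation can at least be solved numerically: substituting $\hat{\alpha} = \gamma/\bar{x}$ into the equation for $\gamma$ leaves a single equation in $\gamma$ alone, $\psi(\gamma) - \log(\gamma) = \frac{1}{n}\sum \log(x_i) - \log(\bar{x})$, where $\psi = \Gamma'/\Gamma$ is the digamma function. A sketch using SciPy's digamma and a bracketing root finder (the parameter values and bracket are arbitrary choices of mine):

```python
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.0 / 3.0, size=10_000)  # true gamma=2, alpha=3

# Plugging alpha = gamma / mean(x) into the score equation for gamma gives:
#   digamma(gamma) - log(gamma) = mean(log x) - log(mean x)
rhs = np.mean(np.log(x)) - np.log(np.mean(x))
gamma_hat = brentq(lambda g: digamma(g) - np.log(g) - rhs, 1e-6, 1e3)
alpha_hat = gamma_hat / np.mean(x)
print(gamma_hat, alpha_hat)  # close to 2 and 3
```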
I am also not sure whether my derivative of $- n\log(\Gamma( \gamma))$ is correct. I hope someone can help me.
I have another question, because I will have to deal with it later as well: how do I show that an estimator (e.g. for $\alpha$) is biased or unbiased? Our definition of unbiasedness is $E_{\theta}[T] = \theta$, where I think $T$ represents the estimator. But does $\theta$ always have to come out? Or should the result be $\frac{\gamma}{\alpha}$, the expectation of a Gamma-distributed rv?
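To make the bias question concrete, here is a small Monte Carlo sketch for $\hat{\alpha} = n\gamma/\sum X_i$ with $\gamma$ treated as known (parameter values and sample size are my own arbitrary choices). Since $\sum X_i \sim \Gamma(n\gamma, \alpha)$ and $E[1/S] = \alpha/(n\gamma - 1)$ for $S \sim \Gamma(n\gamma, \alpha)$, one would expect $E[\hat{\alpha}] = \frac{n\gamma}{n\gamma - 1}\alpha \neq \alpha$:

```python
import numpy as np

rng = np.random.default_rng(2)
gamma_true, alpha_true, n = 2.0, 3.0, 10  # arbitrary values, small n
reps = 200_000

# Draw `reps` independent samples of size n; compute alpha_hat on each
samples = rng.gamma(shape=gamma_true, scale=1.0 / alpha_true, size=(reps, n))
alpha_hats = n * gamma_true / samples.sum(axis=1)

# Theory: E[alpha_hat] = n*gamma/(n*gamma - 1) * alpha = (20/19)*3 ≈ 3.158
print(alpha_hats.mean())
```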
Thanks in advance!