Find the maximum likelihood estimator $\hat{\theta}$ of $\theta$ given the pdf $f(x;\theta)=\frac{1}{\theta^2}\,x\,e^{-x/\theta}$
I'm stuck on solving this one. The answer sheet gives $\frac{\bar{X}}{2}$, so I think I'm supposed to arrive at $\hat{\theta}=\frac{\sum_{i=1}^n x_i}{2n}=\frac{\bar{X}}{2}$, but I'm having a hard time getting there. I've done a few problems like this and found the estimator successfully, but this one is giving me trouble. I think I'm making a computational error somewhere.
Here are the equations I get when I work through it:
$L(\theta)=\frac{1}{\theta^{2n}}\cdot e^{-\sum_{i=1}^n \frac{x_i}{\theta}}\cdot\left(x_1\cdot x_2\cdots x_n\right)$
$\ln\left(L(\theta)\right)=\ln\left(\frac{1}{\theta^{2n}}\cdot e^{-\sum_{i=1}^n \frac{x_i}{\theta}}\cdot\left(x_1\cdot x_2\cdots x_n\right)\right)$
$=-2n\ln(\theta)-\sum_{i=1}^n\frac{x_i}{\theta}+\ln\left(x_1\cdot x_2\cdots x_n\right)$
$\frac{d}{d\theta}\ln\left(L(\theta)\right)=\frac{d}{d\theta}\left(-2n\ln(\theta)-\sum_{i=1}^n\frac{x_i}{\theta}+\ln\left(x_1\cdot x_2\cdots x_n\right)\right)=0$
$\frac{d}{d\theta}\ln\left(L(\theta)\right)=-\frac{2n}{\theta}-\frac{d}{d\theta}\left(\sum_{i=1}^n\frac{x_i}{\theta}\right)+\frac{1}{x_1\cdot x_2\cdots x_n}=0$
$-\frac{2n}{\theta}-\left(\sum_{i=1}^n\frac{x_i}{\theta}\right)+\frac{1}{x_1\cdot x_2\cdots x_n}=0$
$-\frac{2n}{\theta}-\left(\sum_{i=1}^n\frac{x_i}{\theta}\right)=-\frac{1}{x_1\cdot x_2\cdots x_n}$
$-\frac{1}{\theta}\left(2n+\sum_{i=1}^n x_i\right)=-\frac{1}{x_1\cdot x_2\cdots x_n}$
$2n+\sum_{i=1}^n x_i=\frac{\theta}{x_1\cdot x_2\cdots x_n}$
$\left(2n+\sum_{i=1}^n x_i\right)\left(x_1\cdot x_2\cdots x_n\right)=\theta$ And this is not what I'm supposed to get.
I'm a bit out of practice with calculus, so I'm not really sure where the error is or how to fix it.
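Edit: as a sanity check, differentiating the log-likelihood symbolically (a sketch assuming SymPy, with a hypothetical sample of size $n=3$ just to make it concrete) does land on $\bar{X}/2$, so the slip must be in my hand derivative:

```python
import sympy as sp

# Symbolic check of the MLE for f(x; theta) = x * exp(-x/theta) / theta**2
# (hypothetical sample of size n = 3, just to make the check concrete)
n = 3
theta = sp.symbols('theta', positive=True)
xs = sp.symbols(f'x0:{n}', positive=True)

# log f(x_i; theta) = log(x_i) - x_i/theta - 2*log(theta)
ell = sum(sp.log(x) - x / theta - 2 * sp.log(theta) for x in xs)

# differentiate and solve d(ell)/d(theta) = 0 for theta
sol = sp.solve(sp.Eq(sp.diff(ell, theta), 0), theta)
print(sol)  # the sample mean divided by 2, i.e. (x0 + x1 + x2)/6
```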
The log-likelihood is $$\ell(\theta) = -2n \log \theta - \frac{n\overline{X}}{\theta} + \sum_{i=1}^n \log x_i.$$ Notice that the last term does not depend on $\theta$, so its derivative with respect to $\theta$ is zero (in your work you differentiated it as if it were a function of $\theta$). Differentiating $\ell$, and remembering that $\frac{d}{d\theta}\left(-\frac{n\overline{X}}{\theta}\right) = \frac{n\overline{X}}{\theta^2}$, we get $$\ell^\prime (\theta) = -\frac{2n}{\theta} + \frac{n\overline{X}}{\theta^2}.$$ Setting $\ell^\prime (\hat{\theta}) = 0$ and multiplying through by $\frac{\hat{\theta}^2}{n}$ gives $-2\hat{\theta} + \overline{X} = 0$, so $\hat{\theta} = \frac{\overline{X}}{2}$, as desired.
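A quick numerical check of this (a sketch assuming NumPy; the true $\theta$ and sample size are made-up values for the simulation): the pdf here is a Gamma density with shape $2$ and scale $\theta$, so we can simulate data and compare $\bar{X}/2$ against a brute-force maximization of $\ell$ on a grid:

```python
import numpy as np

# Simulate from f(x; theta) = x * exp(-x/theta) / theta**2, which is a
# Gamma(shape=2, scale=theta) density. theta_true and n are made-up values.
rng = np.random.default_rng(0)
theta_true = 3.0
n = 100_000
x = rng.gamma(shape=2.0, scale=theta_true, size=n)

# Closed-form MLE from the derivation above
theta_hat = x.mean() / 2

# Brute-force check: maximize ell(theta) on a grid; the sum(log x_i) term
# is constant in theta, so it is dropped before locating the maximum
grid = np.linspace(0.5, 10.0, 2000)
ell = -2 * n * np.log(grid) - n * x.mean() / grid
theta_grid = grid[np.argmax(ell)]

print(theta_hat, theta_grid)  # both should be close to theta_true = 3.0
```

Because $\ell$ depends on the data only through $\bar{X}$, the grid maximizer agrees with $\bar{X}/2$ up to the grid resolution.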