I've been studying for my statistics exam with older tests that I found in various places. One of the questions I have found relating to the computation of the maximum likelihood estimator is confusing me. Unfortunately only the correct answer is provided, and not the actual steps. I have reproduced the question below:
Suppose $ X_1, X_2, \ldots, X_n $ are independent and identically distributed random variables with density function $$ f(x) = \frac{1}{2\sigma}\exp\left(-\frac{|x|}{\sigma}\right) $$
for $ x \in \mathbb{R} $, where $ \sigma > 0 $. What is the maximum likelihood estimator of $ \sigma $?
The answer provided with the test is $$ \frac{\sum^n_{i=1}|X_i|}{n} $$
My first question would be: why does this expression not depend on $ \sigma $? Aren't you supposed to compute the maximum using the derivative so that you can determine the value of $ \sigma $?
If this answer is actually correct, my second question would be how do you get to the answer? I have reproduced my attempt below, although it didn't get very far. Since we have a continuous distribution:
$$ L(\sigma) = f(x_1) f(x_2) \cdots f(x_n) $$ $$ L(\sigma) = \frac{1}{2\sigma} e^{-\frac{|x_1|}{\sigma}} \cdot \frac{1}{2\sigma} e^{-\frac{|x_2|}{\sigma}} \cdots \frac{1}{2\sigma} e^{-\frac{|x_n|}{\sigma}} $$ $$ L(\sigma) = \left(\frac{1}{2\sigma}\right)^n \prod^n_{i=1} e^{-\frac{|x_i|}{\sigma}} $$
Then, computing the log-likelihood: $$ \ell(\sigma) = \ln\left(\left(\frac{1}{2\sigma}\right)^n\right) + \sum^n_{i=1} -\frac{|x_i|}{\sigma} $$ $$ \ell(\sigma) = n\ln\left(\frac{1}{2\sigma}\right) - \frac{1}{\sigma}\sum^n_{i=1} |x_i| $$
Where would I go from here? Right now it does not seem that differentiating would make the $ \sigma $ disappear.
The likelihood function given the sample $(x_1,x_2,\ldots,x_n)\in\mathbb R^n$ is
$$L(\sigma)=\prod_{k=1}^n \left(\frac{1}{2\sigma}e^{-|x_k|/\sigma}\right)=\frac{1}{(2\sigma)^n}\exp\left(-\frac{1}{\sigma}\sum_{k=1}^n |x_k|\right)\quad,\,\sigma>0$$
The log-likelihood is
$$\ell(\sigma)=-n\ln(2\sigma)-\frac{1}{\sigma}\sum_{k=1}^n |x_k|$$
Differentiating, you get
$$\ell'(\sigma)=-\frac{n}{\sigma}+\frac{1}{\sigma^2}\sum_{k=1}^n |x_k|$$
Setting $\ell'(\sigma)=0$, you have $$\hat\sigma=\frac{1}{n}\sum_{k=1}^n |x_k|$$
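You can sanity-check this stationary point numerically. The sketch below (the seed, sample size, and true $\sigma$ are arbitrary choices for illustration) draws a Laplace sample with NumPy, evaluates $\ell(\sigma)$ on a grid, and confirms that the grid maximizer lands on $\frac{1}{n}\sum_k |x_k|$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_true = 2.0  # arbitrary choice for this sketch
n = 10_000
# numpy's laplace uses the same scale parameterization as the density f above
x = rng.laplace(loc=0.0, scale=sigma_true, size=n)

def log_likelihood(sigma, x):
    # ell(sigma) = -n*ln(2*sigma) - (1/sigma) * sum(|x_k|)
    return -len(x) * np.log(2 * sigma) - np.abs(x).sum() / sigma

# Evaluate ell on a fine grid of sigma values and locate its maximizer
grid = np.linspace(0.5, 5.0, 2000)
vals = np.array([log_likelihood(s, x) for s in grid])
sigma_grid_max = grid[vals.argmax()]

# Closed-form candidate from setting ell'(sigma) = 0
sigma_hat = np.abs(x).mean()
print(sigma_grid_max, sigma_hat)
```

The two numbers agree up to the grid spacing, which is exactly what the first-order condition predicts.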
Verify that the likelihood is maximized at $\hat\sigma$ by showing $\ell''(\hat\sigma)<0$: since $$\ell''(\sigma)=\frac{n}{\sigma^2}-\frac{2}{\sigma^3}\sum_{k=1}^n |x_k|$$ and $\sum_{k=1}^n |x_k|=n\hat\sigma$, you get $\ell''(\hat\sigma)=\frac{n}{\hat\sigma^2}-\frac{2n}{\hat\sigma^2}=-\frac{n}{\hat\sigma^2}<0$.
This would prove that the maximum likelihood estimator of $\sigma$ is $$\hat\sigma(X_1,\ldots,X_n)=\frac{1}{n}\sum_{k=1}^n |X_k|$$
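As a further check of this estimator, note that $|X|$ has mean $\sigma$ under this density, so $\hat\sigma$ should average out to $\sigma$ over repeated samples. A minimal simulation sketch (the seed, $\sigma$, and sample sizes are arbitrary choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_true = 1.5  # arbitrary choice for this sketch
reps, n = 1000, 500

# For each of `reps` samples of size n, compute sigma_hat = mean of |X_k|.
# Since E|X| = sigma for this density, the estimator is unbiased, so the
# average of the estimates should sit close to sigma_true.
estimates = np.abs(rng.laplace(0.0, sigma_true, size=(reps, n))).mean(axis=1)
print(estimates.mean())
```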
You are deriving an estimator of the parameter $\sigma$. How could an estimator of a parameter depend on the very parameter it is meant to estimate? Moreover, an estimator is a statistic, which by definition is a function of the sample observations alone, not of the parameter of interest.
By the method of maximum likelihood, you get an estimate $\hat\sigma$ of $\sigma$ based on the observed value of the sample $(x_1,\ldots,x_n)$.