Parameter estimation of a normal distribution


Given a statistical sample $X_1,\dots ,X_n$ from a population with density $\mathcal N \left(\mu, \sigma^2\right)$, supposing that we know the value of $\sigma^2$, I must calculate the Cramér–Rao bound, find a suitable estimator, and determine whether this estimator is unbiased or consistent. Then, supposing that $\sigma^2$ is unknown, I must find for $\left(\mu, \sigma^2\right)$ one estimator with the method of moments, and one with maximum likelihood.

Actually most of the exercise is clear to me. I won't write the calculations here, but the Cramér–Rao bound is $\frac {\sigma^2} n$, and the unbiased estimator can be the sample mean, which is also consistent (by the law of large numbers). The sample mean (call it $T_n$) is also the method-of-moments estimator for $\mu$, and we can use $S_n:= \frac 1 n\sum_i X_i^2 - T_n^2$ to estimate $\sigma^2$, again by the method of moments. However, I have no idea how to find an estimator with maximum likelihood: I tried to find the maximum of $$ \prod_i \frac {1}{\sqrt { 2\pi\sigma^2}} \exp \left(-\frac{(X_i-\mu)^2}{2\sigma^2}\right), $$ but it does not seem like the right way to me. (Actually I tried to maximize the logarithm of that product, since it should be easier.) Any suggestions? Thanks a lot
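Not part of the original question, but the claims above can be checked empirically. A minimal sketch (the values of $\mu$, $\sigma^2$, $n$ and the repetition count are arbitrary choices) comparing the empirical variance of the sample mean $T_n$ with the Cramér–Rao bound $\sigma^2/n$:

```python
import random
import statistics

# Sketch: draw many samples of size n from N(mu, sigma^2), compute the
# sample mean T_n for each, and compare the empirical variance of T_n
# with the Cramér-Rao bound sigma^2 / n.
random.seed(0)
mu, sigma2, n, reps = 2.0, 4.0, 50, 20000
sigma = sigma2 ** 0.5

means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
         for _ in range(reps)]

emp_var = statistics.pvariance(means)  # empirical Var(T_n)
crb = sigma2 / n                       # Cramér-Rao bound
print(emp_var, crb)                    # the two should be close
```

The variance of $T_n$ sits right at the bound, consistent with the sample mean being efficient for $\mu$ when $\sigma^2$ is known.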


There are 2 best solutions below


So you are correct, and we have the likelihood function $$ \mathcal{L}(\mu, \sigma) = \prod_{i=1}^n \frac{\exp\left(-\frac{(X_i-\mu)^2}{2\sigma^2}\right)}{\sigma \sqrt{2\pi}} = \frac{\exp\left(\frac{-1}{2\sigma^2} \sum_{i=1}^n (X_i-\mu)^2\right)} {\sigma^n \left(\sqrt{2\pi}\right)^n} $$ and so you would like to maximize $$ \ln\left( \mathcal{L}(\mu, \sigma) \left(\sqrt{2\pi}\right)^n \right) = \frac{-1}{2\sigma^2} \sum_{i=1}^n (X_i-\mu)^2 - n \ln \sigma. $$ Can you now finish this 2-variable optimization problem?
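(Not part of the original answer.) Finishing the optimization gives the stationary point $\hat\mu=\bar X$ and $\hat\sigma^2=\frac1n\sum_{i=1}^n(X_i-\hat\mu)^2$, and perturbing either coordinate should lower the log-likelihood. A numerical sketch of that check, dropping the constant term exactly as in the formula above:

```python
import math
import random

def log_lik(x, mu, sigma2):
    # log-likelihood up to the additive constant -(n/2)*log(2*pi),
    # i.e. -(1/(2*sigma^2)) * sum (x_i - mu)^2 - n*log(sigma)
    n = len(x)
    return (-sum((xi - mu) ** 2 for xi in x) / (2 * sigma2)
            - n / 2 * math.log(sigma2))

random.seed(1)
x = [random.gauss(1.0, 2.0) for _ in range(200)]
mu_hat = sum(x) / len(x)
s2_hat = sum((xi - mu_hat) ** 2 for xi in x) / len(x)

best = log_lik(x, mu_hat, s2_hat)
# moving away from (mu_hat, s2_hat) in any coordinate lowers the value
for dmu, ds2 in [(0.1, 0), (-0.1, 0), (0, 0.1), (0, -0.1)]:
    assert log_lik(x, mu_hat + dmu, s2_hat + ds2) < best
print("stationary point is a maximum:", mu_hat, s2_hat)
```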


I think it is easier to see the likelihood in the following way. Remember that likelihoods are equivalent up to a multiplicative constant, so you can discard any factor that does not depend on the parameters:

$L(\mu;\sigma^2)\propto(\sigma^2)^{-\frac{n}{2}}\exp\left[-\frac{1}{2\sigma^2}\sum_i(X_i-\mu)^2\right]$. First maximize over $\mu$: for any fixed $\sigma^2$, the sum $\sum_i(X_i-\mu)^2$ is minimized at $\hat{\mu}=\bar{X}$.

Now your likelihood becomes

$L(\sigma^2)\propto(\sigma^2)^{-\frac{n}{2}}\exp\left[-\frac{1}{2\sigma^2}\sum_i(X_i-\bar{X})^2\right]$.

Take the logarithm...

$\ell(\sigma^2)=-\frac{n}{2}\log(\sigma^2)-\frac{1}{2\sigma^2}\sum_i(X_i-\bar{X})^2$

differentiate with respect to $\sigma^2$:

$\frac{\partial \ell}{\partial{\sigma^2}}=-\frac{n}{2\sigma^2}+\frac{1}{2\sigma^4}\sum_i(X_i-\bar{X})^2=0$

and get

$\hat{\sigma}^2=\frac{1}{n}\sum_i(X_i-\bar{X})^2=:S^2$
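As a quick check (not in the original answer, and assuming $S^2$ denotes the variance with divisor $n$, as solving the equation above gives), this closed form is exactly what Python's `statistics.pvariance` computes. A sketch with simulated data:

```python
import random
import statistics

# Sketch: the ML estimate (1/n) * sum (X_i - Xbar)^2 coincides with
# statistics.pvariance, the "population" variance with divisor n
# (NOT statistics.variance, which divides by n - 1).
random.seed(2)
x = [random.gauss(0.0, 1.5) for _ in range(100)]
xbar = sum(x) / len(x)
s2_mle = sum((xi - xbar) ** 2 for xi in x) / len(x)
print(s2_mle, statistics.pvariance(x))  # the two agree
```

Note that $S^2$ is the biased version of the sample variance; multiplying by $\frac{n}{n-1}$ gives the usual unbiased estimator.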