I know how to use the MLE method and the method of moments to estimate the parameter(s) of a given distribution. For example, for the parameter $\theta$ of a population $X \sim \mathrm{Ber}(\theta)$, the MLE of $\theta$ is $\hat{\theta} = \frac{1}{n}\sum_i x_i$.
However, that only estimates the parameters of the distribution. What if I want to estimate the variance of the distribution? I know it is possible to use the unbiased estimator of the variance that we learnt in a basic statistics course, but is there a universally accepted procedure for doing this after computing the MLE?
For example, for the population $X \sim \mathrm{Ber}(\theta)$, $V(X) = \theta(1-\theta)$, and the MLE of $\theta$ is $\hat{\theta} = \frac{1}{n}\sum_i x_i$. I can replace $\theta$ by $\hat{\theta}$ to estimate the variance, though the resulting estimator will surely be biased. Can I do the same thing in other situations, simply replacing the parameters with their MLEs?
Thanks a lot!
The answer to your question is yes. This is referred to as the invariance property of MLEs. We have the following theorem, taken from Casella & Berger's Statistical Inference (2nd edition, Theorem 7.2.10, p. 320).
Theorem 7.2.10. (Invariance property of MLEs)
If $\hat\theta$ is the MLE of $\theta$, then for any function $\tau(\theta)$, the MLE of $\tau(\theta)$ is $\tau(\hat\theta)$.
So in your example, if $\hat\theta$ is the MLE of the parameter $\theta$ of the Bernoulli distribution, then $\tau(\hat\theta)=\hat\theta(1-\hat\theta)$ is the MLE of the variance $\tau(\theta)=\theta(1-\theta)$. In general, the MLE of a function like this will be biased; here, for instance, $E[\hat\theta(1-\hat\theta)] = \frac{n-1}{n}\,\theta(1-\theta)$, so the plug-in estimator slightly underestimates the true variance.
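A quick simulation can make both the plug-in estimator and its bias concrete. This is a minimal sketch (using NumPy; the variable names and the choice $\theta = 0.3$, $n = 20$ are just illustrative): we repeatedly draw Bernoulli samples, form the plug-in MLE $\hat\theta(1-\hat\theta)$, and compare its average to the true variance $\theta(1-\theta)$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 20, 50_000  # illustrative values

# Draw `reps` independent Bernoulli(theta) samples, each of size n.
samples = rng.random((reps, n)) < theta

# MLE of theta for each sample: the sample mean.
theta_hat = samples.mean(axis=1)

# Plug-in MLE of the variance via the invariance property:
# tau(theta_hat) = theta_hat * (1 - theta_hat).
var_mle = theta_hat * (1 - theta_hat)

true_var = theta * (1 - theta)
print(f"true variance:           {true_var:.4f}")
print(f"mean of plug-in MLEs:    {var_mle.mean():.4f}")
print(f"(n-1)/n * true variance: {(n - 1) / n * true_var:.4f}")
```

The average of the plug-in estimates should land near $\frac{n-1}{n}\theta(1-\theta)$ rather than $\theta(1-\theta)$ itself, which is exactly the downward bias discussed above; it vanishes as $n \to \infty$, so the MLE is still consistent.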