Context: 2nd year university statistics course textbook question
So I had to find two estimators (using method-of-moments and maximum likelihood estimation) of $\theta$ for a random sample $X_1, ..., X_n$ from a population with pmf $f(X=x)=\theta^x(1-\theta)^{1-x}$ for $x=0$ or $x=1$ where $\theta \in [0, 0.5]$ is a model parameter. I recognise this is a Bernoulli distribution.
I found that both methods gave the same estimator $T=\frac{1}{n} \sum^n_{i=1}X_i$ (the sample mean). The next part of the question required me to find the mean squared error of the two estimators. I have a couple of questions:
- Since the estimators are the same, does this mean their mean squared errors will be too?
- How should I go about calculating the mean squared error? I know $MSE(T) = Var(T)+[Bias(T)]^2$, but for the $Var(T)$ component I don't know how to calculate $E(T^2)$. Or would it be better to calculate it via $E[(T-\theta)^2]$?
Thanks
EDIT
The two estimators are not the same.
I do not know whether the exercise asks you to find the two MSEs analytically, but whenever $\overline{X}_n\leq\frac{1}{2}$ the two estimators coincide, and their common squared error is that of the sample mean, whose variance is $\frac{\theta(1-\theta)}{n}$. On the contrary, if $\overline{X}_n>\frac{1}{2}$ the first estimator (the unrestricted sample mean) falls outside the parameter space $[0, 0.5]$, so it does not make sense as an estimate of $\theta$.
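To see the difference numerically, here is a small Monte Carlo sketch comparing the MSE of the sample mean with that of the restricted estimator $\min(\overline{X}_n, 0.5)$. The function name `mse_estimators` and the chosen values of `theta`, `n`, and the replication count are my own illustrative assumptions, not part of the exercise:

```python
import random

def mse_estimators(theta, n, reps=20000, seed=0):
    """Monte Carlo estimate of the MSE of the method-of-moments
    estimator (the sample mean) and of the restricted MLE
    min(sample mean, 0.5) for Bernoulli(theta) samples of size n."""
    rng = random.Random(seed)
    se_mom = 0.0  # accumulated squared error, sample mean
    se_mle = 0.0  # accumulated squared error, restricted MLE
    for _ in range(reps):
        xbar = sum(1 if rng.random() < theta else 0 for _ in range(n)) / n
        se_mom += (xbar - theta) ** 2
        se_mle += (min(xbar, 0.5) - theta) ** 2
    return se_mom / reps, se_mle / reps
```

For any true $\theta \leq 0.5$, truncating at $0.5$ can only move the estimate closer to $\theta$, so the restricted MLE's Monte Carlo MSE never exceeds the sample mean's, and both are close to $\frac{\theta(1-\theta)}{n}$ when samples rarely exceed $0.5$.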
Restricted MLE
In this example the likelihood's domain is restricted to $\theta \in[0, 0.5]$, so it is self-evident that if $\overline{X}>0.5$ the likelihood is strictly increasing on that interval and its argmax lies on the boundary: $\hat{\theta}_{ML}=0.5$.
Let's look at the following example: toss an unfair coin 10 times and suppose we observe one of the two following cases:

- 3 successes in 10 tosses
- 7 successes in 10 tosses

In the first case the likelihood $\theta^3(1-\theta)^{7}$ attains its maximum at $\overline{X}=0.3$, inside $[0, 0.5]$; in the second, $\theta^7(1-\theta)^{3}$ is strictly increasing on $[0, 0.5]$, so the restricted maximum sits on the boundary, $\hat{\theta}_{ML}=0.5$.
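A quick grid search over $[0, 0.5]$ reproduces both cases; this is only a sketch, and the helper name `restricted_mle` and the grid size are my own choices:

```python
def restricted_mle(successes, n, grid=10001):
    """Grid-search argmax of the Bernoulli likelihood
    L(theta) = theta**s * (1 - theta)**(n - s) over theta in [0, 0.5]."""
    best_theta, best_like = 0.0, -1.0
    for i in range(grid):
        theta = 0.5 * i / (grid - 1)
        like = theta ** successes * (1 - theta) ** (n - successes)
        if like > best_like:
            best_theta, best_like = theta, like
    return best_theta
```

With 3 successes out of 10 the grid argmax is (up to grid resolution) the sample mean $0.3$; with 7 successes it is the boundary value $0.5$, matching the argument above.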
EDIT2:
Let's focus on the MSE of the MLE.
It depends on whether the sample mean exceeds $0.5$: when $\overline{X}_n\leq 0.5$ the MLE coincides with the sample mean, which is unbiased, so its MSE is just its variance:
$$\mathbb{V}[\overline{X}_n]=\frac{1}{n^2}n\mathbb{V}[X_1]=\frac{\theta(1-\theta)}{n}$$
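This variance formula can be checked exactly by enumerating the binomial distribution of the sum $\sum_i X_i$. The function name `exact_var_mean` is a hypothetical helper of my own:

```python
from math import comb

def exact_var_mean(theta, n):
    """Exact variance of the sample mean of n Bernoulli(theta) draws,
    computed by summing over the binomial distribution of the sum."""
    pmf = [comb(n, s) * theta**s * (1 - theta) ** (n - s) for s in range(n + 1)]
    mean = sum(p * (s / n) for s, p in enumerate(pmf))
    return sum(p * (s / n - mean) ** 2 for s, p in enumerate(pmf))
```

The result agrees with $\frac{\theta(1-\theta)}{n}$ to floating-point precision for any $\theta$ and $n$.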