Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with a $N(\mu,1)$ distribution, where $\mu \in [0,\infty)$. Let $\hat{\mu}$ be the MLE of $\mu$. Which of the following statements are true?
1) $\hat{\mu}$ is a consistent estimator of $\mu$.
2) $\hat{\mu}$ is biased for $\mu$.
3) $\bar{X}_n$ is sufficient for $\mu$.
4) $\hat{\mu} = \min(0, -\bar{X}_n)$
What I think:
$\bar{X}$ alone cannot be the MLE here, because the parameter is restricted to $[0,\infty)$ while $\bar{X}$ can take any value in $(-\infty,\infty)$.
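To make the constrained maximization explicit (a sketch using only the quantities already defined): up to an additive constant, the log-likelihood is
$$\ell(\mu) = -\frac{1}{2}\sum_{i=1}^{n}(X_i-\mu)^2 = -\frac{n}{2}(\bar{X}-\mu)^2 + \text{const},$$
which is increasing on $(-\infty,\bar{X}]$ and decreasing on $[\bar{X},\infty)$. Over the restricted parameter space $[0,\infty)$ the maximum is therefore attained at $\bar{X}$ when $\bar{X} \ge 0$, and at the boundary point $0$ otherwise.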
Rather, the MLE of $\mu$ is $$\hat{\mu} = \left\{ \begin{array}{cl} \bar{X} & \bar{X} \ge 0 \\ 0 & \text{otherwise,} \\ \end{array} \right.$$ i.e. $\hat{\mu} = \max(0,\bar{X})$. So the question now is: is this MLE unbiased? I know $\bar{X}$ is unbiased for $\mu$, but $\hat{\mu} = \max(0,\bar{X}) \ge \bar{X}$, with strict inequality on the event $\{\bar{X} < 0\}$, which has positive probability. Hence $E(\hat{\mu}) > E(\bar{X}) = \mu$, so the MLE is biased and option 2) is correct.
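A quick Monte Carlo sketch of this bias (the values of $\mu$, $n$, and the replication count are my own illustrative choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 0.1, 10, 200_000  # illustrative: true mean, sample size, replications

# reps independent samples of size n from N(mu, 1); row means are the xbar's
xbar = rng.normal(loc=mu, scale=1.0, size=(reps, n)).mean(axis=1)
mle = np.maximum(0.0, xbar)  # restricted MLE: max(0, xbar)

print(xbar.mean())  # close to mu = 0.1 (xbar is unbiased)
print(mle.mean())   # noticeably above 0.1 -> positive bias
```

The gap between the two averages is exactly the $E[\max(0,\bar{X}) - \bar{X}] > 0$ term from the argument above; it is largest when $\mu$ is near the boundary $0$.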
But is this MLE consistent? Bias alone does not rule out consistency, since the bias may vanish as $n \to \infty$. In fact $\bar{X}_n \to \mu$ almost surely, and $x \mapsto \max(0,x)$ is continuous, so $\hat{\mu} = \max(0,\bar{X}_n) \to \max(0,\mu) = \mu$. So option 1) seems correct after all, though I am not fully sure this argument is valid.
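A sketch of the consistency claim: the estimate should settle near the true $\mu$ as $n$ grows (again, $\mu = 0.1$, the seed, and the sample sizes are my own illustrative choices):

```python
import numpy as np

def restricted_mle(n, mu=0.1, seed=1):
    """Restricted MLE max(0, xbar) from one N(mu, 1) sample of size n."""
    rng = np.random.default_rng(seed)
    return max(0.0, rng.normal(loc=mu, scale=1.0, size=n).mean())

for n in (10, 1_000, 100_000):
    print(n, restricted_mle(n))  # settles near mu = 0.1 as n grows
```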
Since the variance is known, $\bar{X}_n$ should be sufficient for $\mu$. But I have a doubt: is sufficiency affected by the restriction of the parameter space?
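My understanding (please correct me if wrong) is that it is not: the Fisher–Neyman factorization of the joint density,
$$f(x_1,\dots,x_n;\mu) = (2\pi)^{-n/2}\exp\!\Big(-\tfrac{1}{2}\sum_{i=1}^{n}(x_i-\bar{x})^2\Big)\,\exp\!\Big(-\tfrac{n}{2}(\bar{x}-\mu)^2\Big) = h(x)\,g(\bar{x};\mu),$$
holds for every $\mu$, hence in particular for every $\mu \in [0,\infty)$. Restricting the parameter space does not change the factorization, so $\bar{X}_n$ remains sufficient.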
Comparing the last option with the MLE I found, it seems only partially correct: if $\bar{X}$ is negative, then $\min(0,-\bar{X}_n) = 0$, which matches, but if $\bar{X}$ is positive, then $\min(0,-\bar{X}_n) = -\bar{X}_n$, which has the wrong sign. That is why the last option seems wrong to me.
I think it would be correct if it were written as $\hat{\mu} = |\min(0,-\bar{X}_n)|$ or, equivalently, $\hat{\mu} = \max(0,\bar{X}_n)$.
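A tiny numerical check of that identity, $|\min(0,-x)| = \max(0,x)$ for every real $x$ (the grid of test points is arbitrary):

```python
import numpy as np

xs = np.linspace(-3.0, 3.0, 1_001)   # arbitrary grid of test points
lhs = np.abs(np.minimum(0.0, -xs))   # |min(0, -x)|
rhs = np.maximum(0.0, xs)            # max(0, x)

print(np.array_equal(lhs, rhs))  # True: the two forms agree everywhere
```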
Am I correct so far? And if I am wrong, what would the estimators be in this case? Kindly explain.