Let $X_1, X_2, \ldots, X_n$ be a random sample from a population with pmf $$P_\theta(X=x)=\theta^x(1-\theta)^{1-x}, \quad x=0,1; \qquad 0 \leq \theta \leq \frac{1}{2}$$ Compare the method of moments estimator (MME) and the maximum likelihood estimator (MLE). Which one is preferred?
Since $E(X)=\theta$, we get $\hat{\theta}_\text{MME}=\bar{X}$ regardless of the constraint (but what if $\bar{X} > \frac{1}{2}$?).
And by writing out the likelihood function and taking the derivative, I got $$\hat{\theta}_\text{MLE}=\begin{cases}\bar{X} & \text{ if }0\leq \bar{X} \leq \frac{1}{2}\\ \frac{1}{2} & \text{ if } \bar{X}>\frac{1}{2}\end{cases}$$ It seems the MLE is better, but is there a justification for this?
It is quite possible that the observed value of the sample mean $\overline X(=\hat\theta_{\text{MME}})$ does not satisfy the constraint $\theta\in[0,1/2]$. In those cases, $\overline X$ is not an appropriate estimator of $\theta$: it takes values outside the parameter space.
But as you rightly obtained, $\hat\theta_{\text{MLE}}=\overline XI_{0\le \overline X\le \frac12}+\frac12I_{\overline X>\frac12}$, so the constraint $\theta\in[0,1/2]$ is taken care of. In general, the MLE respects the parameter space by construction, since it maximizes the likelihood over the allowed values of $\theta$ only. It also has other appealing optimality properties, such as asymptotic efficiency under regularity conditions.
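To make the two estimators concrete, here is a minimal sketch in Python (the function names `mme` and `mle` are mine, not standard): the MME is just the sample mean, while the MLE clips it at the boundary $\frac12$.

```python
def mme(xs):
    """Method-of-moments estimate: the sample mean, which may leave [0, 1/2]."""
    return sum(xs) / len(xs)

def mle(xs, upper=0.5):
    """MLE under the constraint theta in [0, 1/2]: the mean, truncated at 1/2."""
    return min(sum(xs) / len(xs), upper)

# A sample whose mean exceeds 1/2 shows the difference:
xs = [1, 1, 1, 0, 1]
print(mme(xs))  # 0.8 -- outside the parameter space
print(mle(xs))  # 0.5 -- clipped back to the boundary
```

Whenever $\bar X \le \frac12$ the two estimates agree; they differ exactly on the event $\bar X > \frac12$, which is what the MSE comparison below exploits.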
Formally, one can show here that the MLE is better than the MME in terms of mean squared error (MSE):
For $\theta\in[0,\frac12]$,
\begin{align} \operatorname{MSE}_{\theta}(\hat\theta_{\text{MLE}})&=\mathbb E_{\theta}(\hat\theta_{\text{MLE}}-\theta)^2 \\&=\sum_{0\le j\le \frac12}(j-\theta)^2\mathbb P_{\theta}(\overline X=j)+\sum_{j>\frac12}\left(\frac12-\theta\right)^2\mathbb P_{\theta}(\overline X=j) \end{align}
And
\begin{align} \operatorname{MSE}_{\theta}(\hat\theta_{\text{MME}})&=\mathbb E_{\theta}(\overline X-\theta)^2 \\&=\sum_{0\le j\le \frac12}(j-\theta)^2\mathbb P_{\theta}(\overline X=j)+\sum_{j>\frac12}(j-\theta)^2\mathbb P_{\theta}(\overline X=j) \end{align}
So for every $\theta\in[0,\frac12]$,
\begin{align} \operatorname{MSE}_{\theta}(\hat\theta_{\text{MLE}})-\operatorname{MSE}_{\theta}(\hat\theta_{\text{MME}})&=\sum_{j>\frac12}\left[\left(\frac12-\theta\right)^2-(j-\theta)^2\right]\mathbb P_{\theta}(\overline X=j) \end{align}
Since $\left(\frac12-\theta\right)^2-(j-\theta)^2=\left(\frac12+j-2\theta\right)\left(\frac12-j\right)<0$ for $j>\frac12$ and $\theta\in[0,\frac12]$ (the first factor is positive because $2\theta\le 1<\frac12+j$, and the second is negative),
$$\operatorname{MSE}_{\theta}(\hat\theta_{\text{MLE}})\le\operatorname{MSE}_{\theta}(\hat\theta_{\text{MME}})\quad\forall\,\theta\in\left[0,\frac12\right],$$
with strict inequality whenever $\mathbb P_{\theta}\left(\overline X>\frac12\right)>0$, i.e. for every $\theta\in\left(0,\frac12\right]$. (At $\theta=0$ we have $\overline X=0$ almost surely, so the two MSEs coincide.)
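The MSE comparison above can be checked numerically: since $n\overline X\sim\text{Binomial}(n,\theta)$, both MSEs are finite sums over $j=k/n$ and can be computed exactly. A quick sketch (the helper `mse_pair` is my own name, and the clipping threshold $\frac12$ is hard-coded):

```python
from math import comb

def mse_pair(n, theta):
    """Exact MSEs of the MME (sample mean) and the MLE (mean clipped at 1/2),
    using n*Xbar ~ Binomial(n, theta)."""
    mse_mme = mse_mle = 0.0
    for k in range(n + 1):
        p = comb(n, k) * theta**k * (1 - theta)**(n - k)  # P(Xbar = k/n)
        j = k / n
        mse_mme += (j - theta)**2 * p
        mse_mle += (min(j, 0.5) - theta)**2 * p
    return mse_mme, mse_mle

# The MLE never does worse, and is strictly better once theta > 0:
for theta in [0.0, 0.1, 0.3, 0.5]:
    mme_err, mle_err = mse_pair(10, theta)
    print(theta, mme_err, mle_err, mle_err <= mme_err)
```

Running this for a few values of $\theta$ reproduces the conclusion: equality at $\theta=0$ and a strict gap elsewhere on $(0,\frac12]$.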