We observe $X$, distributed as $P_{\theta}$ with $\theta \in \mathbb{R}$ unknown. We have two estimators $\hat{\theta}_{1}$ and $\hat{\theta}_{2}$ of $\theta$ such that $$E_{X|\theta}(\hat{\theta}_{1}-\theta)^{2}=E_{X|\theta}(\hat{\theta}_{2}-\theta)^{2}$$ and $$P_{X|\theta}(\hat{\theta}_{1}=\hat{\theta}_{2})=0 \quad \forall \theta \in \mathbb{R}.$$
Is there an estimator $\hat{\theta}$ such that $$E_{X|\theta}(\hat{\theta}-\theta)^{2} < E_{X|\theta}(\hat{\theta}_{1}-\theta)^{2}$$ for all $\theta \in \mathbb{R}$?
Hint: writing $\sigma^2 = \sigma^2(\theta)$ for the common MSE of the two estimators, $$E_{X|\theta}\Big(\frac{\hat{\theta}_{1}+\hat{\theta}_{2}}{2}-\theta\Big)^{2}=E_{X|\theta}\Big(\frac{1}{2}(\hat{\theta}_{1}-\theta)+\frac{1}{2}(\hat{\theta}_{2}-\theta)\Big)^2=\frac{1+\rho(\hat{\theta}_1,\hat{\theta}_2)}{2}\sigma^2,$$ where $\rho(\hat{\theta}_1,\hat{\theta}_2)=E_{X|\theta}\big[(\hat{\theta}_1-\theta)(\hat{\theta}_2-\theta)\big]/\sigma^2$. By Cauchy–Schwarz, $\rho \le 1$, with equality only if $\hat{\theta}_1=\hat{\theta}_2$ almost surely; since $P_{X|\theta}(\hat{\theta}_1=\hat{\theta}_2)=0$, we have $\rho < 1$, so the averaged estimator strictly improves on both.
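As a sanity check on the hint, here is a minimal Monte Carlo sketch under an assumed toy model (not from the original question): $X=(X_1,X_2)$ i.i.d. $N(\theta,1)$, with $\hat{\theta}_1=X_1$ and $\hat{\theta}_2=X_2$, which have equal MSE and differ almost surely.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0          # arbitrary fixed true parameter (assumption for the demo)
n_trials = 100_000

# Toy model: X1, X2 iid N(theta, 1); estimators theta_hat_i = X_i.
# Both have MSE 1 and are equal with probability zero.
x1 = rng.normal(theta, 1.0, n_trials)
x2 = rng.normal(theta, 1.0, n_trials)

mse1 = np.mean((x1 - theta) ** 2)
mse2 = np.mean((x2 - theta) ** 2)
mse_avg = np.mean(((x1 + x2) / 2 - theta) ** 2)

# Empirical rho from the hint's formula, and the predicted MSE of the average:
sigma2 = np.sqrt(mse1 * mse2)
rho = np.mean((x1 - theta) * (x2 - theta)) / sigma2
predicted = (1 + rho) / 2 * sigma2

print(f"MSE(theta_hat_1)   ~ {mse1:.3f}")
print(f"MSE(average)       ~ {mse_avg:.3f}")
print(f"hint's prediction  ~ {predicted:.3f}")
```

Here the two estimators are independent, so $\rho \approx 0$ and the average roughly halves the MSE; the simulated MSE of the average matches the hint's $\frac{1+\rho}{2}\sigma^2$ prediction.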