Let $\theta$ be a random parameter supported on $[0,1]$ with a positive density there, and, conditionally on $\theta$, let $X_1,X_2,\ldots \sim \mathrm N(\theta,1)$ be i.i.d. observations. Define $$ \delta_n := \sqrt{n} \Big(\hat \theta_n(X_1,\ldots, X_n) - \mathrm E[\theta \mid X_1,\ldots, X_n] \Big),$$ where $\hat\theta_n(X_1,\ldots, X_n)$ is the maximum likelihood estimator and $\mathrm E[\theta\mid X_1,\ldots, X_n]$, the posterior mean, is the Bayesian minimum-MSE estimator. Then, (i) $\delta_n\to 0$ as $n\to \infty$, and (ii) there exists $K\in \mathbb R$ such that $|\delta_n|<K$ for all $n$.
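As a sanity check, the claim is easy to probe numerically. The sketch below (not part of the proof) assumes a uniform prior on $[0,1]$ as one concrete "positive density", takes the MLE to be the sample mean clipped to the support, and computes the posterior mean by quadrature on a grid; the names `posterior_mean` and `delta_n` are mine.

```python
import numpy as np

rng = np.random.default_rng(0)


def posterior_mean(x, n_grid=10_001):
    """Posterior mean under an assumed Uniform(0,1) prior, via grid quadrature."""
    theta = np.linspace(0.0, 1.0, n_grid)
    n, sx = len(x), x.sum()
    # Log-likelihood up to a theta-free constant:
    #   -0.5 * sum_i (x_i - theta)^2  ==  sx*theta - 0.5*n*theta^2 + const
    ll = sx * theta - 0.5 * n * theta**2
    w = np.exp(ll - ll.max())  # unnormalized posterior weights on the grid
    return (theta * w).sum() / w.sum()


def delta_n(n, theta_true=0.3):
    """One realization of sqrt(n) * (MLE - posterior mean)."""
    x = rng.normal(theta_true, 1.0, size=n)
    mle = np.clip(x.mean(), 0.0, 1.0)  # MLE restricted to the support [0,1]
    return np.sqrt(n) * (mle - posterior_mean(x))


if __name__ == "__main__":
    for n in (10, 100, 1_000, 10_000):
        print(n, delta_n(n))
```

For an interior $\theta$ like $0.3$, the printed values shrink rapidly with $n$, consistent with (i); the interesting part of the claim is uniformity over realizations, including sample paths where $\bar X_n$ falls near or outside the boundary of $[0,1]$.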
The proof I have in mind is very technical; I'm wondering whether there is an easier way to prove this using a well-known theorem.