In parameter estimation using $N$ i.i.d. random variables $X_i$ distributed with pdf $f(x|\theta)$, under standard regularity conditions we have: $\sqrt{N}(\hat{\theta}_N-\theta)\rightarrow N(0,1/I(\theta))$ in distribution, where $\hat{\theta}_N$ is the MLE and $I(\theta)$ is the Fisher information.
Now, convergence in distribution is a rather weak notion. I would like a non-asymptotic result that bounds MSE($\hat{\theta}_N$)$\equiv\mathbb{E}[(\hat{\theta}_N-\theta)^2]$ as a function of $N$. But I couldn't find a result like this anywhere. I wonder if someone could help! My particular case of interest is the Bernoulli distribution (coin toss) with "head" probability $f(1|\theta)=\frac{1+\cos(I\theta)}{2}$.
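For what it's worth, in this Bernoulli model the Fisher information is easy to compute: with $p(\theta)=\frac{1+\cos(I\theta)}{2}$ one gets $p'(\theta)^2/\big(p(1-p)\big)=I^2$, i.e. the information is constant in $\theta$. So the asymptotic prediction is MSE $\approx 1/(NI^2)$. Here is a minimal Monte Carlo sketch checking that against the empirical MSE of the MLE (the values of $I$, $\theta$, $N$, and the trial count are arbitrary choices for illustration; the MLE inverts $p(\theta)$ on $[0,\pi/I]$, where the model is identifiable):

```python
import numpy as np

# Illustrative (assumed) values, not from the question:
I_const = 2.0
theta = 0.6                 # true parameter, inside (0, pi/I_const)
p = (1 + np.cos(I_const * theta)) / 2   # "head" probability

rng = np.random.default_rng(0)
N = 1000                    # sample size per dataset
trials = 20000              # Monte Carlo replications

# Each row is one dataset of N coin tosses.
X = rng.random((trials, N)) < p
p_hat = X.mean(axis=1)      # MLE of the head probability

# MLE of theta: invert p(theta) = (1 + cos(I*theta))/2 on [0, pi/I].
theta_hat = np.arccos(np.clip(2 * p_hat - 1, -1.0, 1.0)) / I_const

mse = np.mean((theta_hat - theta) ** 2)

# Asymptotic prediction from the Fisher information I(theta) = I_const^2:
print(mse, 1 / (N * I_const**2))
```

With these numbers the empirical MSE lands close to $1/(NI^2)=2.5\times10^{-4}$, consistent with the asymptotic result; of course this is only a sanity check, not the finite-sample bound I'm asking about.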