I had the following problem:
Let $X_1,\dots,X_n \sim \mathcal{Be}(p)$ be Bernoulli random variables. Show that the mean square error of the estimator $\overline{X}=\frac{1}{n}\sum_{j=1}^n X_j$ for $p$ goes to $0$ as $n\rightarrow\infty.$
Assuming independence, I got $Var(\overline{X})=\frac{\sigma^2}{n},$ where $\sigma^2$ is the variance of the $X_j$; since $\overline{X}$ is unbiased, its MSE equals this variance, which indeed goes to $0$. But is the independence assumption necessary? I tried constructing a counterexample where the MSE does not converge, without success. Can you provide one?
You mean: is independence necessary? Yes. If the $X_j$ are allowed to be dependent, let them all be equal: $X_j = X_1$ for all $j$. (I.e. you flip a coin once, and just read off the same result every time thereafter.) Then $\overline{X} = X_1$ for every $n$; it is still unbiased, so its MSE is the constant $Var(X_1) = \sigma^2 = p(1-p)$, which does not $\to 0$ as $n \to \infty$.
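A quick Monte Carlo sketch of this counterexample (all names and parameter values here are my own, chosen for illustration): for independent samples the empirical MSE of $\overline{X}$ shrinks like $\sigma^2/n$, while for the fully dependent construction $X_j = X_1$ it stays pinned at $\sigma^2 = p(1-p)$ no matter how large $n$ gets.

```python
import random

random.seed(0)

p = 0.3          # Bernoulli success probability (illustrative choice)
TRIALS = 20000   # Monte Carlo repetitions

def mse(n, dependent):
    """Monte Carlo estimate of E[(X_bar - p)^2] for n Bernoulli(p) draws."""
    total = 0.0
    for _ in range(TRIALS):
        if dependent:
            # flip one coin and reuse it: X_j = X_1 for all j,
            # so X_bar = X_1 regardless of n
            xbar = 1.0 if random.random() < p else 0.0
        else:
            # n independent flips
            xbar = sum(1 for _ in range(n) if random.random() < p) / n
        total += (xbar - p) ** 2
    return total / TRIALS

sigma2 = p * (1 - p)  # variance of one Bernoulli(p) draw, here 0.21

for n in (10, 100, 1000):
    print(f"n={n:4d}  independent MSE ≈ {mse(n, False):.4f}  "
          f"dependent MSE ≈ {mse(n, True):.4f}  (sigma^2 = {sigma2})")
```

The independent column tracks $\sigma^2/n$ and vanishes, while the dependent column hovers around $0.21$ for every $n$, matching the argument above.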