The Cramér-Rao bound tells us that, under some regularity assumptions, the variance of an unbiased estimator $\hat{p}$ of a parameter $p$ satisfies $Var(\hat p)\geq \frac{1}{ \mathcal I(p)}$, where $\mathcal I(p)$ is the Fisher information.
In some cases, an estimator attains the bound: for example, if $X_1,\dots, X_n \sim Poi(\lambda)$ and $\hat \lambda=\overline X$, one can check that $\hat \lambda$ is an unbiased estimator of $\lambda$ and that $Var(\hat \lambda)=\frac{\lambda}{n}=\frac{1}{ \mathcal I(\lambda)}$.
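As a quick numerical sanity check of this Poisson case, here is a Monte Carlo sketch (the rate, sample size, and number of replications are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 3.0, 50, 200_000

# Draw `reps` independent samples of size n from Poisson(lam)
# and compute the sample mean of each one.
samples = rng.poisson(lam, size=(reps, n))
lam_hat = samples.mean(axis=1)

# Fisher information of the whole sample is I(lam) = n / lam,
# so the Cramer-Rao bound is lam / n.
crb = lam / n
print(np.var(lam_hat))  # should be close to crb
print(crb)
```

The empirical variance of $\hat\lambda$ matches the bound $\lambda/n$ up to Monte Carlo noise.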
However, I was told that there exist cases in which every unbiased estimator $\hat{p}$ of a parameter $p$ satisfies $Var(\hat p)\neq\frac{1}{ \mathcal I(p)}$, i.e. the bound is never attained. I tried to find such a case:
I thought of finding all unbiased estimators $\hat{p}$ of the parameter $p$ of a random variable $X \sim Ber(p)$. Writing $\hat p = U(X)$, unbiasedness means $p=\Bbb E(U(X))=U(0)\Bbb P(X=0)+U(1)\Bbb P(X=1)=U(0)(1-p)+U(1)p$ for all $p$. Letting $p$ go to $0$ gives $U(0)=0$, and then $U(1)=1$. Hence an unbiased estimator of $p$ is a function $U$ satisfying these conditions, for example $U=Id$ (i.e. $\hat p=X$), so $X$ is an unbiased estimator of $p$. Some computations then led me to the (unlucky) conclusion that every unbiased estimator of $p$ satisfies $Var(\hat p)=\frac{1}{ \mathcal I(p)}$. It is a nice result, but not at all the one I hoped for. Can I have an example of the kind of case I am looking for?
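The Bernoulli computation above can also be checked symbolically. Here is a sketch using sympy (variable names are mine), computing the Fisher information of a single observation and comparing $Var(X)=p(1-p)$ to $1/\mathcal I(p)$:

```python
import sympy as sp

p, x = sp.symbols('p x', positive=True)

# Likelihood of a single Bernoulli(p) observation x in {0, 1}.
f = p**x * (1 - p)**(1 - x)
score = sp.diff(sp.log(f), p)

# Fisher information: E[score^2] = sum over x in {0, 1} of score^2 * f.
info = sp.simplify(sum((score**2 * f).subs(x, k) for k in (0, 1)))

# Variance of the unbiased estimator p_hat = X.
var_phat = p * (1 - p)

print(info)                              # equals 1 / (p*(1-p))
print(sp.simplify(var_phat - 1 / info))  # 0: the bound is attained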
Yes -- take any model not belonging to the exponential family.
The proof of the CRLB hinges on applying Cauchy-Schwarz in the form $$\operatorname{Cov}(\hat p,Z)^2\leq\operatorname{Var}(\hat p)\operatorname{Var}(Z)$$ where $Z=\frac{\partial}{\partial p}\log f(X,p)$ is the score. So equality is only possible if (for each given $p$) $Z$ is almost surely a scalar multiple of $\hat p-p$, i.e. $$\frac{\partial}{\partial p}\log f(X,p)=a(p)(\hat p-p).$$ Integrating in $p$ then gives $$\log f(X,p)=A(p)\cdot\hat p(X)+B(p)+C(X),$$ which is precisely the condition for $f(X,p)$ to belong to the exponential family.
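For a concrete instance in the spirit of this answer, one can take the Laplace location model $f(x,\mu)=\frac{1}{2b}e^{-|x-\mu|/b}$ with $b$ known, which is not an exponential family in $\mu$. Its Fisher information per observation is $1/b^2$, so the CRB for $n$ samples is $b^2/n$, while the sample mean (unbiased for $\mu$) has variance $2b^2/n$, strictly above the bound. A Monte Carlo sketch (the parameter values are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, b, n, reps = 0.0, 1.0, 25, 200_000

x = rng.laplace(mu, b, size=(reps, n))
mean_hat = x.mean(axis=1)          # unbiased estimator of mu
median_hat = np.median(x, axis=1)  # MLE of the Laplace location, unbiased by symmetry

# Fisher information per observation is 1/b^2, so the CRB for n samples is b^2/n.
crb = b**2 / n
print(np.var(mean_hat))    # close to 2*b^2/n, twice the bound
print(np.var(median_hat))  # still above crb at finite n
print(crb)
```

The sample median does better than the mean here and attains the bound only asymptotically; at finite $n$ every unbiased estimator stays strictly above $b^2/n$, consistent with the exponential-family argument above.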