Suppose a random variable $X$ has a binomial distribution with parameters $n$ (the number of independent trials) and $\theta $ (the probability of success on any trial).
Define the estimator of $\theta$: $P_1 = \dfrac{X}{n}.$
Question: Show that $P_1$ is the most efficient estimator amongst all unbiased estimators of $\theta$.
I would like to know what the best approach to this question may be as I am really quite unsure.
Would it be reasonable to find the variance of $P_1$ and show that it equals the Cramér–Rao lower bound, or is there an alternative method?
The Cramér–Rao inequality works, but there are a couple of other methods as well:

- the Lehmann–Scheffé theorem;
- building a UMVUE as $\mathbb{E}[T\mid S]$ (Rao–Blackwellization).

If you are interested, I will briefly summarize all three methods.
Let's begin with the fastest one: the Lehmann–Scheffé theorem.
If the model is Binomial (or Bernoulli, which is the same model), it is easy to verify that it belongs to the exponential family and that its canonical statistic $T=\sum_i X_i$ (the number of successes) is a complete sufficient statistic (and also minimal).
Moreover, $\frac{X}{n}$ (or $\frac{\sum_i X_i}{n}$ in the Bernoulli parametrization) is unbiased for $\theta$, since $\mathbb{E}\!\left[\frac{X}{n}\right]=\frac{1}{n}\,n\theta=\theta$. By Lehmann–Scheffé, an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE.
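For completeness, the exponential-family membership can be checked directly from the Bernoulli pmf:

$$f(x;\theta)=\theta^x(1-\theta)^{1-x}=(1-\theta)\exp\!\left(x\log\frac{\theta}{1-\theta}\right),\qquad x\in\{0,1\},$$

so the natural parameter is $\eta=\log\frac{\theta}{1-\theta}$ and the canonical statistic of the sample is $T=\sum_i X_i$; since the natural parameter space contains an open interval, $T$ is complete.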
For the Cramér–Rao inequality, in this case you can use the necessary and sufficient condition under which the inequality becomes an equality: verify that the score factorizes as
$$\sum_{i=1}^n\frac{\partial}{\partial\theta}\log f(x_i;\theta)=K(\theta,n)\,[t-\theta],$$
where $t$ is the observed value of the estimator.
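Concretely, for the Bernoulli likelihood this factorization works out as follows:

$$\frac{\partial}{\partial\theta}\log f(x_i;\theta)=\frac{x_i}{\theta}-\frac{1-x_i}{1-\theta}=\frac{x_i-\theta}{\theta(1-\theta)},$$

so summing over the sample gives

$$\sum_{i=1}^n\frac{\partial}{\partial\theta}\log f(x_i;\theta)=\frac{n}{\theta(1-\theta)}\left[\bar{x}-\theta\right],$$

i.e. $K(\theta,n)=\frac{n}{\theta(1-\theta)}$ and $t=\bar{x}$, so $P_1=\bar{X}$ attains the bound $\frac{\theta(1-\theta)}{n}$.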
The third way is to construct the UMVUE of $\theta$ yourself, defining it as $\mathbb{E}[T\mid S]$, where $T$ is any unbiased estimator (the simplest one you can find) and $S$ is the complete sufficient statistic.
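As a concrete sketch in the Bernoulli case: take $T=X_1$, which is unbiased since $\mathbb{E}[X_1]=\theta$, and condition on $S=\sum_{i=1}^n X_i$:

$$\mathbb{E}[X_1\mid S=s]=P(X_1=1\mid S=s)=\frac{\theta\binom{n-1}{s-1}\theta^{s-1}(1-\theta)^{n-s}}{\binom{n}{s}\theta^{s}(1-\theta)^{n-s}}=\frac{\binom{n-1}{s-1}}{\binom{n}{s}}=\frac{s}{n},$$

which recovers exactly $P_1=\frac{S}{n}$.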
All the calculations above are very easy and fast.
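If you want a quick numerical sanity check (not a proof), a short simulation sketch, assuming NumPy, confirms that the variance of $P_1=X/n$ matches the Cramér–Rao bound $\theta(1-\theta)/n$:

```python
import numpy as np

# Sanity check: for Binomial(n, theta), the variance of P1 = X/n
# should match the Cramer-Rao lower bound theta*(1-theta)/n.
rng = np.random.default_rng(0)
n, theta, reps = 50, 0.3, 200_000

x = rng.binomial(n, theta, size=reps)  # one Binomial draw per replication
p1 = x / n                             # the estimator P1 = X/n

emp_mean = p1.mean()                   # should be close to theta (unbiasedness)
emp_var = p1.var()                     # should be close to the CRLB
crlb = theta * (1 - theta) / n

print(emp_mean, emp_var, crlb)
```

The sample mean of the replications checks unbiasedness, and the sample variance sits right at the bound, consistent with $P_1$ being efficient.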