Let $X_1,\dots,X_n$ be a random sample of size $n$ from a Bernoulli distribution with parameter $p$, where $0<p<1$ is unknown. (a) Find $\theta^2=Var(\bar{X})$. (b) Find the value of $c$ so that $c\bar{X}(1-\bar{X})$ is an unbiased estimator of $\theta^2$.
My attempt:
(a) Since we are sampling from a Bernoulli distribution, $\sum_{i=1}^n X_i \sim Binomial(n,p)$. Hence, $\mu_{\bar{X}}=\frac{1}{n}\mu_{\sum X_i}=\frac{1}{n}np=p,$ and
$Var(\bar{X})=\frac{1}{n^2}Var(\sum X_i)=\frac{1}{n^2}np(1-p)=\frac{p(1-p)}{n}$
(b) $E(\bar{X}^2)=Var(\bar{X})+\mu^2_{\bar{X}}=\frac{p(1-p)}{n}+p^2$. Hence, $E(c\bar{X}(1-\bar{X}))=c[E(\bar{X})-E(\bar{X}^2)]=c\left[p-\left(\frac{p(1-p)}{n}+p^2\right)\right]=c\left(\frac{np-p+p^2-np^2}{n}\right).$ For the estimator to be unbiased, this must equal $Var(\bar{X})$. Hence $c\left(\frac{np-p+p^2-np^2}{n}\right)=\frac{p(1-p)}{n}\Rightarrow c=\frac{1-p}{n-1+p-np}.$
Does this look ok? My answer for part (b) doesn't seem right to me.
You did fine.
Denoting $q=1-p$ (highly recommended in cases like these), you correctly arrive at $$\theta^{2}=n^{-1}pq$$ and $$\mathbb{E}\,\bar{X}\left(1-\bar{X}\right)=n^{-1}\left(n-1\right)pq.$$ The equation $$cn^{-1}\left(n-1\right)pq=\theta^{2}=n^{-1}pq$$ then leads to $$c=\frac{1}{n-1}.$$
You only forgot one (last) step: $$\frac{1-p}{n-1+p-np}=\frac{1-p}{\left(n-1\right)\left(1-p\right)}=\frac{1}{n-1}$$
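As a quick numerical sanity check (not part of the original answer), one can compute $E\left[c\bar{X}(1-\bar{X})\right]$ exactly by summing over the $Binomial(n,p)$ distribution of $S=\sum X_i$ (so $\bar{X}=S/n$) and confirm it equals $\theta^2=p(1-p)/n$ when $c=1/(n-1)$. The function name and the particular values of $p$ and $n$ below are illustrative choices, not from the thread:

```python
from math import comb

def expected_estimator(p, n):
    """Exact E[c * Xbar * (1 - Xbar)] with c = 1/(n-1), computed by
    summing over the Binomial(n, p) pmf of S = sum of the X_i."""
    c = 1 / (n - 1)
    total = 0.0
    for s in range(n + 1):
        pmf = comb(n, s) * p**s * (1 - p)**(n - s)  # P(S = s)
        xbar = s / n
        total += pmf * c * xbar * (1 - xbar)
    return total

# Example values (arbitrary): the exact expectation matches p(1-p)/n.
p, n = 0.3, 10
print(expected_estimator(p, n))  # 0.021 up to floating-point error
print(p * (1 - p) / n)           # theta^2 = 0.021
```

Because the sum runs over the full support of $S$, this is an exact (not Monte Carlo) verification, limited only by floating-point rounding.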