Let $X_1, X_2, \ldots, X_N$ be an i.i.d. random sample of Bernoulli random variables, with $\mathbb{P}(X_i = 1) = p$ and $\mathbb{P}(X_i = 0) = 1 - p$.
I'm confused as to why $1 - \bar{X}$ is an unbiased estimator of $1 - p$ ... My colleague said it had to do with Jensen's inequality, but I cannot see the connection there.
Any help in understanding this would be appreciated :)
It has nothing to do with Jensen's inequality. We need to show that $E(1-\bar{X})=1-p$. By the linearity of expectation, $E(1-\bar{X})=1-E(\bar{X})$. And again by the linearity of expectation $$E(\bar{X})=E\left(\frac{X_1+\cdots+X_N}{N}\right)=\frac{1}{N}\left(E(X_1)+\cdots+E(X_N)\right)=p,$$ since $E(X_i)=(1)(p)+(0)(1-p)=p$. Jensen's inequality would only come into play for a *nonlinear* function of $\bar{X}$ (e.g. $\bar{X}^2$ or $\log\bar{X}$); here $1-\bar{X}$ is affine, so the expectation passes through exactly.
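If it helps to see this numerically, here's a quick simulation sketch (the function name and parameter values are my own choices, not from the question): averaging the estimator $1-\bar{X}$ over many simulated samples should land close to $1-p$.

```python
import random

def average_estimator(p, N, trials=100_000, seed=0):
    """Average of the estimator 1 - Xbar over many simulated Bernoulli samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Each sample: N Bernoulli(p) draws; rng.random() < p is 1 with probability p
        xbar = sum(rng.random() < p for _ in range(N)) / N
        total += 1 - xbar
    return total / trials

# With p = 0.3 and N = 10, the average of 1 - Xbar should be near 1 - p = 0.7
print(average_estimator(0.3, N=10))
```

The sample-to-sample values of $1-\bar{X}$ vary, of course; unbiasedness only says their *expectation* equals $1-p$.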