Given $X \sim B(n, p)$, we know that $\hat{p} = X / n$ is the natural estimator of the unknown parameter $p$, and the quantity $$\frac{\hat{p}(1-\hat{p})}{n-1}$$ is an unbiased estimator of the mean squared error of $\hat{p}$: its expected value equals $\mathbb{E}\big\{(\hat{p}-p)^2\big\} = p(1-p)/n$.
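For concreteness, here is a quick Monte Carlo sanity check of that unbiasedness claim (the values $n = 10$, $p = 0.3$ are just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 10, 0.3, 1_000_000

# Draw X ~ B(n, p) many times and form the estimator p_hat = X / n.
x = rng.binomial(n, p, size=trials)
p_hat = x / n

# The average of p_hat * (1 - p_hat) / (n - 1) should match the
# true MSE of p_hat, which is p * (1 - p) / n.
estimate = np.mean(p_hat * (1 - p_hat) / (n - 1))
true_mse = p * (1 - p) / n
print(estimate, true_mse)  # both close to 0.021
```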
My question is: does a similar quantity exist for the absolute error? If not, why not? To be specific, I am looking for a function $f$ such that $$\mathbb{E}\big\{~f(n, \hat{p})~\big\} = \mathbb{E}\big\{~\big|~\hat{p}-p~\big|~\big\}. $$
So far, all I know is that when $n$ is large, $\hat{p}$ is approximately Gaussian, and for a Gaussian the mean absolute error and the standard deviation differ by a constant factor: $\mathbb{E}\,|\hat{p}-p| \approx \sqrt{2/\pi}\,\sigma$. Can anything be said for small $n$? My rough intuition is that the mean absolute error involves higher-order moments (perhaps infinitely many?), so that no unbiased estimator exists for finite sample size. Is that roughly right? Thanks!
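To illustrate how well (or poorly) the Gaussian factor $\sqrt{2/\pi}$ works at small $n$, here is a small simulation (the choice $p = 0.3$ and the grid of $n$ values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
p, trials = 0.3, 1_000_000

for n in (5, 20, 200):
    # Monte Carlo estimate of the mean absolute error E|p_hat - p|.
    p_hat = rng.binomial(n, p, size=trials) / n
    mae = np.mean(np.abs(p_hat - p))

    # Normal approximation: sqrt(2/pi) times the sd of p_hat.
    normal_approx = np.sqrt(2 / np.pi) * np.sqrt(p * (1 - p) / n)
    print(n, mae, normal_approx)
```

The discrepancy between the two columns shrinks as $n$ grows, which matches the asymptotic argument above but leaves the small-$n$ question open.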