$X_1, X_2, \cdots, X_n$ : i.i.d. $\sim \text{Bernoulli}(p)$. Then $\bar{x}$ is an unbiased estimator of $p$.


Let $X_1, X_2, \cdots, X_n$ be i.i.d. $\sim \text{Bernoulli}(p)$. Show that $\bar{X}$ is an unbiased estimator of $p$.

How should I approach this type of problem? Any hint would also help me.


3 Answers

Accepted answer:

You know that $E(X_i) = p$ for every $i$.

So,

$$E(\bar X) = E\left(\frac{\sum_{i=1}^n X_i}{n}\right) = \frac{1}{n}\,E\left(\sum_{i=1}^n X_i\right) = \frac{1}{n}\sum_{i=1}^n E(X_i) = \frac{1}{n}(np) = p.$$

By the definition of an unbiased estimator, $\bar X$ is an unbiased estimator of $p$.

Answer:

Use the fact that the $X_i$ are identically distributed, together with the linearity of expectation:

$E(\bar{X}) = \frac{1}{n}\sum_{i=1}^n E(X_i) = E(X_1) = p$

Answer:

If you define $\bar{x}$ as the sample mean, i.e.
$$\bar{x} = \frac{1}{n}\sum_{i=1}^n X_i,$$
take expectations on both sides:
$$E\bar{x} = E\left(\frac{1}{n}\sum_{i=1}^n X_i\right).$$
Since expectation is a linear operator,
$$E\bar{x} = \frac{1}{n}\sum_{i=1}^n E X_i.$$
But since $E X_i = p$,
$$E\bar{x} = \frac{1}{n}\sum_{i=1}^n p.$$
Since $p$ is a constant, we can pull it out of the sum:
$$E\bar{x} = \frac{p}{n}\sum_{i=1}^n 1.$$
Now since $\sum_{i=1}^n 1 = n$, we get
$$E\bar{x} = \frac{p}{n}\,n = p = \mu,$$
where $\mu = p$ is the true mean of the Bernoulli distribution. Hence $\bar{x}$ is an unbiased estimator of $p$, since on average it gives us $p$.
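As a quick numerical sanity check (not part of the proof, and using hypothetical helper names), a sketch in Python can simulate many Bernoulli($p$) samples of size $n$ and average their sample means; by the derivation above, that average should be close to $p$:

```python
import random

def sample_mean(n, p):
    """Sample mean of n i.i.d. Bernoulli(p) draws."""
    return sum(1 if random.random() < p else 0 for _ in range(n)) / n

def mean_of_sample_means(trials, n, p):
    """Average of `trials` independent sample means.

    This approximates E(x-bar), which the proof shows equals p.
    """
    return sum(sample_mean(n, p) for _ in range(trials)) / trials

if __name__ == "__main__":
    random.seed(0)
    p = 0.3
    est = mean_of_sample_means(trials=20_000, n=10, p=p)
    print(f"average of sample means: {est:.4f} (true p = {p})")
```

With 20,000 trials the standard error of the averaged estimate is on the order of $10^{-3}$, so the printed value should agree with $p$ to about two decimal places; this illustrates unbiasedness empirically but of course does not replace the expectation argument.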