proving unbiasedness of an estimator


Question: Given independent random variables $X_{1},X_{2},\ldots,X_{n}$ from a geometric distribution with parameter $p$, we have an estimator for $p$, namely $T=Y/n$, where $Y$ is the number of indices $i$ such that $X_{i}=1$. Show that $T$ is an unbiased estimator.

I know that an estimator $T$ is unbiased when $E[T]=\theta$, where $\theta$ is the parameter of interest; here the parameter of interest is $p$. An answer is given for this exercise, but it is very unclear to me, and I would appreciate it if somebody could clarify it.

Here is the solution:

$nT$ has a binomial distribution with parameters $n$ and $p$. The expectation of $nT$ is thus $np$, so $T$ is unbiased.

Best answer:

To check whether your estimator is unbiased, you want to, as you said, compute $\mathbb{E}[T]$. In your case that is $$ \mathbb{E}[T] = \mathbb{E}[Y/n] = \frac{1}{n}\mathbb{E}[Y]. $$ Now $Y$ is the number of $i$ such that $X_i=1$, or, equivalently, $Y=\sum_{i=1}^n\mathbb{1}_{X_i=1}$, where $\mathbb{1}_E$ denotes the indicator function of an event $E$, i.e. $$ \mathbb{1}_{E}=\begin{cases}1 & \text{if $E$ occurs,} \\ 0 & \text{if $E$ doesn't occur.}\end{cases} $$

This implies, since the expectation of a sum is the sum of expectations (linearity), that $$ \frac{1}{n}\mathbb{E}[Y] = \frac1n\mathbb{E}\left[\sum_{i=1}^n\mathbb{1}_{X_i=1}\right] = \frac1n\sum_{i=1}^n\mathbb{E}[\mathbb{1}_{X_i=1}]. $$ It remains to compute $\mathbb{E}[\mathbb{1}_{X_i=1}]$, which is the same for each $i$ since the $X_i$ all have the same distribution.

The probability distribution of the random variable $\mathbb{1}_{X_i=1}$ is $$ \mathbb{P}(\mathbb{1}_{X_i=1}=0) = \mathbb{P}(X_i\neq1),\quad \mathbb{P}(\mathbb{1}_{X_i=1}=1) = \mathbb{P}(X_i=1). $$

To proceed from here, you will have to specify what version of the geometric distribution you are working with: Is it supported on $\{0,1,2,3,\ldots\}$ or $\{1,2,3,\ldots\}$?

Assuming the latter, it follows that $\mathbb{P}(X_i\neq1)=1-p$ and $\mathbb{P}(X_i=1)=p$ and thus $$ \mathbb{E}[\mathbb{1}_{X_i=1}] = 0\times(1-p) + 1\times p=p $$ and $$ \mathbb{E}[T]=\frac1n\sum_{i=1}^np=\frac1n np = p. $$
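As a sanity check on the computation above, here is a short Monte Carlo sketch (not part of the original answer). It uses NumPy's geometric distribution, which is supported on $\{1,2,3,\ldots\}$, matching the "latter" convention, so $\mathbb{P}(X_i=1)=p$; the values of $p$, $n$, and the number of trials are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 50, 20000  # illustrative values, not from the problem

# Each row is one sample X_1, ..., X_n; numpy's geometric is
# supported on {1, 2, 3, ...}, so P(X_i = 1) = p.
samples = rng.geometric(p, size=(trials, n))

# T = Y/n, where Y counts how many of the X_i equal 1.
T = (samples == 1).sum(axis=1) / n

# Averaging T over many trials approximates E[T], which should be near p.
print(T.mean())
```

The average of $T$ over the trials should land close to $p=0.3$, in line with $\mathbb{E}[T]=p$.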

This shows that your estimator is unbiased.

To clarify the answer that you were given, observe that the $\mathbb{1}_{X_i=1}$ are independent Bernoulli random variables with parameter $p$. Thus $nT$ is the sum of $n$ independent Bernoulli random variables, and such a sum is known to follow a binomial distribution with parameters $n$ and $p$. A binomial distribution $\operatorname{Bin}(n,p)$ indeed has expectation $np$, and thus $$ \mathbb{E}[nT] = np, $$ or, by the linearity of $\mathbb{E}[\cdot]$, $\mathbb{E}[T]=p$.
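The binomial expectation can also be verified exactly, rather than by simulation, by summing the $\operatorname{Bin}(n,p)$ pmf directly. The following sketch (with arbitrary illustrative values of $n$ and $p$) computes $\mathbb{E}[nT]=\sum_{k=0}^{n} k\binom{n}{k}p^k(1-p)^{n-k}$ and confirms it equals $np$, hence $\mathbb{E}[T]=p$:

```python
from math import comb

n, p = 10, 0.25  # illustrative values, not from the problem

# E[nT] for nT ~ Bin(n, p), computed term by term from the pmf:
# E[nT] = sum_k k * C(n, k) * p^k * (1-p)^(n-k)
mean_nT = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

print(mean_nT)      # equals n*p = 2.5, up to floating point
print(mean_nT / n)  # E[T] = p = 0.25
```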