Is $Y_n=(1-\frac{a}{n})^{X_1+\cdots+X_n}$ an unbiased estimator?


Let $X_1, X_2, \ldots$ be i.i.d. and have the Poisson distribution with parameter $\lambda$. We define the estimator of $y=e^{-a\lambda}$ (where $a \neq 0$ is some constant) as: $$Y_n= \left( 1-\frac a n \right)^{\sum\limits_{i=1}^nX_i}$$ Is it an unbiased estimator of $y$?

So $E(X_i) = \lambda$. I'm trying to calculate the bias with:

$$E(Y_n)=E\left((1-\frac a n)^{\sum\limits_{i=1}^nX_i}\right) = E\left( (1-\frac{a}{n})^{X_1} \cdots (1-\frac{a}{n})^{X_n} \right)$$

which, since $X_i$ are i.i.d., equals:

$$=\left( E\left((1-\frac{a}{n})^{X_1}\right)\right)^n$$

I wanted to go further by introducing an additional random variable $Z_i = (1-\frac{a}{n})^{X_i}$, calculating its expected value, and then plugging it into the formula above, but I have problems doing so. How should I approach this? $E(Z_i)=\sum\limits_{x=0}^{\infty}(1-\frac{a}{n})^{x}\cdot \frac{\lambda^x e^{-\lambda}}{x!}$ seems awful to calculate, and I'm not even sure it is the correct way.


Accepted answer:

$$E\left(\left(1-\frac{a}{n}\right)^X\right) = \sum_{x=0}^{\infty} \left(1-\frac{a}{n}\right)^x e^{-\lambda}\frac{\lambda^x}{x!} = e^{-\lambda} e^{\lambda\left(1-\frac{a}{n}\right)} = e^{\frac{-a\lambda}{n}}$$

$$E(Y_n) = \left(E\left(\left(1-\frac{a}{n}\right)^X\right)\right)^n = e^{\frac{-na\lambda}{n}} = e^{-a\lambda}$$
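The two steps above can be sanity-checked numerically. The sketch below (a Monte Carlo check, with illustrative parameter values I chose, not taken from the question) averages $Y_n$ over many i.i.d. replications and compares against $e^{-a\lambda}$; it uses only the standard library, sampling Poisson variates by inverting the CDF.

```python
import math
import random

def poisson_sample(lam, rng):
    # Inverse-CDF sampling for Poisson(lam), stdlib only.
    u = rng.random()
    k = 0
    p = math.exp(-lam)   # P(X = 0)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k     # P(X = k) from P(X = k-1)
        cdf += p
    return k

def mc_mean_Yn(lam, a, n, trials, seed=0):
    # Average of Y_n = (1 - a/n)^(X_1 + ... + X_n) over i.i.d. replications.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(poisson_sample(lam, rng) for _ in range(n))
        total += (1 - a / n) ** s
    return total / trials

lam, a, n = 1.5, 2.0, 10   # illustrative values
est = mc_mean_Yn(lam, a, n, trials=100_000)
# est should land close to exp(-a*lam) = exp(-3) ≈ 0.0498
```

With $10^5$ replications the simulated mean matches $e^{-a\lambda}$ to a few decimal places, consistent with unbiasedness for every fixed $n$.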

Another answer:

Let $\theta:=\lambda(1-\frac{a}{n})$. Then $$ \mathbf{E}[Z_1]=\sum_{x\ge 0}\left(1-\frac{a}{n}\right)^x\frac{\lambda^x e^{-\lambda}}{x!}=\sum_{x\ge 0}\frac{\theta^x e^{-\lambda}}{x!}=e^{\theta-\lambda}=e^{-\lambda \frac{a}{n}}. $$ Therefore $\mathbf{E}[Y_n]=\mathbf{E}[Z_1]^n=e^{-a\lambda}$.

Another answer:

The expectation can be computed by brute force: $$ \mathbb E\left[\left(1-\frac{a}{n}\right)^{X_1}\right] = \sum_{t=0}^{\infty}{\left(1-\frac{a}{n}\right)^{t}\frac{\lambda^te^{-\lambda}}{t!}} = e^{-\lambda}\sum_{t=0}^{\infty}{\frac{(\lambda(1-a/n))^t}{t!}} = e^{-\lambda}e^{(1-a/n)\lambda} = e^{-a\lambda/n}. $$ Subsequently, $\mathbb E[Y_n] = e^{-a\lambda}$, so $Y_n$ is unbiased for $y=e^{-a\lambda}$.

Another answer:

$\newcommand{\E}{\operatorname{E}}$Here it is for $a=1.$

First notice that $\Pr(X_1=0) = e^{-\lambda},$ so the statistic $J$ that is equal to $1$ if $X_1=0$ and to $0$ otherwise is an unbiased estimator of $e^{-\lambda}.$

Now observe that

  • the conditional distribution of $X_1,\ldots,X_n$ given $X_1+\cdots+X_n$ does not depend on $\lambda;$

  • therefore the conditional expected value of $J$ given $X_1+\cdots+X_n$ does not depend on $\lambda,$ and can therefore be used as an estimator;

  • and the law of total expectation says that the conditional expected value of $J$ given $X_1+\cdots+X_n$ has the same expected value that $J$ has, i.e. $e^{-\lambda},$ so that is an unbiased estimator of $e^{-\lambda}.$

So now the problem is just to compute $$\E(J\mid X_1+\cdots+X_n).$$ We have

\begin{align} \E(J\mid X_1+\cdots+X_n=s) & = \Pr(X_1=0\mid X_1+\cdots+X_n=s) \\[10pt] & = \frac{\Pr(X_1=0\ \&\ X_1+\cdots+X_n=s)}{\Pr(X_1+\cdots+X_n=s)} \\[10pt] & = \frac{\Pr(X_1=0)\Pr(X_2+\cdots+X_n=s)}{e^{-n\lambda} (n\lambda)^s/s!} \\[10pt] & = \frac{e^{-\lambda}\cdot e^{-(n-1)\lambda} ((n-1)\lambda)^s/s! }{e^{-n\lambda} (n\lambda)^s/s!} \\[10pt] & = \left( 1 - \frac 1 n \right)^s. \end{align}

Therefore $$ \E\left( \left( 1 - \frac 1 n \right)^{X_1+\cdots+X_n} \right) = \E(\E(J\mid X_1+\cdots+X_n)) = \E(J) = e^{-\lambda}. $$
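The key cancellation in the display above, that the ratio of Poisson pmfs collapses to $(1-\frac 1 n)^s$ with no dependence on $\lambda$, can be checked directly. The sketch below (with parameter values I picked for illustration) computes $\Pr(X_1=0\mid X_1+\cdots+X_n=s)$ from the pmfs of $X_1$, $X_2+\cdots+X_n\sim\text{Poisson}((n-1)\lambda)$, and $S\sim\text{Poisson}(n\lambda)$:

```python
import math

def pois_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam).
    return math.exp(-lam) * lam**k / math.factorial(k)

def cond_prob_X1_zero(s, n, lam):
    # P(X_1 = 0 | X_1 + ... + X_n = s) as a ratio of Poisson pmfs,
    # using X_2 + ... + X_n ~ Poisson((n-1)*lam) and S ~ Poisson(n*lam).
    return pois_pmf(0, lam) * pois_pmf(s, (n - 1) * lam) / pois_pmf(s, n * lam)

n, lam = 5, 0.7   # illustrative values
for s in range(10):
    # The ratio equals (1 - 1/n)^s for every s, independently of lam.
    assert abs(cond_prob_X1_zero(s, n, lam) - (1 - 1 / n) ** s) < 1e-12
```

Rerunning the loop with a different $\lambda$ gives the same values, which is exactly why the conditional expectation is usable as an estimator.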

I don't know whether you already knew the above; I'm guessing maybe you did all that and then conjectured something similar would work for $e^{-a\lambda}.$ If not, then maybe the above is worth something to you.