Deriving Variance for a General Poisson Random Variable


Suppose I wanted to prove, using infinite series, that the variance of a Poisson random variable $X$ with parameter $\mu$ is $\hbox{Var}[X] = \mu$. I'm having some trouble constructing this proof with a change of variables, and I think I must be missing a simple power-series expansion somewhere.

Here's what I have thus far. Let's take the expected value, $\mu$, for granted (I had no trouble proving this). The real issue is the second moment, $\hbox{E}[X^2]$. If I could piece this together, the proof would reduce to simple algebra via the standard formula $\hbox{Var}[X] = \hbox{E}[X^2] - \Big(\hbox{E}[X]\Big)^2$.

\begin{align*} \hbox{E}[X^2] & = \sum\limits_{x=1}^{\infty} x^2 \cdot P[X = x] \\ & = \sum\limits_{x=1}^{\infty} x^2 \cdot \frac{e^{-\mu} \cdot \mu^x}{x!} \\ & = \sum\limits_{x=1}^{\infty} x \cdot \frac{e^{-\mu} \cdot \mu^x}{(x-1)!} \\ & = e^{-\mu} \sum\limits_{x=1}^{\infty} x \cdot \frac{\mu^x}{(x-1)!} \end{align*}

Let's take $m = x - 1$.

\begin{align*} e^{-\mu} \sum\limits_{m=0}^{\infty} (m+1) \cdot \frac{\mu^{m+1}}{m!} & = \mu e^{-\mu} \sum\limits_{m=0}^{\infty} (m+1) \cdot \frac{\mu^{m}}{m!} \\ & = \mu e^{-\mu} \Bigg[\sum\limits_{m=0}^{\infty} \Bigg(\frac{m \mu^{m}}{m!} + \frac{\mu^{m}}{m!}\Bigg)\Bigg] \\ & = \mu e^{-\mu} \Bigg[\sum\limits_{m=0}^{\infty} \frac{m \mu^{m}}{m!} + \sum\limits_{m=0}^{\infty}\frac{\mu^{m}}{m!}\Bigg] \\ & = \mu e^{-\mu} \Bigg[\sum\limits_{m=0}^{\infty} \frac{\mu^{m}}{(m-1)!} + e^{-\mu}\Bigg] \end{align*}

This is the point at which I get stuck. I don't know of a power series with this representation, so I don't quite know how to evaluate $\sum\limits_{m=0}^{\infty} \frac{\mu^{m}}{(m-1)!}$. I've worked this out a few times, though it's quite possible that I've made an error somewhere.

I'd appreciate any help on this. My goal isn't necessarily to find an expression for the second moment (I know what it should look like, since I can back it out from the variance and the expectation) but rather to prove it with a method similar to what I've done above.


There are 2 solutions below.

First solution:

You have a couple of minor issues and your final line should be $$E[X^2]=\mu e^{-\mu} \Bigg[\sum\limits_{m=1}^{\infty} \frac{\mu^{m}}{(m-1)!} + e^{\mu}\Bigg]$$

which, letting $n=m-1$, you can write as $$=\mu e^{-\mu} \Bigg[ \mu \sum\limits_{n=0}^{\infty} \dfrac{\mu^{n}}{n!}+ e^{\mu}\Bigg] = \mu e^{-\mu}[\mu e^{\mu}+e^{\mu}]=\mu^2+\mu$$

from which you can subtract $\Big(\hbox{E}[X]\Big)^2 = \mu^2$ to get the variance: $\hbox{Var}[X] = \mu$.

Second solution:

Note that $$ \begin{align} \mu e^{-\mu} \sum\limits_{m=0}^{\infty} (m+1) \cdot \frac{\mu^{m}}{m!} &= \mu e^{-\mu} \frac{d}{d\mu}\left( \sum_{m=0}^\infty\frac{\mu^{m+1}}{m!} \right)\\ &=\mu e^{-\mu} \frac{d}{d\mu}(\mu e^\mu)\\ &=\mu e^{-\mu}(\mu+1)e^\mu\\ &=\mu^2+\mu. \end{align} $$
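The differentiation trick above can be checked symbolically with `sympy` (assumed available). The inner sum $\sum_{m\ge 0}\mu^{m+1}/m! = \mu e^\mu$ is built from the exponential series, then differentiated as in the derivation:

```python
import sympy as sp

mu = sp.symbols('mu', positive=True)
m = sp.symbols('m', integer=True, nonnegative=True)

# sum_{m>=0} mu^m / m! = e^mu, so the inner sum is mu * e^mu.
inner = mu * sp.summation(mu**m / sp.factorial(m), (m, 0, sp.oo))

# mu * e^(-mu) * d/dmu (mu * e^mu) should equal mu^2 + mu.
result = sp.expand(sp.simplify(mu * sp.exp(-mu) * sp.diff(inner, mu)))

print(result)  # mu**2 + mu
```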

Another way to prove the result. Let $X\sim \text{Poi}(\mu)$. Consider the probability generating function $$ g(t)=\sum_{k=0}^\infty P(X=k)t^k=e^{-\mu}\sum_{k=0}^\infty \frac{(t\mu)^k}{k!} =e^{-\mu}e^{t\mu}=\exp(\mu(t-1))\tag{0} $$ and observe that $$ g''(t)=\sum_{k=0}^\infty P(X=k)k(k-1)t^{k-2}\implies g''(1)=EX(X-1). $$ Differentiating equation $(0)$ twice gives $g''(t)=\mu^2\exp(\mu(t-1))$, so $$ g''(1)=EX(X-1)=EX^2-EX=\mu^2 $$ whence $$ EX^2=EX(X-1)+EX=\mu^2+\mu. $$
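This generating-function argument can also be verified symbolically with `sympy` (assumed available); the factorial moment $EX(X-1)$ comes out of $g''(1)$ exactly as above:

```python
import sympy as sp

mu, t = sp.symbols('mu t', positive=True)

# Probability generating function of Poisson(mu): g(t) = exp(mu*(t - 1)).
g = sp.exp(mu * (t - 1))

# g''(1) = E[X(X-1)], the second factorial moment.
factorial_moment = sp.diff(g, t, 2).subs(t, 1)

EX = mu
EX2 = sp.expand(factorial_moment + EX)      # E[X^2] = E[X(X-1)] + E[X]
variance = sp.simplify(EX2 - EX**2)         # Var[X] = E[X^2] - (E[X])^2

print(factorial_moment)  # mu**2
print(EX2)               # mu**2 + mu
print(variance)          # mu
```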