Proving variance of geometric distribution


Of course my textbook leaves it as an exercise... Can someone help walk me through the derivation of the variance of a geometric distribution? Using the book (and lecture), we went through the derivation of the mean as:

$$ E(Y)=\sum_{y=1}^\infty yP(y)=\sum_{y=1}^\infty ypq^{y-1} $$ $$ =p\sum_{y=1}^\infty y(1-p)^{y-1} =-p\sum_{y=1}^\infty\frac{d}{dp}(1-p)^y $$ By some theorem that's apparently outside the scope of our class (interchanging the derivative and the sum): $$ =-p\frac{d}{dp}\sum_{y=1}^\infty (1-p)^y =-p\frac{d}{dp}\left(\frac{1}{1-(1-p)}-1\right) $$ $$ =-p\frac{d}{dp}\left(\frac{1}{p}-1\right)=-p\left(-\frac{1}{p^2}\right) $$ $$ \therefore E(Y)=\frac{1}{p} $$ From there we were given a hint that double derivatives will be needed for the variance.
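(As a quick sanity check of that result, and assuming $Y$ counts trials up to and including the first success, a truncated version of the series can be compared against the closed form $1/p$:)

```python
# Sanity check: truncate E(Y) = sum_{y>=1} y * p * (1-p)^(y-1)
# and compare against the closed form 1/p.
p = 0.3
mean = sum(y * p * (1 - p) ** (y - 1) for y in range(1, 10_000))
print(mean, 1 / p)  # both ≈ 3.3333
```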



Best answer:

Here's a derivation of the variance of a geometric random variable, from the book A First Course in Probability by Sheldon Ross (8th ed.). It makes use of the mean, which you've just derived.

To determine $\operatorname{Var}(X)$, let us first compute $E[X^2]$. With $q = 1 - p$, we have $$ \begin{align} E[X^2] & = \sum_{i=1}^\infty i^2q^{i-1}p \\ & = \sum_{i=1}^\infty (i-1+1)^2q^{i-1}p \\ & = \sum_{i=1}^\infty (i-1)^2q^{i-1}p + \sum_{i=1}^\infty 2(i-1)q^{i-1}p + \sum_{i=1}^\infty q^{i-1}p\\ & = \sum_{j=0}^\infty j^2q^jp + 2\sum_{j=1}^\infty jq^jp + 1 \\ & = qE[X^2] + 2qE[X] + 1 \\ \end{align} $$ Using $E[X] = 1/p$ and $1 - q = p$, the equation for $E[X^2]$ yields $$pE[X^2] = \frac{2q}{p} + 1 $$ Hence, $$E[X^2] = \frac{2q+p}{p^2} = \frac{q+1}{p^2}$$ giving the result $$ \operatorname{Var}(X) = \frac{q+1}{p^2} - \frac{1}{p^2} = \frac{q}{p^2} = \frac{1-p}{p^2} $$
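(A quick numeric check of this result, not from Ross: truncate the series for $E[X]$ and $E[X^2]$ and confirm that $E[X^2] - E[X]^2$ matches $(1-p)/p^2$.)

```python
# Truncate the series for E[X] and E[X^2], then form the variance
# E[X^2] - E[X]^2 and compare against the closed form (1-p)/p^2.
p = 0.25
ex  = sum(i * p * (1 - p) ** (i - 1) for i in range(1, 10_000))
ex2 = sum(i * i * p * (1 - p) ** (i - 1) for i in range(1, 10_000))
print(ex2 - ex**2, (1 - p) / p**2)  # both ≈ 12.0
```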

Another answer:

Oh, yeah... That was miscopied. Here is how it should go.

$$\begin{align} \tag 1\mathsf E(Y) &= \sum_{y} y~\mathsf P_Y(y)&&\text{definition of expectation for a discrete r.v.} \\[1ex]\tag 2 &= \sum_{y=1}^\infty y~p(1-p)^{y-1}&&\text{since }Y\sim\mathcal{Geo}_1(p) \\[1ex]\tag 3 &= p\sum_{z=0}^\infty (z+1)(1-p)^z &&\text{change of variables }z\gets y-1 \\[1ex]\tag 4 &= p\sum_{z=0}^\infty\dfrac{\mathrm d~~}{\mathrm d p}(-(1-p)^{z+1})&&\text{differentiation} \\[1ex]\tag 5 &=p~\dfrac{\mathrm d~~}{\mathrm d p}\sum_{z=0}^\infty\left(-(1-p)^{z+1}\right)&&\text{interchange of sum and derivative} \\[1ex]\tag 6 &=p~\dfrac{\mathrm d~~}{\mathrm d p}\left(-(1-p)\sum_{z=0}^\infty(1-p)^{z}\right)&&\text{algebra} \\[1ex]\tag 7 &=p~\dfrac{\mathrm d~~}{\mathrm d p}\left(\dfrac{-(1-p)}{1-(1-p)}\right)&&\text{geometric series} \\[1ex]\tag 8 &=p~\dfrac{\mathrm d~~}{\mathrm d p}\left(1-p^{-1}\right)&&\text{algebra} \\[1ex]\tag 9 &=p~\cdot~p^{-2}&&\text{differentiation} \\[1ex]\tag {10} &=\dfrac 1{p}&&\text{algebra} \end{align}$$

Also, this is the mean, not the variance.
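(For what it's worth, both the mean $1/p$ and the variance $(1-p)/p^2$ can be cross-checked by simulation. This assumes NumPy's `geometric` sampler, which uses the same trials-until-first-success convention with support $\{1, 2, \dots\}$.)

```python
import numpy as np

# Draw a large geometric sample and compare the empirical mean and
# variance against the closed forms 1/p and (1-p)/p^2.
rng = np.random.default_rng(0)
p = 0.2
draws = rng.geometric(p, size=1_000_000)
print(draws.mean(), 1 / p)           # sample mean ≈ 5
print(draws.var(), (1 - p) / p**2)   # sample variance ≈ 20
```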