I am reading Rabiner's paper "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition". There is a very simple step where he simplifies the calculation of an expectation that I don't follow. I assume this is basic (which worries me), but I just can't seem to reach it :(
$$\begin{aligned} E[d] &= \sum_{d=1}^{\infty} d \cdot P(d) \\ E[d] &= \sum_{d=1}^{\infty} d \cdot a^{d-1}(1-a) \\ E[d] &= \frac{1}{1-a} \end{aligned}$$
In particular, this is equation (6a) in the paper. I have omitted the subscripts of $a$ and $P$ here, since they are not needed. I just want to see how he simplifies the sum.
Thanks!
For a hint, start from the geometric series (note the sum starts at $d = 0$; this is what makes the right-hand side $\frac{1}{1-a}$): \begin{eqnarray*} \sum_{d = 0}^{\infty} a^d & = & \frac{1}{1 - a} \end{eqnarray*} Take derivatives on both sides with respect to $a$ (I will not justify rigorously why term-by-term differentiation is allowed here; for $|a| < 1$ it is): \begin{eqnarray*} \sum_{d = 1}^{\infty} d\,a^{d - 1} & = & \frac{1}{\left( 1 - a \right)^2} \end{eqnarray*} Multiply both sides by $\left( 1 - a \right)$ to conclude $$E[d]=\sum_{d=1}^{\infty} d\,a^{d-1}(1-a)=\frac{1}{1-a}.$$
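As a quick numerical sanity check (not from the paper), you can truncate the series and compare it against the closed form. The choice $a = 0.9$ and the truncation point are arbitrary:

```python
# Truncate E[d] = sum_{d>=1} d * a^(d-1) * (1 - a) and compare with 1 / (1 - a).
# With a = 0.9 the tail beyond d = 10_000 is negligibly small.
a = 0.9
truncated = sum(d * a ** (d - 1) * (1 - a) for d in range(1, 10_000))
closed_form = 1 / (1 - a)
print(truncated, closed_form)  # both approximately 10.0
```

Since the terms decay geometrically, the truncated sum agrees with $\frac{1}{1-a}$ to machine precision well before the cutoff.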