Exercise:
A biased coin has a probability $p$ that it gives a tail when it is tossed. The random variable $T$ is the number of tosses up to and including the second tail.
Show that $\frac{1}{T-1}$ is an unbiased estimator of $p$.
My work so far:
I know that $P(T=t) = (t-1)(1-p)^{t-2}p^2$ for $t \ge 2$.
I know that $E(T) = \frac{2}{p}$.
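As a quick numerical sanity check (not part of the exercise), the two facts above can be verified by summing the pmf directly; the value $p = 0.3$ below is an arbitrary choice:

```python
# Check that P(T=t) = (t-1)(1-p)^(t-2) p^2 sums to 1 and that E(T) = 2/p,
# truncating the infinite sum where the tail is negligible.
p = 0.3
terms = [(t, (t - 1) * (1 - p) ** (t - 2) * p ** 2) for t in range(2, 2000)]
total = sum(pr for _, pr in terms)       # should be ~1
mean = sum(t * pr for t, pr in terms)    # should be ~2/p
print(total, mean)
```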
I found a similar question at Finding an unbiased estimator for the negative binomial distribution, but I don't understand the first line (!) of the solution, which states:
$$E\left(\frac{r-1}{Y+r-1}\right)=\sum_{y=0}^\infty \frac{r-1}{y+r-1}\binom{y+r-1}{y} \theta^r(1-\theta)^y$$
Can someone please explain where the above expectation expression comes from?
Many thanks!
\begin{align} & \operatorname E\left( \frac 1 {T-1} \right) = \sum_{t=2}^\infty \frac 1 {t-1} \Pr(T=t) = \sum_{t=2}^\infty \frac 1 {t-1} (t-1)(1-p)^{t-2}p^2 \\[10pt] = {} & p^2 \sum_{t=2}^\infty (1-p)^{t-2} = p^2 \times \text{sum of a geometric series} = p^2 \times \frac{\text{first term}}{1 - \text{common ratio}} \\[10pt] = {} & p^2 \times \frac 1 {1-(1-p)} = p^2 \times \frac 1 p = p, \end{align} so $\frac{1}{T-1}$ is indeed an unbiased estimator of $p$.
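The unbiasedness can also be confirmed by Monte Carlo simulation; the probability $p = 0.3$, the seed, and the sample size below are arbitrary choices for illustration:

```python
import random

def simulate_T(p, rng):
    """Toss a coin that shows tail with probability p until the second tail;
    return the total number of tosses (a draw of the random variable T)."""
    tosses, tails = 0, 0
    while tails < 2:
        tosses += 1
        if rng.random() < p:
            tails += 1
    return tosses

rng = random.Random(0)
p = 0.3
n = 200_000
# Sample mean of 1/(T-1) should be close to p if the estimator is unbiased.
est = sum(1 / (simulate_T(p, rng) - 1) for _ in range(n)) / n
print(est)  # close to 0.3
```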