Expectation of the number of times a coin is thrown until the appearance of a second "tail"


$X$ is the random variable that counts the number of times a coin is thrown until the second "tail" appears, where the probability of "tails" on a single toss is $p$. Find the expectation of $X$.

I know that

$$P\{X=k\}=\binom{k-1}{1}p^2(1-p)^{k-2}$$

I just do not know how to calculate the expectation for this.
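Before computing anything, the pmf itself can be sanity-checked against simulation. The sketch below is not part of the question; it compares the empirical frequency of $X=4$ with $\binom{3}{1}p^2(1-p)^2$ for the arbitrary choice $p=0.3$ (the helper `tosses_until_second_tail` is hypothetical):

```python
import random
from math import comb

def tosses_until_second_tail(p, rng):
    """Toss a p-biased coin until the second tail; return the toss count."""
    tails = 0
    k = 0
    while tails < 2:
        k += 1
        if rng.random() < p:
            tails += 1
    return k

p = 0.3  # arbitrary choice for the check
rng = random.Random(0)
n = 200_000
samples = [tosses_until_second_tail(p, rng) for _ in range(n)]

# Empirical P(X = 4) versus the closed form C(3,1) p^2 (1-p)^2.
emp = samples.count(4) / n
exact = comb(3, 1) * p**2 * (1 - p) ** 2
print(emp, exact)  # the two values should agree closely
```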


BEST ANSWER

From your expression for $\Pr(X=k)$ we have $$E(X)=\sum_{k=2}^\infty k(k-1)p^2(1-p)^{k-2}.$$ Let $$f(x)=\frac{1}{1-x}=1+x+x^2+x^3+x^4+\cdots.$$ Differentiating twice, we get $$f''(x)=\frac{2}{(1-x)^3}=\sum_{k=2}^\infty k(k-1)x^{k-2}.$$ Let $x=1-p$ and multiply the result by $p^2$.
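The series argument above is easy to confirm numerically: truncating $\sum_{k\ge 2} k(k-1)p^2(1-p)^{k-2}$ at a large $k$ should reproduce $2/p$. A minimal check, assuming the arbitrary value $p=0.3$:

```python
# Truncate the series E(X) = sum_{k>=2} k(k-1) p^2 (1-p)^{k-2};
# the geometric factor (1-p)^(k-2) makes the tail negligible.
p = 0.3
expectation = sum(k * (k - 1) * p**2 * (1 - p) ** (k - 2) for k in range(2, 2000))
print(expectation, 2 / p)  # the truncated sum should match 2/p
```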

Remark: If you already know the mean of the geometric distribution, here is a simple way. Let $U$ be the number of tosses until the first tail, and let $V$ be the number of tosses between the first tail and the second. Then $X=U+V$, where $U$ and $V$ have geometric distribution. The mean of each of $U$ and $V$ is $\frac{1}{p}$, so the mean of $X$ is $\frac{2}{p}$.
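The remark's decomposition $X = U + V$ can also be simulated directly: draw two independent geometric waiting times and average their sum. A sketch, again with the arbitrary choice $p=0.3$:

```python
import random

def geometric(p, rng):
    """Number of tosses until the first tail (support 1, 2, 3, ...)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

p = 0.3
rng = random.Random(1)
n = 100_000
# X = U + V: tosses to the first tail, plus tosses from there to the second.
mean_x = sum(geometric(p, rng) + geometric(p, rng) for _ in range(n)) / n
print(mean_x)  # should be close to 2/p
```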


Since André has given the complete answer, I will finish mine. This relates to the comment by A.S.

The expectation would be $$ \begin{align} \sum_{k=2}^\infty k\binom{k-1}{1}p^2(1-p)^{k-2} &=2p^2\sum_{k=2}^\infty\binom{k}{2}(1-p)^{k-2}\\ &=2p^2\sum_{k=2}^\infty\binom{k}{k-2}(1-p)^{k-2}\\ &=2p^2\sum_{k=2}^\infty(-1)^{k-2}\binom{-3}{k-2}(1-p)^{k-2}\\ &=2p^2\sum_{k=0}^\infty(-1)^k\binom{-3}{k}(1-p)^k\\ &=2p^2\frac1{(1-(1-p))^3}\\ &=\frac2p \end{align} $$ where $\binom{-3}{k-2}$ is a negative binomial coefficient.
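The only non-obvious step in this chain is the negative binomial identity $\binom{k}{k-2}=(-1)^{k-2}\binom{-3}{k-2}$. Python's `math.comb` rejects negative arguments, so the check below uses a small hypothetical helper `neg_binom` implementing the generalized coefficient $a(a-1)\cdots(a-k+1)/k!$:

```python
from math import comb, factorial

def neg_binom(a, k):
    """Generalized binomial coefficient a(a-1)...(a-k+1)/k! for integer a."""
    num = 1
    for i in range(k):
        num *= a - i
    return num // factorial(k)  # exact: binomial coefficients are integers

# Check the identity C(k, k-2) = (-1)^(k-2) C(-3, k-2) used in the derivation.
for k in range(2, 12):
    assert comb(k, k - 2) == (-1) ** (k - 2) * neg_binom(-3, k - 2)
print("identity verified for k = 2..11")
```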


Let $\mu_{n}$ denote the expectation of the number of times a coin is thrown until the appearance of an $n$-th tail.

Then you are looking for $\mu_{2}$.

Let $E$ denote the event that the first throw is a tail.

Then $\mu_0=0$ and $\mu_{n}=\left(1+\mu_{n-1}\right)P\left(E\right)+\left(1+\mu_{n}\right)P\left(E^{c}\right)$ for $n>0$, so that:

$\mu_{1}=1P\left(E\right)+\left(1+\mu_{1}\right)P\left(E^{c}\right)=1+\mu_{1}\left(1-p\right)$ implying that $\mu_{1}=\frac{1}{p}$.

and:

$\mu_{2}=\left(1+\mu_{1}\right)P\left(E\right)+\left(1+\mu_{2}\right)P\left(E^{c}\right)=\left(1+\frac{1}{p}\right)p+\left(1+\mu_{2}\right)\left(1-p\right)$ implying that $\mu_{2}=\frac{2}{p}$.

By induction it can be shown that $\mu_n=\frac{n}{p}$.
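The recursion can be iterated mechanically: solving $\mu_{n}=\left(1+\mu_{n-1}\right)p+\left(1+\mu_{n}\right)\left(1-p\right)$ for $\mu_n$ at each step reproduces $\frac{n}{p}$. A minimal sketch (the function name `mu` and the value $p=0.3$ are arbitrary):

```python
def mu(n, p):
    """Iterate mu_0 = 0, mu_n = (1 + mu_{n-1}) p + (1 + mu_n)(1 - p),
    solving each step for mu_n."""
    m = 0.0  # mu_0 = 0
    for _ in range(n):
        # Rearranging gives mu_n = ((1 + m) p + (1 - p)) / p, i.e. m + 1/p.
        m = ((1 + m) * p + (1 - p)) / p
    return m

p = 0.3
for n in range(1, 6):
    print(n, mu(n, p), n / p)  # the recursion reproduces n/p
```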