Expected number of draws


A bag contains $3$ red balls and $3$ black balls. If the drawn ball is red, it is put back into the bag. If it is black, then we stop there.

How to calculate expected number of draws to get a black ball?

Edit: I'm not able to come up with a suitable approach. I'm a beginner in probability theory. I tried to apply Bernoulli trials for this but it's not apt, I guess.


There are 3 best solutions below


Let $R$ be the event of drawing a red ball and $B$ the event of drawing a black ball. Since the bag holds $3$ of each, $P(R)=P(B)=\frac{1}{2}$.

Now, the probability of stopping at the first draw is $P(B)$; the probability of stopping at the second draw is $P(R)\,P(B)$ (where $P(R)$ is also the probability of reaching the second draw at all); and so on. Thus the expected number of draws is $$E(\text{draws})=P(B)+2P(R)P(B)+3P(R)^2P(B)+\cdots$$ $$=\frac{1}{2}+2\cdot\frac{1}{2^2}+ 3\cdot\frac{1}{2^3}+\cdots$$

Now, $\frac{x}{(1-x)^2}=x+2x^2+3x^3+\cdots$ for $|x|<1$, so with $x=\frac{1}{2}$ the previous summation reduces to $\frac{0.5}{0.5^2}=2$.
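As a quick numerical sanity check on that identity (a minimal Python sketch, not part of the original argument), compare a truncated partial sum of $\sum_{k\ge1}kx^k$ at $x=\tfrac12$ with the closed form:

```python
x = 0.5
# Terms beyond k = 60 contribute less than 1e-15 at x = 0.5.
partial = sum(k * x**k for k in range(1, 61))
closed = x / (1 - x) ** 2
print(partial, closed)  # both are approximately 2.0
```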

Thus the expected number of draws is $2$.
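The answer can also be checked by simulation. Below is a hedged sketch (the helper name `draws_until_black` is mine, not from the post) that repeats the draw-with-replacement process many times and averages the number of draws:

```python
import random

def draws_until_black(n_red=3, n_black=3):
    """Simulate one run: draw with replacement on red, stop on black."""
    draws = 0
    while True:
        draws += 1
        # Each draw is black with probability n_black / (n_red + n_black).
        if random.random() < n_black / (n_red + n_black):
            return draws

random.seed(0)
trials = 100_000
mean_draws = sum(draws_until_black() for _ in range(trials)) / trials
print(round(mean_draws, 2))  # close to 2
```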


The following trick is generally extremely useful. Let $E$ be the expected number of draws until a black ball is drawn. If the black ball is picked on the first draw, we are done. Otherwise we are back in the initial state with one draw lost. Thus: $$ E=p_B\cdot1+p_R\cdot(E+1)=1+\frac{1}{2}E \Rightarrow E=2. $$

The full power of the method shows on a more complicated problem. Suppose you are asked for the expected number of draws until two black balls appear one after the other (drawing with replacement throughout).

Then (first draw red: restart with $1$ draw lost; black then red: restart with $2$ draws lost; two blacks: done in $2$ draws): $$ E=p_R(E+1)+p_Bp_R(E+2)+p_B^2\cdot2 \Rightarrow E=\frac{2p_B^2+2p_Bp_R+p_R}{1-p_R-p_Bp_R} \Big|_{p_B=p_R=\frac{1}{2}}=6. $$
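To check the value $6$ empirically, one can simulate the two-blacks-in-a-row variant. This sketch (the helper name `draws_until_two_blacks` is hypothetical) counts draws until two consecutive blacks occur:

```python
import random

def draws_until_two_blacks(p_black=0.5):
    """Draw (with replacement) until two black balls appear consecutively."""
    draws, streak = 0, 0
    while streak < 2:
        draws += 1
        if random.random() < p_black:
            streak += 1
        else:
            streak = 0  # a red draw resets the run of blacks
    return draws

random.seed(1)
trials = 100_000
mean_draws = sum(draws_until_two_blacks() for _ in range(trials)) / trials
print(round(mean_draws, 2))  # close to 6
```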

The expected waiting time for a run of black balls of arbitrary length can be computed in the same way.
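For a run of arbitrary length $n$, the same restart argument gives the recursion $E_n=(E_{n-1}+1)/p_B$ with $E_0=0$: after waiting $E_{n-1}$ draws for a run of $n-1$ blacks, the next draw either completes the run or restarts the wait. A small sketch (assuming $p_B=\tfrac12$; the helper name is mine):

```python
def expected_draws(run_length, p_black=0.5):
    """E_n via the recursion E_n = (E_{n-1} + 1) / p_black, with E_0 = 0."""
    e = 0.0
    for _ in range(run_length):
        e = (e + 1) / p_black
    return e

print([expected_draws(n) for n in (1, 2, 3)])  # [2.0, 6.0, 14.0]
```

Note that $n=1$ and $n=2$ reproduce the values $2$ and $6$ found above.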


Edit: I'm not able to come up with a suitable approach. I'm a beginner in probability theory. I tried to apply Bernoulli trials for this but it's not apt, I guess.

It is apt.

You seek the expected count until the first success for independent Bernoulli trials with identical success rate $1/2$.   This count, call it $X$, is a random variable with a geometric distribution supported on $\{1,2,3,\ldots\}$. $$X\sim\mathcal{Geo}_1(1/2)$$

What does this mean?   Well, the expectation of a geometric random variable is well known, so once you've identified the distribution you can simply quote its mean: two.   Otherwise, you can derive it from first principles:

Well, the probability that the first success is encountered on trial #$x$ (for any $x\in\Bbb N^+$) is the probability of $x-1$ consecutive failures followed by a success. $$\begin{align}\mathsf P(X{=}x) ~&= {(1-1/2)}^{x-1}(1/2)\,\mathbf 1_{x\in\Bbb N^+}\\ &= (1/2)^x~\mathbf 1_{x\in\Bbb N^+}\end{align}$$

Likewise, the probability that the first success is encountered after trial #$x$ (for any $x\in \Bbb N$) is the probability of obtaining $x$ consecutive failures: $$\mathsf P(X>x)={(1-1/2)}^{x}\,\mathbf 1_{x\in\Bbb N}$$

Why is this useful?   It is because when we apply the definition of expectation, this happens: $$\begin{align}\mathsf E(X) &= \sum_{x=1}^\infty x\,\mathsf P(X=x) = \tfrac 1{2}+\tfrac 2{2^2}+\tfrac 3{2^3}+\cdots+\tfrac x{2^x}+\cdots \\ &= \sum_{x=1}^\infty\sum_{y=0}^{x-1} \mathsf P(X=x)\\ & = \sum_{y=0}^\infty \sum_{x=y+1}^\infty \mathsf P(X=x) \\ & = \sum_{y=0}^\infty \mathsf P(X>y)\\ &=\sum_{y=0}^\infty \frac 1{2^y} = 1+\tfrac 12+\tfrac 1{2^2}+\tfrac 1{2^3}+\cdots+\tfrac 1{2^y}+\cdots \end{align}$$

Thus the expectation of a geometric random variable is the sum of a geometric series.   You should readily recognise it and find that its closed form is two.
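Numerically, the two expressions for $\mathsf E(X)$ above, the definition $\sum_x x\,\mathsf P(X=x)$ and the tail sum $\sum_y \mathsf P(X>y)$, can be compared with a short sketch (truncating both series, which is safe because the tails decay geometrically):

```python
p = 0.5
terms = 200  # geometric tails beyond this point are negligible
by_definition = sum(x * (1 - p) ** (x - 1) * p for x in range(1, terms + 1))
by_tail_sum = sum((1 - p) ** x for x in range(terms + 1))
print(by_definition, by_tail_sum)  # both are approximately 2.0
```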


PS: You should also notice that $\mathsf E(X)=1+\tfrac 12\mathsf E(X)$; @user gives a nice intuitive explanation of why this is so.