Finding the distribution $P(X_n=k)$ of $k$ heads when throwing a die $n$ times and tossing a coin only when a 4 or a 1 is rolled.


Player $A$ throws a die $n$ times. Whenever a 1 or a 4 is thrown, he tosses an unbiased coin. Let $X_n$ be the random variable counting the total number of heads from these coin tosses. I am trying to find the distribution $P(X_n=k)$ for every $k$, together with its mean and variance (a hint towards the distribution would be good enough, though).

  • I noticed that the number of $4$'s and $1$'s can be counted with a binomial approach, so let $Y$ be the random variable counting the number of throws showing a $4$ or a $1$.
  • Since $X_n$ counts $k$ heads, at least $k$ of the throws must show a $4$ or a $1$.
    • This gives $P(Y \geq k) = \sum\limits_{i = k}^n {n \choose i} \left({2 \over 6}\right)^i \left({4 \over 6}\right)^{n-i}$.
  • My next step was thinking about how to count the number of heads.
    • The naive approach was simply to multiply $P(Y \geq k)$ by $1 \over 2$, but then I see no way to count the heads, since I don't know whether a head or a tail is flipped.
    • The next idea I came up with was to combine this with a binomial distribution again, to count the $k$ heads.

Something along this way: $P(X = k) = \sum\limits_{i = k}^n {n \choose i} \left({2 \over 6}\right)^i \left({4 \over 6}\right)^{n-i} {n \choose k} \left({1 \over 2}\right)^k \left({1 \over 2}\right)^{i-k}$

I am not sure whether this is going in the right direction, and it doesn't seem handy at all. As a next step I would try to apply the binomial theorem, but I can't see how to simplify the expression along this way because I have to shift the $k$ indices.


There are 2 answers below.

Best answer:

You can write $$ X_n = \sum_{i=1}^n Y_i, $$

where $Y_i=1$ if you get a head in the $i$-th trial and $Y_i=0$ otherwise. The probability that $Y_i=1$ is $$P(Y_i=1) = P(Y_i=1\mid 1\text{ or }4\text{ appears on the die})\,P(1\text{ or }4\text{ appears on the die}) = \frac{1}{2}\times\frac{2}{6} = \frac{1}{6}.$$ Thus $\mathbb{E}(X_n)=n/6$. As for the variance: if you assume the trials are independent for each $i$, then $X_n$, being a sum of independent Bernoulli variables, is binomial, that is, $$ P(X_n=k) = {n\choose k}\left(\frac{1}{6}\right)^{k}\left(\frac{5}{6}\right)^{n-k}, $$ for $k=0,1,\ldots,n$. Therefore the variance is $n\times\frac{1}{6}\times\frac{5}{6}=\frac{5n}{36}$. If the trials were not independent, you would also need to compute the covariances, depending on the problem.
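As a quick sanity check (not part of the original answer), here is a short Monte Carlo simulation of the experiment; the function name `simulate_X` and the parameter choices are mine. It estimates the mean and variance of $X_n$, which should land near $n/6$ and $5n/36$:

```python
import random

def simulate_X(n, trials=100_000, seed=0):
    """Monte Carlo estimate of (mean, variance) of X_n:
    roll a fair die n times; on each 1 or 4, toss a fair coin and count heads."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        heads = 0
        for _ in range(n):
            if rng.randint(1, 6) in (1, 4):   # P(1 or 4) = 2/6 = 1/3
                heads += rng.randint(0, 1)    # fair coin: P(head) = 1/2
        counts.append(heads)
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return mean, var

mean, var = simulate_X(n=12)
# Theory predicts E[X_12] = 12/6 = 2 and Var[X_12] = 5*12/36 ≈ 1.667.
```

With $100{,}000$ trials the sampling error of the mean is on the order of $\sqrt{5n/36}/\sqrt{100000}\approx 0.004$, so the empirical values should agree with the theory to two decimal places.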

Second answer:

You have $Y \sim B(n,\frac{1}{3})$. Continuing the calculation your way (which is fairly straightforward):

$P(X_n = k)$

$= \sum\limits_{i = k}^n P(X_n = k|Y=i).P(Y=i)$

$ = \sum\limits_{i = k}^n {i \choose k} \left({1 \over 2}\right)^k \left({1 \over 2}\right)^{i-k}. {n \choose i} \left({1 \over 3}\right)^i \left({2 \over 3}\right)^{n-i}$

$ = \left({1 \over 3}\right)^n \sum\limits_{i = k}^n {i \choose k} \left({1 \over 2}\right)^i . {n \choose i} 2 ^{n-i}$

$ = \left({1 \over 3}\right)^n.{n \choose k} \sum\limits_{i = k}^n {n-k \choose i-k} 2 ^{n-2i}$

$ = \left({1 \over 3}\right)^n.{n \choose k} \sum\limits_{j = 0}^{n-k} {n-k \choose j} 2 ^{n-2j-2k}$ (let $j=i-k$)

$ = \left({1 \over 3}\right)^n. 2^{n-2k}.{n \choose k} \sum\limits_{j = 0}^{n-k} {n-k \choose j} \left({1 \over 4}\right) ^j$

$ = \left({1 \over 3}\right)^n. 2^{n-2k}.{n \choose k} (\frac{5}{4})^{n-k}$

$= {n \choose k} \left(\frac{1}{6}\right)^n.5^{n-k}$

$= {n \choose k} \left(\frac{1}{6}\right)^k.\left(\frac{5}{6}\right)^{n-k}$ (p.m.f. of a Binomial distribution)

$\Rightarrow X_n \sim B(n, \frac{1}{6})$

$\Rightarrow E[X_n] = n\cdot\frac{1}{6}=\frac{n}{6}$ and $ V[X_n] = n\cdot\frac{1}{6}\cdot\frac{5}{6}=\frac{5n}{36}$
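The whole chain of identities above can be verified numerically with exact rational arithmetic (this check is my addition; the helper names `pmf_by_conditioning` and `pmf_binomial` are mine). The conditioning sum on the first line must equal the closed-form Binomial$(n,\frac16)$ pmf on the last line, term for term:

```python
from fractions import Fraction
from math import comb

def pmf_by_conditioning(n, k):
    """P(X_n = k) = sum_{i=k}^{n} P(X_n = k | Y = i) P(Y = i)."""
    half, third = Fraction(1, 2), Fraction(1, 3)
    return sum(
        comb(i, k) * half**k * half**(i - k)            # coin: k heads in i tosses
        * comb(n, i) * third**i * (1 - third)**(n - i)  # die: i rolls showing 1 or 4
        for i in range(k, n + 1)
    )

def pmf_binomial(n, k):
    """Closed form derived above: Binomial(n, 1/6)."""
    p = Fraction(1, 6)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 10
assert all(pmf_by_conditioning(n, k) == pmf_binomial(n, k) for k in range(n + 1))
```

Because `Fraction` keeps every quantity exact, the assertion confirms the algebra exactly rather than to floating-point tolerance.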