(Solved) Tossing n coins independently, but with different probabilities of heads


Question statement:

For each $n \ge 1$ you are given a coin that lands heads with probability $p_n$. The coins are tossed independently. For $m \ge 1$, let $X_m$ denote the number of coins among the first $m$ coins that land heads. (a) Write down the probability mass function of $X_3$. (b) Find $E[X_m]$.

What I've tried:

I tried to write down the probability $P(X_m = h)$ of getting $h$ heads within the first $m$ coins. I expressed it as the product $\prod_{i \in H} p_i \prod_{j \notin H} (1 - p_j)$, where $H$ is a set containing the indices of the tosses that landed heads and $j$ ranges over the indices not in $H$. Seems fine to me (I guess?).

Where I run into a problem is when I have to compute it for $X_3$. Say I'm computing $P(X_3 = 1)$: the single head could occur on any of three tosses (the 1st, 2nd, or 3rd). To account for all three possibilities I'd have to add up the probabilities of the three cases (since they're disjoint), correct? How do I reconcile the abstract formula I made up with the case of a given $h$?

Or is it possible that I'd have to modify my formula for an arbitrary $h$?

P.S.: I have a sneaking suspicion that I'd have to repeat the summing process even in the arbitrary $h$ case, since I don't know in advance which of the tosses return heads.

Edit: I figured out a way to solve part (b) without ever finding $P(X_m = h)$ as an explicit formula: I wrote $X_m$ as the sum of the indicators of the individual tosses and distributed the expectation over those indicators.

1 Answer


For part (a) we have $$ \mathbb P(X_3 = k) = \begin{cases} \prod_{i=1}^3 (1-p_i),& k=0 \\ p_1(1-p_2)(1-p_3)+p_2(1-p_1)(1-p_3)+p_3(1-p_1)(1-p_2),& k=1\\ p_1p_2(1-p_3) + p_1p_3(1-p_2)+p_2p_3(1-p_1),& k=2\\ \prod_{i=1}^3 p_i,& k=3. \end{cases} $$
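The case-by-case formula above can be sanity-checked numerically by summing over all $2^3$ head/tail outcomes. A minimal sketch, using hypothetical probabilities $p_1, p_2, p_3$ chosen only for illustration:

```python
from itertools import product

# Hypothetical example probabilities (not from the problem statement).
p = [0.2, 0.5, 0.7]

# Enumerate all 2^3 head/tail outcomes and accumulate P(X_3 = k),
# where each outcome is a tuple of 0s (tails) and 1s (heads).
pmf = [0.0] * 4
for outcome in product([0, 1], repeat=3):
    prob = 1.0
    for toss, pi in zip(outcome, p):
        prob *= pi if toss else (1 - pi)
    pmf[sum(outcome)] += prob

# Cross-check against the closed-form cases k = 0, 1, 3.
q = [1 - pi for pi in p]
assert abs(pmf[0] - q[0] * q[1] * q[2]) < 1e-12
assert abs(pmf[1] - (p[0]*q[1]*q[2] + p[1]*q[0]*q[2] + p[2]*q[0]*q[1])) < 1e-12
assert abs(pmf[3] - p[0] * p[1] * p[2]) < 1e-12
```

This brute-force sum over outcomes is exactly the "sum over disjoint cases" step the question asks about, just carried out mechanically.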

For part (b) we have $X_m=\sum_{i=1}^m Y_i$ where $Y_i\sim\mathrm{Ber}(p_i)$ are independent. Hence $$ \mathbb E[X_m] = \mathbb E\left[\sum_{i=1}^m Y_i\right] = \sum_{i=1}^m\mathbb E[Y_i] = \sum_{i=1}^m p_i. $$
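Linearity of expectation can be verified the same way: compute $\mathbb E[X_m]$ directly from the joint distribution and compare with $\sum_i p_i$. A small sketch with illustrative probabilities for $m = 4$ coins:

```python
from itertools import product

# Hypothetical probabilities for m = 4 coins (illustrative values only).
p = [0.1, 0.4, 0.6, 0.9]
m = len(p)

# Expectation computed from the full joint distribution:
# E[X_m] = sum over all outcomes of (number of heads) * P(outcome).
expectation = 0.0
for outcome in product([0, 1], repeat=m):
    prob = 1.0
    for toss, pi in zip(outcome, p):
        prob *= pi if toss else (1 - pi)
    expectation += sum(outcome) * prob

# ...which matches the linearity-of-expectation answer sum_i p_i.
assert abs(expectation - sum(p)) < 1e-12
```

Note that no independence is needed for part (b): linearity of expectation holds regardless, which is what makes the indicator decomposition so much easier than working with the full PMF.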