Expected number of coin side changes in a sequence of coin tosses with unfair coin


Suppose we have an unfair coin with probability $p$ for heads and $1-p$ for tails. In a series of $n$ coin tosses, what is the expected number of times that the coin side changes? For instance, if we toss the coin 5 times and the sequence $$THHHT$$ comes up, then the coin side has changed 2 times (from tails to heads at the beginning, and from heads to tails at the end).
If $X$ is the random variable counting the number of side changes, we need $E[X]$. However, I'm really struggling to find the probability of a side change. I tried to use conditionals but had no luck.



BEST ANSWER

Suppose we flipped the coin $n$ times.

In order to help organize our thoughts, let's define several random variables. (With more practice, you can easily skip defining so many random variables, but I think it will be beneficial to help explain what is going on).

Let $H_1,H_2,H_3,\dots,H_n$ be the indicator random variables where $H_i$ takes the value $1$ if the $i$'th coin flip was heads and $0$ otherwise.

Let $T_1,T_2,T_3,\dots,T_n$ be defined similarly for tails. (Note: $T_i = 1 - H_i$.)

Let $X_1,X_2,X_3,\dots,X_{n-1}$ be the indicator random variables where $X_i$ takes the value $1$ if a "coin side change" occurred from the $i$'th flip to the next. (Note: $X_i = H_iT_{i+1}+T_iH_{i+1}$. We stop at $n-1$ because there is no flip after the $n$'th that we could change to.)

Finally, let $X = X_1+X_2+\dots+X_{n-1}$. Recognize that $X$ is precisely the random variable counting the total number of side changes that we were asked to find the expected value of.
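To make the decomposition concrete, here is a small sketch in Python (the function name is mine, not from the answer): $X$ is just the number of adjacent positions where the flip differs from the next one.

```python
# Sketch: counting side changes by summing the indicators X_i defined above,
# where X_i = 1 exactly when flip i differs from flip i+1.
def count_changes(flips):
    """flips is a string like 'THHHT'; returns the number of side changes."""
    return sum(1 for i in range(len(flips) - 1) if flips[i] != flips[i + 1])

print(count_changes("THHHT"))  # 2, matching the example in the question
```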


Now... by our convenient choices of random variables we have $$E[X] = E[X_1+X_2+X_3+\dots+X_{n-1}]$$

Then, from the linearity of expectation this continues further as $$\dots = E[X_1]+E[X_2]+\dots+E[X_{n-1}]$$

Now... again, by our convenient choices of random variables, this expands further as

$$\dots = E[H_1T_2+T_1H_2]+E[H_2T_3+T_2H_3]+\dots+E[H_{n-1}T_n+T_{n-1}H_n]$$

which, by linearity of expectation again and the independence of distinct coin flips (so the expectation of each product factors), becomes

$$\dots = E[H_1]E[T_2]+E[T_1]E[H_2]+E[H_2]E[T_3]+E[T_2]E[H_3]+\dots+E[T_{n-1}]E[H_n]$$

Finally, since the expected value of an indicator random variable is precisely the probability of the event it indicates, this all simplifies to:

$$\dots = p(1-p)+(1-p)p+p(1-p)+(1-p)p+\dots+p(1-p)+(1-p)p$$

Collecting like terms (there are $n-1$ pairs, each contributing $p(1-p)+(1-p)p$) simplifies to the final answer of:

$$E[X] = 2(n-1)p(1-p)$$
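As a sanity check on this formula, here is a quick Monte Carlo sketch (names and parameters are mine, not from the answer): simulate many runs of $n$ biased flips, average the number of side changes, and compare with $2(n-1)p(1-p)$.

```python
import random

def simulate_changes(n, p, trials=200_000, seed=0):
    """Monte Carlo estimate of E[X]: average number of side changes over many runs."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        flips = [rng.random() < p for _ in range(n)]  # True = heads, with probability p
        total += sum(flips[i] != flips[i + 1] for i in range(n - 1))
    return total / trials

n, p = 10, 0.3
print(simulate_changes(n, p))      # close to the exact value below
print(2 * (n - 1) * p * (1 - p))   # 3.78
```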

It is worth emphasizing that $X_i$ and $X_j$ need not be independent of each other, but the strength of linearity of expectation is that this doesn't matter! A sum inside an expected value can be turned into a sum of expected values regardless of the dependence or independence of the random variables involved. The only independence we relied on was between the individual coin flips, which by the very nature of coin flips we know to be independent (or more accurately, we always model the situation mathematically as if they are).


The probability that a given flip is $Heads$ and the next is $Tails$ is $P[Heads]\cdot P[Tails]$, and the probability of a change from $Tails$ to $Heads$ is $P[Tails]\cdot P[Heads]$.

So the probability that the coin changes side at any given flip (other than the first) is the sum of these two probabilities:

$\mathbb{E}[\text{switch}] = p(1-p)+(1-p)p = 2p(1-p)$.

Since expectation is linear and there are $n-1$ adjacent pairs of flips, the expected number of switches from $Heads$ to $Tails$ or $Tails$ to $Heads$ when flipping the coin $n$ times is $2p(1-p)(n-1)$.
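For small $n$ the expectation can also be computed exactly by brute force, which confirms the same formula (this enumeration sketch is mine, not from the answer): enumerate all $2^n$ sequences, weight each by its probability, and average the number of switches.

```python
from itertools import product

def exact_expected_changes(n, p):
    """Exact E[X] by enumerating all 2^n sequences, weighted by probability."""
    total = 0.0
    for seq in product((True, False), repeat=n):  # True = heads
        prob = 1.0
        for flip in seq:
            prob *= p if flip else 1 - p
        changes = sum(seq[i] != seq[i + 1] for i in range(n - 1))
        total += prob * changes
    return total

n, p = 6, 0.3
print(exact_expected_changes(n, p))  # ≈ 2.1, i.e. 2*(n-1)*p*(1-p)
```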