Sum of $n$ coin flips, where the probability of heads at experiment $t$ depends on the outcome of the experiment at $t-1$


Suppose I play a game of $n$ coin flips, where heads is $1$ and tails is $0$. If each coin flip were independent, the expected sum of all $n$ coin flips would be trivial. What if there is dependence? How can this be solved?

A general solution (if any) would be best, but let's construct a precise game just for simplicity. Let $\{X_1, X_2, \dots, X_n\}$ be a sequence of $n$ coin flips, where:

Writing $q_t$ for the (random, history-dependent) heads probability at step $t$:

$$ q_t = \begin{cases} q_{t-1}^2 & \text{if } X_{t-1} = 1,\\[2pt] \frac{1}{2}\,q_{t-1} & \text{if } X_{t-1} = 0, \end{cases} \qquad P(X_t = 1 \mid X_1, \dots, X_{t-1}) = q_t, $$ with base case:

$$ \begin{cases} P(X_1 = 1) = p\\ P(X_1 = 0) = 1 - p \end{cases} $$

In other words, the heads probability evolves with the outcomes: if the flip at time $t-1$ came up tails, the flip at time $t$ is heads with half the previous heads probability; if it came up heads, the previous heads probability is squared.
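As a sanity check on any closed-form reasoning, the game is easy to simulate. Below is a minimal Monte Carlo sketch (function names are mine, not from the question): `q` tracks the current heads probability, squared after a head and halved after a tail, exactly as the rule above describes.

```python
import random

def simulate_game(n, p, rng):
    """Play n dependent flips once; return the number of heads."""
    q = p          # current heads probability, starts at the base case p
    total = 0
    for _ in range(n):
        x = 1 if rng.random() < q else 0
        total += x
        q = q * q if x == 1 else q / 2   # update rule from the question
    return total

def estimate_expected_sum(n, p, trials=100_000, seed=0):
    """Monte Carlo estimate of E[X_1 + ... + X_n]."""
    rng = random.Random(seed)
    return sum(simulate_game(n, p, rng) for _ in range(trials)) / trials
```

For example, `estimate_expected_sum(1, 0.5)` should come out near $0.5$, since the first flip is just a Bernoulli($p$) trial.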

How does one go about reasoning about these kinds of stochastic processes?
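One concrete way to reason about it: by linearity of expectation, $E[\sum_t X_t] = \sum_t P(X_t = 1)$, and each term can be computed by conditioning on the outcome of the previous flip. A head (probability $q$) squares the current heads probability, a tail halves it, which gives a direct recursion over the tree of outcomes. A sketch (exponential in $n$, so only for small $n$; the function name is mine):

```python
def expected_heads(n, p):
    """Exact E[X_1 + ... + X_n] by conditioning on each flip in turn.

    f(t, q) = expected number of heads among flips t..n, given that the
    current heads probability is q. A head (prob q) contributes 1 and
    squares q; a tail (prob 1-q) halves q.
    """
    def f(t, q):
        if t > n:
            return 0.0
        return q * (1 + f(t + 1, q * q)) + (1 - q) * f(t + 1, q / 2)
    return f(1, p)
```

For instance, with $n=2$ and $p=\tfrac12$: $P(X_2=1) = \tfrac12\cdot(\tfrac12)^2 + \tfrac12\cdot\tfrac14 = \tfrac14$, so the expected sum is $\tfrac12 + \tfrac14 = 0.75$, which the recursion reproduces.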