Given two coins with a transition matrix between them:
$\begin{bmatrix} 1-\alpha&\alpha \\ \beta&1-\beta \end{bmatrix}$
Here the first coin has probability of heads $p$ and tails $1-p$, and the second coin has probability of heads $q$ and tails $1-q$. Suppose I'm equally likely to start at either coin, and the procedure is: flip the coin I'm at, apply the transition matrix, and repeat. How do I go about finding the probability of a given string of heads and tails of length $n$?
I can draw a branching diagram, but beyond a couple of flips it gets completely out of hand. I know there's probably a formula for this; can anyone tell me what it is? Thanks.
Also, I've now read in several places online that the probability that a given sequence of observations $O$ is generated by a particular model $\lambda$ is given by $P(O\;|\;\lambda)$, but shouldn't it be $P(\lambda\;|\;O)$?
Let $w$ denote any word and $x$ any letter in the alphabet $\{h,t\}$, let $p_i(w)$ be the probability of observing $w$ with the last coin used being $i$, and let $e_i(x)$ be the probability that coin $i$ produces $x$. The probability of observing $w$ is $p_1(w)+p_2(w)$. Furthermore, $$p_1(wx)=[p_1(w)(1-\alpha)+p_2(w)\beta]e_1(x),\quad p_2(wx)=[p_1(w)\alpha+p_2(w)(1-\beta)]e_2(x). $$ The parameters $e_1(x)$ and $e_2(x)$ for $x$ in $\{h,t\}$ are known, as are $\alpha$ and $\beta$. The initial conditions are $p_1(x)=\frac12e_1(x)$ and $p_2(x)=\frac12e_2(x)$.
Hence, for a word $W$ of length $N$, after $2N$ updates, storing only $4$ fixed parameters $e_1(h)$, $e_2(h)$, $\alpha$ and $\beta$, and $2$ running values $p_1(w)$ and $p_2(w)$ for the prefixes $w$ of $W$, one gets $p_1(W)$ and $p_2(W)$, hence the probability $p_1(W)+p_2(W)$ of observing $W$.
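For concreteness, here is a minimal Python sketch of this recursion (the function name and argument order are my own; `p` and `q` are the heads probabilities of the two coins, as in the question):

```python
def sequence_probability(word, p, q, alpha, beta):
    """Probability of observing `word`, a string over {'h', 't'},
    under the two-coin hidden Markov chain described above.

    p, q:        P(heads) for coin 1 and coin 2.
    alpha, beta: P(switch 1 -> 2) and P(switch 2 -> 1).
    """
    e1 = {'h': p, 't': 1 - p}  # emission probabilities of coin 1
    e2 = {'h': q, 't': 1 - q}  # emission probabilities of coin 2

    # Initial conditions: equally likely to start at either coin.
    x = word[0]
    p1, p2 = 0.5 * e1[x], 0.5 * e2[x]

    # One recursion step per remaining letter, updating both running values.
    for x in word[1:]:
        p1, p2 = ((p1 * (1 - alpha) + p2 * beta) * e1[x],
                  (p1 * alpha + p2 * (1 - beta)) * e2[x])

    return p1 + p2
```

As a sanity check, summing this over all $2^N$ words of length $N$ should give $1$, and with $\alpha=\beta=0$ the result reduces to the average of the two i.i.d. coin probabilities.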