Can two nodes in a Markov chain have transitions that don't total 1?


In all the Markov diagrams I see, the transition probabilities leaving any given state always total one.

Just one of many examples: this image from this website.

Markov Chain

I've also seen examples that don't have values, but have variables. For example this diagram:

Markov Diagram

What I'm wondering is: if $P_{AB}$ is the only path from $A$ to $B$, does $P_{AB}$ have to be $1$?

Best answer:

The transition probabilities from any node to the other nodes have to sum to $1$. This holds even when a state has only one outgoing transition, so if $P_{AB}$ is the only path out of $A$, then indeed $P_{AB}$ must equal $1$.

In what follows, let $A^n$ denote the state of the chain at step $n$, and let $A_{i}^{n+1}$, for $i = 1, 2, \dots, m$, denote the possible states at step $n+1$. Suppose also that $P(A^n) > 0$. Then $$\sum_{i=1}^{m}P(A_{i}^{n+1}\mid A^n)=\sum_{i=1}^{m}\frac{P(A_{i}^{n+1}\cap A^n)}{P(A^n)}=\frac{1}{P(A^n)}\sum_{i=1}^{m}P(A_{i}^{n+1}\cap A^n)=\frac{P(A^n)}{P(A^n)}=1.$$
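The row-sum property above is easy to check numerically. Below is a minimal sketch (the three-state chain and its probabilities are made up for illustration) in which state $A$ has a single outgoing transition, so that entry must carry probability $1$:

```python
import numpy as np

# Hypothetical 3-state chain over states A, B, C.
# Row i lists the transition probabilities out of state i.
P = np.array([
    [0.0, 1.0, 0.0],   # A -> B is the only path out of A, so P_AB = 1
    [0.5, 0.0, 0.5],   # B splits its probability between A and C
    [0.2, 0.3, 0.5],   # C can move to any state
])

# Every row of a valid transition matrix must sum to 1.
row_sums = P.sum(axis=1)
print(row_sums)  # -> [1. 1. 1.]
assert np.allclose(row_sums, 1.0)
```

If the single edge out of $A$ carried any value other than $1$, the first row sum would differ from $1$ and the matrix would not describe a Markov chain.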