Is the sum of any column of the transition matrix of a Markov chain also equal to 1?


Chapter 7 of the book "Introduction to Probability, 2nd Edition" introduces this figure:

[Figure from the book: the transition probability matrix; the red rectangle is my own annotation.]

Taking $i = 1$ in the formula

$\sum_{j=1}^m p_{ij} = 1, \quad \text{for all } i$

gives

$\sum_{j=1}^m p_{1j} = 1$

The interpretation: given that the current state is 1, the probabilities that the next state equals $j$ (for $j = 1, 2, \ldots, m$) must sum to 1.

$p_{11}$ represents a transition from state 1 (time instant 1) to state 1 (time instant 2),

$p_{12}$ represents a transition from state 1 (time instant 1) to state 2 (time instant 2),

...

$p_{1m}$ represents a transition from state 1 (time instant 1) to state $m$ (time instant 2),

and the sum of all of these is $p_{11} + p_{12} + \cdots + p_{1m} = 1$.

The question is: what about the columns?

I guess $p_{11} + p_{21} + \cdots + p_{m1}$ is also equal to 1. Is it?


Best answer:

Keep reading; you'll find this example later in the book "Introduction to Probability, 2nd Edition":

"Alice is taking a probability class and in each week, she can be either up-to-date or she may have fallen behind. If she is up-to-date in a given week, the probability that she will be up-to-date (or behind) in the next week is 0.8 (or 0.2, respectively). If she is behind in the given week, the probability that she will be up-to-date (or behind) in the next week is 0.6 (or 0.4, respectively). We assume that these probabilities do not depend on whether she was up-to-date or behind in previous weeks, so the problem has the typical Markov chain character (the future depends on the past only through the present)..."

And the key is here:

"... let us introduce states 1 and 2, and identify them with being up-to-date and behind, respectively..." Then the transition probability matrix is

$$ \begin{bmatrix} 0.8 & 0.2 \\ 0.6 & 0.4 \\ \end{bmatrix} $$

The sum of each row is 1, but the column sums are not: the first column sums to $0.8 + 0.6 = 1.4$ and the second to $0.2 + 0.4 = 0.6$. Each row must sum to 1 because, from any given state, the chain must move to *some* state at the next step; there is no corresponding constraint on how much probability flows *into* a state, so column sums can be anything.
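As a quick numerical check, here is a small sketch using the Alice matrix quoted above (plain Python, no libraries assumed):

```python
# Transition matrix from the Alice example:
# state 1 = up-to-date, state 2 = behind.
P = [[0.8, 0.2],
     [0.6, 0.4]]

# Each row sums over all possible next states, so rows sum to 1.
row_sums = [sum(row) for row in P]

# Columns sum incoming probabilities from each state; no constraint applies.
col_sums = [sum(col) for col in zip(*P)]

print("row sums:", row_sums)  # both rows sum to 1
print("col sums:", col_sums)  # roughly 1.4 and 0.6, not 1
```

A matrix whose columns also sum to 1 is called doubly stochastic; that is a special property, not something guaranteed by the Markov chain definition.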