Why is this model not a Markov chain?


I'm new to probability models.

Taken from Ross' Introduction to Probability Models 11th ed:

"Example 4.4 (Transforming a Process into a Markov Chain)

Suppose that whether or not it rains today depends on previous weather conditions through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2. If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (why not?)."

I know that a Markov chain as defined in the book must satisfy the Markov property, i.e. given the present, the future is independent of the past, and that there is a fixed probability of going from state $i$ to state $j$, but I can't seem to connect this to the example. Any help?


My interpretation is this: if the state records only whether it rains on a single day, the Markov property fails, because the probability of rain tomorrow given rain today is $0.7$ when it also rained yesterday but $0.5$ when it did not; the future is not independent of the past given the present. However, you can transform the process into a Markov chain by enlarging the state to record the weather on two consecutive days $(\text{rain \& rain}, \text{rain \& no rain}, \dots)$, giving four states and a $4 \times 4$ transition matrix.
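For concreteness, here is a sketch (in Python/NumPy, my choice of tooling) of the enlarged four-state chain, with the transition probabilities taken directly from the numbers quoted in the question:

```python
import numpy as np

# Enlarged state: (yesterday's weather, today's weather).
# State 0: rain yesterday and today        (RR)
# State 1: rain today, not yesterday       (NR)
# State 2: rain yesterday, not today       (RN)
# State 3: no rain yesterday or today      (NN)
P = np.array([
    [0.7, 0.0, 0.3, 0.0],  # from RR: rain tomorrow w.p. 0.7 -> RR, else -> RN
    [0.5, 0.0, 0.5, 0.0],  # from NR: rain tomorrow w.p. 0.5 -> RR, else -> RN
    [0.0, 0.4, 0.0, 0.6],  # from RN: rain tomorrow w.p. 0.4 -> NR, else -> NN
    [0.0, 0.2, 0.0, 0.8],  # from NN: rain tomorrow w.p. 0.2 -> NR, else -> NN
])

# Each row is a probability distribution over tomorrow's state.
assert np.allclose(P.sum(axis=1), 1.0)
print(P)
```

Note that each row depends only on the current (two-day) state, which is exactly what makes the enlarged process Markov while the one-day process is not.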