How do you define the states in a Markov chain?


For word problems, I'm having trouble deciding what the states of a Markov chain should be when they aren't obvious. Is there a general procedure for identifying the states?

Here is an example where the states aren't obvious to me:

[image of the problem statement: a problem about a sequence of exams of three types]


Best answer:

Here the states of the Markov chain are the types of exams (so there are three states). You can tell because each time we move to a new exam according to a probability rule that depends only on the exam we just had, which is exactly the Markov property.
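To make this concrete, here is a minimal sketch of such a three-state chain in Python. The state names and the transition matrix `P` are hypothetical placeholders, since the actual probabilities are given in the problem's image; the point is only the structure: each row of `P` is the distribution of the next exam type given the current one.

```python
import random

# Hypothetical states and transition probabilities (the real numbers
# would come from the problem statement). P[i][j] is the probability
# that an exam of type j follows an exam of type i.
states = ["easy", "medium", "hard"]
P = [
    [0.2, 0.5, 0.3],  # after an "easy" exam
    [0.4, 0.3, 0.3],  # after a "medium" exam
    [0.5, 0.4, 0.1],  # after a "hard" exam
]

def next_state(current: int) -> int:
    """Sample the next exam type. The distribution depends only on the
    current state -- that is the Markov property."""
    return random.choices(range(len(states)), weights=P[current])[0]

def simulate(start: int, steps: int, seed: int = 0) -> list:
    """Simulate a path of exam types starting from `start`."""
    random.seed(seed)
    s, path = start, [states[start]]
    for _ in range(steps):
        s = next_state(s)
        path.append(states[s])
    return path

print(simulate(start=0, steps=5))
```

Once you can write the problem in this form (a finite set of states plus a transition rule that looks only at the current state), you have found your Markov chain.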