For word problems, I'm having trouble defining what a state should be in a Markov chain when the states aren't obvious. Are there any general procedures to follow in order to identify the states?
An example I'm having trouble with is one like the following, where to me the states aren't obvious.

Here the states of the Markov chain are the types of exams (so there are three states). You can tell because each time we transition to a new exam according to a probability rule that depends only on the exam we just had.
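To make this concrete, here is a minimal sketch of that three-state chain. The state names ("easy", "medium", "hard") and the transition probabilities are made up for illustration, since the original problem's numbers aren't shown; the point is just that the next exam is sampled from a distribution determined entirely by the current exam.

```python
import random

# Hypothetical transition probabilities between three exam types.
# Each row is the distribution over the NEXT exam, given the current one.
P = {
    "easy":   {"easy": 0.2, "medium": 0.5, "hard": 0.3},
    "medium": {"easy": 0.4, "medium": 0.3, "hard": 0.3},
    "hard":   {"easy": 0.6, "medium": 0.3, "hard": 0.1},
}

def next_exam(current):
    """Sample the next exam type given only the current one (Markov property)."""
    states = list(P[current])
    weights = [P[current][s] for s in states]
    return random.choices(states, weights=weights, k=1)[0]

# Simulate a short sequence of exams starting from "easy".
state = "easy"
chain = [state]
for _ in range(5):
    state = next_exam(state)
    chain.append(state)
print(chain)
```

Notice that `next_exam` takes only the current state as input: that's the defining feature of a Markov chain, and it's also a good heuristic for identifying states in a word problem. Ask "what is the minimum information about the present that determines the distribution of the next step?" Whatever that information is, that's your state.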