A question about random walk in Markov chain


Would someone please explain to me what the highlighted sentence means? I don't understand what "even" and "odd" mean here. I would appreciate it if you could help me with an illustrative picture.

[image: the highlighted passage from the text]


To see this intuitively, take some examples: start from $0$ and try to return to it. Can you do it in $1$ step? No, you need at least $2$. Can you do it in $3$ steps? No: each step changes your position by $\pm 1$, so after an odd number of steps your position is an odd number, which cannot be $0$. Generalize this and you will see that you need an even number of steps to return to $0$ starting from it.
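A quick simulation makes the parity argument concrete. This is a small sketch (the function name `return_time` and the step budget are my own choices, not from the text): it runs a simple symmetric random walk from $0$ and records the first return time, and every observed return time is even.

```python
import random

def return_time(max_steps=10_000, rng=random.Random(0)):
    """Run a simple symmetric random walk started at 0 and return
    the number of steps at which it first returns to 0, or None if
    it does not return within max_steps."""
    position = 0
    for step in range(1, max_steps + 1):
        position += rng.choice((-1, 1))  # every step moves by +1 or -1
        if position == 0:
            return step
    return None

# Every return time we observe is even, because the position's
# parity flips on each step and 0 is even.
times = [t for _ in range(1000) if (t := return_time()) is not None]
print(all(t % 2 == 0 for t in times))  # True
```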


This would be different if the chain could remain in a state with positive probability (a self-loop). But here, in each step you have to move to another state.
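To illustrate the contrast, here is a sketch of a "lazy" variant (my own construction, not from the text) in which the chain stays put with probability $1/2$; a stay at $0$ counts as a return, so odd return times now occur.

```python
import random

def lazy_return_time(stay_prob=0.5, max_steps=10_000, rng=random.Random(1)):
    """Lazy walk: with probability stay_prob the chain stays in place,
    otherwise it moves by +/-1. Returns the first time it is back at 0,
    or None if that does not happen within max_steps."""
    position = 0
    for step in range(1, max_steps + 1):
        if rng.random() >= stay_prob:       # move only sometimes
            position += rng.choice((-1, 1))
        if position == 0:                   # staying at 0 is a return
            return step
    return None

# With a self-loop, returning to 0 in an odd number of steps
# (even in a single step) is possible.
times = [t for _ in range(1000) if (t := lazy_return_time()) is not None]
print(any(t % 2 == 1 for t in times))  # True
```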