Strong Markov property vs. the usual Markov property


I was trying to understand the difference between the strong Markov property and the usual Markov property for a chain with a discrete (countable) set of states. I think I understand why the strong Markov property implies the usual one: we just consider a deterministic stopping time $T(\omega)=t_0$, right?
But what would be a simple example (like coin toss, dice toss,...) where we have the Markov property but not the strong Markov property?
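To make the reduction in the question precise, here is one standard statement of the strong Markov property for a discrete-time chain (following the usual textbook formulation; the notation $p_{ij}$ for the transition probabilities is mine):

$$
\Pr\big(X_{T+1}=j \,\big|\, X_T=i,\ T<\infty,\ X_0,\dots,X_{T-1}\big) = p_{ij}
\quad\text{for any stopping time } T.
$$

Taking $T(\omega)\equiv t_0$, which is trivially a stopping time, this becomes
$$
\Pr\big(X_{t_0+1}=j \,\big|\, X_{t_0}=i,\ X_0,\dots,X_{t_0-1}\big) = p_{ij},
$$
which is exactly the ordinary Markov property at time $t_0$.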


Best answer:

Any discrete-time Markov chain has the strong Markov property; the same holds for a continuous-time chain with a stopping time that takes only countably many values. This is not too hard to prove: condition on each particular realisation of the stopping time. You can read about the strong Markov property in James Norris's book *Markov Chains*, in particular Section 1.4 -- this is for discrete time. As you point out, strong Markov implies Markov by taking the stopping time deterministic. So in discrete time you can't come up with a chain that has one property but not the other :)
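The discrete-time claim can be checked empirically: run a chain until a stopping time $T$ (here, the first hitting time of a state), then compare the law of the next few steps with that of a fresh chain started from the hit state. The two-state transition matrix below is an arbitrary example of mine, not from the answer.

```python
import random
from collections import Counter

# Hypothetical two-state chain on {0, 1}; row i gives P(i -> 0), P(i -> 1).
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}

def step(state, rng):
    """Take one step of the chain from `state`."""
    return 0 if rng.random() < P[state][0] else 1

def step_path(state, n, rng):
    """Record the next n states of the chain started at `state`."""
    path = []
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return tuple(path)

def path_after_stopping_time(rng, n_steps=3):
    """Start at 0, run until the first hitting time T of state 1
    (a stopping time), then record the path of the next n_steps."""
    state = 0
    while state != 1:          # T = inf{n : X_n = 1}
        state = step(state, rng)
    return step_path(state, n_steps, rng)

rng = random.Random(0)
N = 100_000

# Empirical law of (X_{T+1}, ..., X_{T+3}) given X_T = 1 ...
post_T = Counter(path_after_stopping_time(rng) for _ in range(N))
# ... versus the law of three steps of a fresh chain started at 1.
fresh = Counter(step_path(1, 3, rng) for _ in range(N))

# Strong Markov property: the two distributions agree up to Monte Carlo error.
for path in sorted(post_T):
    print(path, post_T[path] / N, fresh[path] / N)
```

The printed frequencies match within sampling noise, illustrating that the chain "restarts" at the stopping time exactly as it would from a fixed state.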

For continuous time, it is more subtle. These lecture notes give a counterexample -- Example 167. (The full lecture notes can be found here.)