Understanding the martingale property


My lecture notes say that "The idea of the martingale property is that, on average, the Markov chain stays where it is and for this to be true, the chain must stay where it is all the time (i.e. be in an absorbing state) or be able to move in both directions. This shows that, for a martingale on the state space $$S = \{0, \dots, d\},$$ the states $0$ and $d$ must be absorbing."

Intuitively, I understand why the states $0$ and $d$ must be absorbing.

However, my lecture notes then go on to say that "For a more formal demonstration, note that the martingale property shows that $$\sum^d_{y = 0} yP(0, y) = 0$$ so that $$P(0, 1) = P(0, 2) = \dots = P(0, d - 1) = P(0, d) = 0$$ and we see that state $0$ is absorbing. A similar argument shows that state $d$ is absorbing."
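For concreteness, here is a small numerical check (my own, not from the notes) that a standard example — the simple symmetric random walk on $\{0, \dots, d\}$ with absorbing barriers — satisfies both the martingale property and the absorbing-endpoint claim:

```python
# Illustrative check (not from the lecture notes): build the transition
# matrix of a simple symmetric random walk on {0, ..., d} with absorbing
# barriers, then verify the martingale property sum_y y * P(x, y) = x
# for every state x, and that states 0 and d are absorbing.
d = 4
P = [[0.0] * (d + 1) for _ in range(d + 1)]
P[0][0] = 1.0          # state 0 stays at 0
P[d][d] = 1.0          # state d stays at d
for x in range(1, d):  # interior states step up or down with prob 1/2
    P[x][x - 1] = 0.5
    P[x][x + 1] = 0.5

# Martingale property: the expected next state equals the current state.
for x in range(d + 1):
    expected = sum(y * P[x][y] for y in range(d + 1))
    assert abs(expected - x) < 1e-12

# The endpoints are absorbing.
assert P[0][0] == 1.0 and P[d][d] == 1.0
print("martingale property holds; states 0 and d are absorbing")
```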

This is the part that I do not quite understand yet.

Thus, my question is: to show that state $d$ is absorbing, I can get from $$\sum^d_{y = 0} yP(d, y) = d$$ to $$P(d, 1) + 2P(d, 2) + \dots + (d - 1)P(d, d - 1) + dP(d, d) = d,$$ but how do I conclude that $$P(d, 0) = P(d, 1) = \dots = P(d, d - 2) = P(d, d - 1) = 0?$$

I am only taking an introductory module in stochastic processes, so any explanations to the mathematical proof will be greatly appreciated :)

There is 1 answer below.


If we're in state $d$, we can only stay at $d$ or move to some value less than $d$ — that is, the probability of increasing is $0$. But this is a martingale, so the expected displacement must be zero. Since every move other than staying put strictly decreases the value, assigning any positive probability to such a move would make the expected next state strictly less than $d$. Hence the probability of leaving $d$ must also be zero, and the state is absorbing.
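Spelling out the algebra, using the notation from the question: subtract $d = d\sum_y P(d, y)$ (the rows of a transition matrix sum to $1$) from the martingale condition,

```latex
\sum_{y=0}^{d} y\,P(d, y) = d = d\sum_{y=0}^{d} P(d, y)
\quad\Longrightarrow\quad
\sum_{y=0}^{d} (d - y)\,P(d, y) = 0.
```

Every term $(d - y)\,P(d, y)$ is nonnegative, so a sum of zero forces each term to vanish; for $y < d$ we have $d - y > 0$, hence $P(d, y) = 0$ for all $y < d$, and therefore $P(d, d) = 1$. The argument for state $0$ is the same with the inequality reversed: in $\sum_{y=0}^{d} y\,P(0, y) = 0$ every term is already nonnegative, so $P(0, y) = 0$ for all $y > 0$.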