Why call the states of a Markov chain positive recurrent and null recurrent?


If a Markov chain is recurrent, each of its states will, with probability 1, be visited infinitely often as the chain runs forever. That name makes sense: the event of visiting the state keeps occurring, i.e., recurring.

Now, recurrent states are further split into two kinds. If the expected number of steps between two successive visits to a state is finite, the state is called positive recurrent. If that expectation is infinite, it's called null recurrent.
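As a minimal sketch of the "finite expected return time" definition, here is a Python simulation of a two-state chain (the chain, and the parameters `a`, `b`, are illustrative choices, not from the question). Such a chain is positive recurrent, and Kac's formula says the mean return time to a state equals the reciprocal of its stationary probability, which the simulation should roughly reproduce:

```python
import random

def expected_return_time(a, b):
    """Analytic mean return time to state 0 for the two-state chain
    with P(0->1) = a and P(1->0) = b, via Kac's formula E[T_0] = 1/pi_0."""
    pi0 = b / (a + b)          # stationary probability of state 0
    return 1.0 / pi0

def simulate_return_time(a, b, rng):
    """One excursion: start at state 0, count steps until the chain is back at 0."""
    state, steps = 0, 0
    while True:
        if state == 0:
            state = 1 if rng.random() < a else 0
        else:
            state = 0 if rng.random() < b else 1
        steps += 1
        if state == 0:
            return steps

a, b = 0.3, 0.2
rng = random.Random(0)
n = 100_000
empirical = sum(simulate_return_time(a, b, rng) for _ in range(n)) / n

print(expected_return_time(a, b))   # 2.5  (= (a + b) / b)
print(empirical)                    # sample mean, close to 2.5
```

For a null recurrent chain (e.g., the symmetric random walk on the integers), every excursion still returns with probability 1, but the same empirical average would keep drifting upward as `n` grows instead of settling near a finite value.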

First, it's relatively rare for something to be a valid probability distribution and yet not have a mean. So why coin a special name for this phenomenon? For instance, I don't think we have any special name for distributions in general that lack a first moment. There must be a reason this phenomenon was deemed important enough, in the context of Markov chains, to be given a name.

Second, it isn't immediately clear why these particular names were chosen. Why "positive" and "null"?