How do I say that an infinite-state Markov chain is positive recurrent?


I ran into this Markov chain in my research, and I can't figure out the condition under which it is positive recurrent.

This is a brief scenario of my research:

There is a buffer. This buffer can keep 'unresolved symbols', and they are taken away from the buffer when they are resolved.

At any discrete time instant,

With probability $p^2$, X unresolved symbols are added to the buffer.

With probability $2p(1-p)$, Y number of unresolved symbols in the buffer are resolved. In other words, they are taken away from the buffer.

With probability $(1-p)^2$, Z number of unresolved symbols in the buffer are resolved.

If Y or Z exceeds the number of unresolved symbols in the buffer at that time, all the unresolved symbols are taken away from the buffer.

So, here I have a specific example: X = 1, Y = 1, Z = 3.

In the Markov chain that I have for this example, the transition probabilities are as follows:

For i = 0, 1, ...

From state i to state i+1: $p^2$

For i = 2, 3, ...

From state i to state i-1: $2p(1−p)$

For i = 3, 4, ...

From state i to state i-3: $(1−p)^2$

Other than these transition probabilities,

From state 1 to 0: $2p(1-p) + (1-p)^2$

From state 2 to 0: $(1−p)^2$

All remaining probability mass goes to self-transitions (e.g., state 0 stays at 0 with probability $2p(1-p) + (1-p)^2$, since there is nothing to resolve).

These transition probabilities will differ depending on the values of X, Y, and Z.
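To make sure I have the transitions right, here is a small script I used as a sanity check (the function name and the truncation level N are my own choices, not part of the model): it builds the transition matrix for X = 1, Y = 1, Z = 3 on a truncated state space and confirms it matches the probabilities listed above.

```python
def transition_matrix(p, N, X=1, Y=1, Z=3):
    """Row-stochastic matrix P[i][j] = P(i -> j), truncated at state N."""
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    for i in range(N + 1):
        # with prob p^2, X new unresolved symbols arrive (capped at N)
        P[i][min(i + X, N)] += p * p
        # with prob 2p(1-p), Y symbols are resolved (at most i of them)
        P[i][max(i - Y, 0)] += 2 * p * (1 - p)
        # with prob (1-p)^2, Z symbols are resolved (at most i of them)
        P[i][max(i - Z, 0)] += (1 - p) ** 2
    return P

P = transition_matrix(p=0.5, N=10)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)  # rows sum to 1
print(P[1][0])  # 2p(1-p) + (1-p)^2 = 0.75 when p = 0.5
print(P[2][0])  # (1-p)^2 = 0.25 when p = 0.5
```

For i = 1 and i = 2 the two "resolve" branches collapse onto state 0, which reproduces the special cases listed above.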

I'd like an easy way to prove whether a given infinite-state Markov chain is positive recurrent. I know there is a theorem saying that an irreducible Markov chain is positive recurrent if and only if the following system has a solution:

$\sum_{i=0}^{\infty} \pi_{i} = 1$

$\pi_i = \sum_{j} \pi_j p_{ji}$

where $\pi_i$ is the stationary probability of state $i$ and $p_{ji}$ is the transition probability from state $j$ to state $i$.

But solving this system from scratch every time I have an infinite-state Markov chain doesn't seem practical, since X, Y, and Z are variables.
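One workaround I've tried (entirely my own sketch, not a proof): truncate the chain at a large N and solve $\pi = \pi P$ numerically. If the chain is positive recurrent, the truncated solution should stabilise as N grows; the truncation level and helper names below are my own choices.

```python
import numpy as np

def stationary(p, N, X=1, Y=1, Z=3):
    """Stationary distribution of the chain truncated at state N."""
    P = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        P[i, min(i + X, N)] += p * p
        P[i, max(i - Y, 0)] += 2 * p * (1 - p)
        P[i, max(i - Z, 0)] += (1 - p) ** 2
    # Solve pi (P - I) = 0 together with the normalisation sum(pi) = 1
    # as an overdetermined least-squares system.
    A = np.vstack([(P - np.eye(N + 1)).T, np.ones(N + 1)])
    b = np.zeros(N + 2)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = stationary(p=0.5, N=200)
print(pi[:4])    # mass concentrates near state 0 when p < 3/4
print(pi.sum())  # ~ 1.0
```

This only gives numerical evidence for a particular (p, X, Y, Z), of course, not the general condition I'm after.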

So if there is an easy way to check whether a given Markov chain is positive recurrent, it would be extremely handy.

Thanks!

P.S. Sorry that I've been very unclear in asking this question. I've learned a lot since I posted it; even I wouldn't understand this question if I read it without knowing what's really behind it. Thanks all. Really appreciate it.

Best Answer

For i = 0, 1, ...

From state i to state i+1: $p^2$

For i = 1, 2, ...

From state i to state i-1: $2p(1-p)$

For i = 3, 4, ...

From state i to state i-3: $(1-p)^2$

Other than these transition probabilities,

From state 1 to 0 and from state 2 to 0: $(1-p)^2$

All other transition probabilities: self-transitions.

A stationary distribution $(\pi_i)$, if it exists, must satisfy, at least for $i\geqslant3$, $$\pi_i=p^2\pi_{i-1}+2p(1-p)\pi_{i+1}+(1-p)^2\pi_{i+3}.$$ These are not the identities in your post.

If the question is how to solve the identities that every stationary distribution must satisfy, I suggest starting from the correct ones above, adding those corresponding to "small" values of $i$, and proceeding from there.

If the question is to determine when the Markov chain is positive recurrent, note that the mean drift of the chain at every $i\geqslant3$ is $$1\cdot p^2+(-1)\cdot2p(1-p)+(-3)\cdot(1-p)^2=4p-3.$$ Thus, the Markov chain is positive recurrent if $p\lt\frac34$ and transient if $p\gt\frac34$ (the boundary case $p=\frac34$, where the drift vanishes, requires a finer argument).
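The drift criterion can be sanity-checked by simulation. The following sketch (parameter choices and seed are purely illustrative) runs the chain on both sides of $p=\frac34$ and counts visits to state 0:

```python
import random

def simulate(p, steps, seed=0):
    """Run the chain for X=1, Y=1, Z=3; return (final state, visits to 0)."""
    rng = random.Random(seed)
    x, returns = 0, 0
    for _ in range(steps):
        u = rng.random()
        if u < p * p:
            x += 1                 # X = 1 symbol arrives
        elif u < p * p + 2 * p * (1 - p):
            x = max(x - 1, 0)      # Y = 1 symbol resolved
        else:
            x = max(x - 3, 0)      # Z = 3 symbols resolved (capped at 0)
        if x == 0:
            returns += 1
    return x, returns

x_rec, r_rec = simulate(p=0.5, steps=20000)  # drift 4p-3 = -1 < 0
x_tr,  r_tr  = simulate(p=0.9, steps=20000)  # drift 4p-3 = +0.6 > 0
print(x_rec, r_rec)  # stays near 0, visits 0 often
print(x_tr,  r_tr)   # wanders off, few visits to 0
```

For $p=0.5$ the trajectory keeps returning to 0, while for $p=0.9$ it escapes to infinity, consistent with the threshold $p=\frac34$.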