Convergence time of a Markov chain


We know that the powers of a regular Markov chain's transition matrix converge to a unique limiting matrix. The convergence time may be finite or infinite; my interest is in the case where it is finite. How can we accurately determine this time, in other words the number of transitions needed for convergence? I would like to read up on the different methods that may exist for calculating this. If anyone could suggest a good reference to start with, I would be delighted. Any assistance is welcome.
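One concrete way to approach this numerically is to raise the transition matrix to successive powers and stop when all rows agree to within a tolerance (when the rows of $P^n$ coincide, each row is the stationary distribution). The matrix `P` and the function name below are my own illustrative choices, not anything from the question:

```python
import numpy as np

# A small regular (irreducible, aperiodic) transition matrix,
# chosen arbitrarily for illustration.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def steps_to_converge(P, tol=1e-9, max_steps=10_000):
    """Return the smallest n such that the rows of P^n agree within tol.

    For a regular chain the rows of P^n approach a common row, the
    stationary distribution; this measures how many steps that takes
    numerically, for a given tolerance.
    """
    Pn = P.copy()
    for n in range(1, max_steps + 1):
        # Largest column-wise spread between rows of P^n.
        if np.max(Pn.max(axis=0) - Pn.min(axis=0)) < tol:
            return n
        Pn = Pn @ P
    return None  # did not converge within max_steps

n = steps_to_converge(P)
print(n)
```

Note that the answer depends on the tolerance: as the answers below point out, convergence is only exact in the limit, so "the" convergence time is really a function of how close you demand to be.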


Accepted answer:

Refer to the following thesis; it will help. What you are looking for can be found from page 16 onwards: http://www.math.uni.wroc.pl/~lorek/files/lorek_thesis.pdf

Answer:

I'm not sure what you mean by distinguishing finite and infinite convergence time: no interesting chain converges exactly in finitely many steps, and demanding exact agreement with the limit at any finite time would essentially mean no chain converges at all. What you can quantify is how close the chain is to its limit after a given number of steps.

Anyway, the question you are asking is essentially about how fast powers of a stochastic matrix converge.

I found this lecture, where exponential convergence is proved: http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-262-discrete-stochastic-processes-spring-2011/video-lectures/lecture-7-finite-state-markov-chains-the-matrix-approach/
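The exponential rate in question is governed by the second-largest eigenvalue modulus of the transition matrix: the distance of $P^n$ from its limit shrinks roughly like $|\lambda_2|^n$, so the number of steps to reach accuracy $\varepsilon$ scales like $\log\varepsilon / \log|\lambda_2|$. A minimal sketch of that estimate, using an illustrative matrix of my own choosing (not one from the lecture):

```python
import numpy as np

# Illustrative regular transition matrix (an assumption for this sketch).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# A stochastic matrix always has eigenvalue 1; for a regular chain all
# other eigenvalues have modulus < 1, and the second-largest modulus
# sets the geometric rate of convergence of P^n.
eigvals = np.linalg.eigvals(P)
slem = sorted(np.abs(eigvals))[-2]  # second-largest eigenvalue modulus

# Rough step count to shrink the distance to the limit by a factor eps.
eps = 1e-6
n_est = int(np.ceil(np.log(eps) / np.log(slem)))
print(slem, n_est)
```

This gives only an order-of-magnitude estimate (constants and non-diagonalizable cases are ignored); the thesis linked in the accepted answer treats the sharper bounds.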