Upper Bound of Markov Chain Convergence?


While reading about Markov chain Monte Carlo in this probability book (DeGroot), I came across the following passage:

In general, the distribution will get pretty close to the stationary distribution in finite time, but how do we tell, in a particular application, if we have sampled the Markov chain long enough to be confident that we are sampling from something close to the stationary distribution? Much work has been done to address this question, but there is no foolproof method.

Assuming all the transition probabilities are encoded in a matrix $P$, it seems possible to check whether the stationary distribution has been reached by solving $P\mathbf{x} = \mathbf{x}$, but then there is the question of whether doing so is tractable.
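For a chain small enough to write down, solving $P\mathbf{x} = \mathbf{x}$ really is direct. A minimal sketch, assuming the question's convention that $P$ is column-stochastic (columns sum to 1) and $\mathbf{x}$ is a probability column vector; the helper name and the example matrix are illustrative:

```python
# Sketch: solving P x = x directly for a tiny chain. Assumes P is
# column-stochastic, so P[i][j] is the probability of moving j -> i.
# For a 2-state chain the fixed point has a closed form.

def stationary_2state(P):
    """Return the stationary distribution of a 2x2 column-stochastic P.

    P x = x together with x1 + x2 = 1 reduces to balancing the
    probability flow between the two states:
    P[0][1] * x2 = P[1][0] * x1.
    """
    into0, outof0 = P[0][1], P[1][0]
    x1 = into0 / (into0 + outof0)
    return [x1, 1.0 - x1]

# Hypothetical example: stay in state 0 with prob 0.9, in state 1 with 0.8.
P = [[0.9, 0.2],
     [0.1, 0.8]]
x = stationary_2state(P)
print(x)  # -> approximately [2/3, 1/3]
```

For an $n$-state chain the same idea is an $n \times n$ linear solve (or an eigenvector computation), which is roughly $O(n^3)$ with dense Gaussian elimination; tractability hinges entirely on how large $n$ is.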

What is the upper bound on either the

  • iterations, or
  • computational complexity

to determine that a Markov chain, encoded in a transition matrix, has converged to a steady-state (stationary) probability distribution?
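One common practical (not foolproof) stopping rule is to iterate $\mathbf{x} \leftarrow P\mathbf{x}$ and stop once successive distributions differ by less than a tolerance in total variation; for a reversible ergodic chain the number of iterations needed scales like $1/(1-|\lambda_2|)$, where $\lambda_2$ is the second-largest eigenvalue of $P$ in absolute value. A sketch under the same column-stochastic convention; the function name, tolerance, and example matrix are illustrative:

```python
# Sketch: power-iterate x <- P x and count steps until the
# total-variation change between successive distributions is tiny.
# Assumes P is column-stochastic; tolerance is illustrative.

def iterate_to_stationarity(P, x, tol=1e-10, max_iter=100_000):
    """Return (x, steps) once successive iterates agree to within tol in TV."""
    n = len(P)
    for step in range(1, max_iter + 1):
        new = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
        tv = 0.5 * sum(abs(new[i] - x[i]) for i in range(n))
        x = new
        if tv < tol:
            return x, step
    raise RuntimeError("no convergence within max_iter steps")

P = [[0.9, 0.2],
     [0.1, 0.8]]
x, steps = iterate_to_stationarity(P, [1.0, 0.0])
print(x, steps)  # x near [2/3, 1/3]; steps grows like log(1/tol)/log(1/0.7)
```

Each iteration costs $O(n^2)$ for a dense matrix, and the error shrinks geometrically at rate $|\lambda_2|$ (here $\lambda_2 = 0.7$), so the step count is $O\!\big(\log(1/\mathrm{tol}) / \log(1/|\lambda_2|)\big)$; a small spectral gap is exactly what makes slow-mixing chains expensive.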

1 Answer
it looks possible to know that the stationary distribution has been reached by solving $P\mathbf{x} = \mathbf{x}$, but then there is the question of whether doing so is tractable.

No. The main problem is that one does not know $P$ explicitly, so "solving $P\mathbf{x} = \mathbf{x}$" is irrelevant.
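To make the point concrete: in a typical MCMC application one only has a rule for simulating one transition, and the matrix $P$ is never formed (the state space is usually far too large to enumerate anyway). A minimal Metropolis sketch, where the target weights, proposal, and state space $\{0,\dots,99\}$ are all hypothetical:

```python
# Sketch: an MCMC sampler defines P only implicitly, via a step rule.
# Hypothetical target: unnormalized weights 1/(1+i) on {0, ..., 99}.
import random

def target_weight(i):
    return 1.0 / (1 + i)  # unnormalized; the normalizing constant is unknown

def metropolis_step(i, rng):
    """One Metropolis step with a +/-1 (wrap-around) proposal.

    This function *is* a row of P in disguise, but the matrix itself
    is never written down, so P x = x cannot be solved directly.
    """
    j = (i + rng.choice([-1, 1])) % 100
    if rng.random() < min(1.0, target_weight(j) / target_weight(i)):
        return j
    return i

rng = random.Random(0)
state = 0
for _ in range(10_000):
    state = metropolis_step(state, rng)
print(state)  # one sample, approximately from the target after many steps
```

Convergence diagnostics for such chains (multiple chains, trace inspection, and similar heuristics) are the "much work" the quoted passage alludes to, and none of them certify stationarity the way solving $P\mathbf{x} = \mathbf{x}$ would.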