Definition of ergodicity for a Markov process


According to [1]:

A Markov process is ergodic if there exists a unique invariant probability distribution and, for any state $x$, the transition probabilities $P(X_t \in \cdot \mid X_0 = x)$ converge to that distribution in total variation.

Does this notion of ergodicity have the same implication as normal ergodicity (i.e. time and ensemble statistics converge as $t \to \infty$)? Or is this notion of ergodicity something completely different?

  1. Borovkov, K., and A. Novikov. "On a piece-wise deterministic Markov process model." Statistics & Probability Letters 53.4 (2001): 421-428.
Best answer

It's the same. Specifically, it is equivalent to say that time averages of suitable test functions converge to ensemble averages of those test functions. Note that in the probability context, "ensemble average" means expectation, rather than integration over phase space as in the dynamical systems context.
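This equivalence can be checked numerically. The sketch below (an illustrative example, not taken from the cited paper) simulates a two-state Markov chain whose transition probabilities I chose arbitrarily, and compares the time average of a test function $f$ along one long trajectory with its ensemble average, i.e. its expectation under the stationary distribution:

```python
import random

random.seed(0)

# Two-state chain on {0, 1} with assumed transition probabilities:
#   P(0 -> 1) = 0.3,  P(1 -> 0) = 0.2
p01, p10 = 0.3, 0.2

# The stationary distribution pi solves pi = pi P; for a two-state
# chain this gives pi(0) = p10/(p01+p10), pi(1) = p01/(p01+p10).
pi0 = p10 / (p01 + p10)  # 0.4
pi1 = p01 / (p01 + p10)  # 0.6

def f(x):
    """Test function: indicator of state 1."""
    return 1.0 if x == 1 else 0.0

# Ensemble average = expectation of f under pi
ensemble_avg = pi0 * f(0) + pi1 * f(1)

# Time average of f along a single long trajectory started at x = 0
x, total, n = 0, 0.0, 200_000
for _ in range(n):
    total += f(x)
    if x == 0:
        x = 1 if random.random() < p01 else 0
    else:
        x = 0 if random.random() < p10 else 1
time_avg = total / n

print(ensemble_avg, time_avg)
```

Since the chain is irreducible and aperiodic, the time average converges to the ensemble average ($0.6$ here) regardless of the initial state, which is exactly the ergodic-theorem statement in the answer above.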