Markov chain, confused items in the definition


I do not understand this sentence about Markov chains (search for the definitions of the notions as given at the end of the question):

In either case (homogeneous or non-homogeneous) the stochastic process may, or may not, be stationary. If it is stationary, its evolution may change over time (non-homogeneous) but this evolution will be the same irrespective of when the process was initiated.

How is it possible that "its evolution may change over time" and yet "this evolution will be the same"? How can both conditions hold at the same time?

I.e., what is an example of a Markov process such that for all $\alpha$, all $n$, and all $t_i$ and $x_i$ with $i = 1, 2, \ldots, n$: $$\mathrm{Prob}\{X(t_1) \le x_1, X(t_2) \le x_2, \ldots, X(t_n) \le x_n\} =$$ $$\mathrm{Prob}\{X(t_1 + \alpha) \le x_1, X(t_2 + \alpha) \le x_2, \ldots, X(t_n + \alpha) \le x_n\}?$$

To find the source (and the page number), type this whole sentence, ending with the page number 194:

"If it is stationary, its evolution may change over time (nonhomogeneous) but this evolution will be the same irrespective of when the process was initiated" 194

into Google.


1 Answer


It is a somewhat confusing passage. The general intent, I think, is this: the dynamic rules of the system (e.g., the transition rates) may change over time, i.e., the chain may be non-homogeneous, but if the system happens to be in its stationary distribution at the present time, it does not matter whether it was started just now or at some time in the distant past. The equilibrium solution will progress the same way in either case.
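For concreteness, here is a minimal numerical sketch of the simpler homogeneous case (the two-state chain and its numbers are my own illustrative assumption, not from the book): a chain started in its stationary distribution $\pi$ has the same marginal and joint distributions no matter how long ago it was started, which is exactly the shift-invariance by $\alpha$ asked about in the question.

```python
# Toy two-state homogeneous chain (illustrative numbers, not from the book).
P = [[0.9, 0.1],
     [0.2, 0.8]]

def step(dist):
    """Advance the marginal distribution one step: dist @ P."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

pi = [2/3, 1/3]          # solves pi @ P = pi, the stationary distribution
dist = pi

for t in range(50):      # "start in the distant past": evolve 50 steps
    dist = step(dist)

# The marginal law at every time is still pi ...
assert all(abs(d - p) < 1e-9 for d, p in zip(dist, pi))

# ... and the joint law of (X(t), X(t+1)) is pi[i] * P[i][j] at every t,
# so the finite-dimensional distributions are invariant to the shift alpha.
joint_now = [[pi[i] * P[i][j] for j in range(2)] for i in range(2)]
joint_later = [[dist[i] * P[i][j] for j in range(2)] for i in range(2)]
assert all(abs(joint_now[i][j] - joint_later[i][j]) < 1e-9
           for i in range(2) for j in range(2))
print("same distributions whether started now or 50 steps ago")
```

The same invariance argument is what the quoted passage is gesturing at in the non-homogeneous case: once the process sits in its equilibrium solution, shifting the start time by $\alpha$ changes nothing.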