On the definition of Markov chains


A Markov chain with discrete time dependence and stationary transition probabilities is defined as follows. Let $S$ be a countable set, $p_{ij}$ be a nonnegative number for each $i,j\in S$ and assume that these numbers satisfy $\sum_{j\in S}p_{ij}=1$ for each $i$. A sequence $\{X_j\}$ of random variables with ranges in $S$ is defined to be a Markov chain if $$ P[X_{n+1}=j|X_0=i_0,\ldots,X_n=i_n]=p_{i_nj}\quad(1) $$ for every $n$ and every sequence $i_0,\ldots,i_n$ in $S$ for which $P[X_0=i_0,\ldots,X_n=i_n]>0$.

The property $$ P[X_{n+1}=j|X_n=i]=p_{ij}\quad(2) $$ is a consequence of the definition (1), but I don't think that (2) implies (1). So are there any interesting examples which satisfy (2) but are not Markov chains?

(Of course (1) determines $P[X_{n}=i_n,X_{n-1}=i_{n-1},\ldots,X_{n-k}=i_{n-k}]$ in terms of the transition probabilities and the initial probabilities for any $n$ and all $k\leq n$, whereas (2) does so for any $n$ and for $k=0,1$.)
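For concreteness, here is a minimal sketch of definition (1) in action; the two-state space and the particular matrix $(p_{ij})$ are toy choices of mine, not anything from the question. It simulates a chain with stationary transition probabilities and compares an empirical one-step frequency against the prescribed $p_{ij}$.

```python
import random

# Toy example (hypothetical values): a chain on S = {0, 1} with
# stationary transition probabilities p[i][j], where sum_j p[i][j] = 1.
p = {0: {0: 0.7, 1: 0.3},
     1: {0: 0.4, 1: 0.6}}

def step(i):
    """Draw the next state j with probability p[i][j]."""
    r, acc = random.random(), 0.0
    for j, q in p[i].items():
        acc += q
        if r < acc:
            return j
    return j  # guard against floating-point rounding

def simulate(n, x0=0):
    """Simulate X_0, X_1, ..., X_n starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1]))
    return xs

random.seed(0)
xs = simulate(100_000)
# Empirical estimate of P[X_{n+1} = 1 | X_n = 0]; by (2) this should be
# close to p[0][1].
from_0 = [xs[k + 1] for k in range(len(xs) - 1) if xs[k] == 0]
est = sum(1 for j in from_0 if j == 1) / len(from_0)
print(round(est, 3))
```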

2 Answers

BEST ANSWER

One can consider primitives of stochastic processes, in the following sense. Assume that $(U_n)$ is i.i.d. and define $(X_n)$ and $(S_n)$ recursively by $X_0=S_0=0$ and, for every $n\geqslant0$, $$X_{n+1}=X_n+U_{n+1},\qquad S_{n+1}=S_n+X_{n+1}. $$ Then the process $(U_n)$ is i.i.d. by definition (hence a Markov chain), the process $(X_n)$ is a (homogeneous) Markov chain with transition probabilities $p_{i,j}=P(U_1=j-i)$ for all states $i$ and $j$, but, in general, the process $(S_n)$ is not a Markov chain.
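A minimal sketch of this recursion (the function name and interface are mine): generate the increments $(U_n)$, the walk $(X_n)$, and its primitive $(S_n)$ side by side.

```python
import random

def primitive_process(n_steps, draw_u=lambda: random.choice([-1, 1])):
    """Build U_{1:n}, X_{0:n}, S_{0:n} from X_0 = S_0 = 0 using the
    recursion X_{n+1} = X_n + U_{n+1}, S_{n+1} = S_n + X_{n+1}."""
    us, xs, ss = [], [0], [0]
    for _ in range(n_steps):
        u = draw_u()
        us.append(u)
        xs.append(xs[-1] + u)       # X_{n+1} = X_n + U_{n+1}
        ss.append(ss[-1] + xs[-1])  # S_{n+1} = S_n + X_{n+1}
    return us, xs, ss
```

Feeding in a fixed sequence of increments recovers paths by hand: $U_{1:4}=(1,-1,-1,1)$ gives $X_{0:4}=(0,1,0,-1,0)$ and $S_{0:4}=(0,1,1,0,0)$.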

For example, assume that every $U_n$ is uniform on $\{-1,1\}$; then $(X_n)$ is the simple random walk on $\mathbb Z$ with transitions $p_{i,i+1}=p_{i,i-1}=\frac12$ for every $i$ in $\mathbb Z$. Using the notation $\xi_{i:j}$ for $(\xi_i,\xi_{i+1},\ldots,\xi_j)$ for various symbols $\xi$ and integers $i\leqslant j$, one can check that $$P(S_{1:4}=(1,1,0,2))=P(U_{1:4}=(1,-1,-1,3))=0,$$ while $$P(S_{1:4}=(-1,-1,0,2))=P(U_{1:4}=(-1,1,1,1))\ne0,$$ hence the distribution of $S_4$ conditionally on $S_{1:3}$ does not depend only on $S_3$, that is, $(S_n)$ is not a Markov chain.
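These two probabilities can be confirmed by brute force: the path $S_{1:4}$ determines and is determined by $(U_1,\ldots,U_4)$, so it suffices to enumerate all $2^4$ increment sequences, each of probability $1/16$. A sketch:

```python
from itertools import product
from collections import Counter

def s_path(us):
    """Return S_{1:len(us)} for the increment sequence us = (U_1, ...)."""
    x = s = 0
    path = []
    for u in us:
        x += u   # X_{n+1} = X_n + U_{n+1}
        s += x   # S_{n+1} = S_n + X_{n+1}
        path.append(s)
    return tuple(path)

# All 2^4 equally likely increment sequences in {-1, 1}^4.
counts = Counter(s_path(us) for us in product([-1, 1], repeat=4))

print(counts[(1, 1, 0, 2)])    # 0: this path is impossible
print(counts[(-1, -1, 0, 2)])  # 1: reached only by U_{1:4} = (-1, 1, 1, 1)
```

Both paths end with $(S_3,S_4)=(0,2)$, yet only one of them has positive probability, which is exactly the failure of the Markov property.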

However, the process $(S_n)$ is "Markov of order $2$", in the sense that the distribution of $S_{n+2}$ conditionally on $S_{1:n+1}$ depends only on $(S_{n},S_{n+1})$. Specifically, for all integers $(i,j,k)$, $$P(S_{n+2}=k\mid S_{1:n-1},S_n=i,S_{n+1}=j)=P(U_1=k-2j+i).$$ Iterating the primitive construction described above produces "Markov processes of higher orders".
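The displayed formula amounts to the second-order recursion $S_{n+2}=2S_{n+1}-S_n+U_{n+2}$, since $X_{n+2}=X_{n+1}+U_{n+2}=(S_{n+1}-S_n)+U_{n+2}$. A quick exhaustive check of this identity on every short path:

```python
from itertools import product

def s_path(us):
    """Return [S_0, S_1, ..., S_n] for the increments us = (U_1, ..., U_n)."""
    x, s, path = 0, 0, [0]
    for u in us:
        x += u
        s += x
        path.append(s)
    return path

# Verify S_{n+2} = 2*S_{n+1} - S_n + U_{n+2} on all 32 paths of length 5.
ok = all(
    s_path(us)[n + 2] == 2 * s_path(us)[n + 1] - s_path(us)[n] + us[n + 1]
    for us in product([-1, 1], repeat=5)
    for n in range(4)   # us[n + 1] is U_{n+2} (0-based list, 1-based U's)
)
print(ok)  # True
```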

To get stationary processes, that is, processes $(S_n)$ whose transitions do not depend on time, consider again $(U_n)$ i.i.d. uniform on $\{-1,1\}$ but define processes $(X_n)$ and $(S_n)$ on the state space $\mathbb Z/N\mathbb Z$ for some $N\geqslant3$, starting from $X_0$ and $S_0$ independent and uniform on $\mathbb Z/N\mathbb Z$, by $$X_{n+1}=X_n+U_{n+1}\pmod{N},\qquad S_{n+1}=S_n+X_{n+1}\pmod{N}. $$ Then the pair $(X_n,S_n)$ is uniform on $(\mathbb Z/N\mathbb Z)^2$ for every $n$ (for each fixed value of $U_{n+1}$ the transition map is a bijection, so uniformity propagates by induction); in particular every $X_n$ and every $S_n$ is uniform on $\mathbb Z/N\mathbb Z$ and $X_{n+1}$ is independent of $S_n$. The process $(S_n)$ is nevertheless not a Markov chain since, for all $(i,j)$ in $\mathbb Z/N\mathbb Z$, $$ P(S_{n+1}=j\mid S_n=i)=\frac1N,\qquad P(S_{n+1}=j\mid S_{n-1}=2i-j,S_n=i)=0.$$ (Indeed $S_{n-1}=2i-j$ and $S_n=i$ force $X_n=j-i$, hence $S_{n+1}=i+(j-i)\pm1=j\pm1\ne j$.)
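This can be checked exhaustively for a small modulus. The sketch below (modulus $N=5$ and three steps are my choices) initializes $X_0$ and $S_0$ as independent uniform draws, which is the initialization under which the pair $(X_n,S_n)$ stays uniform on the product, and enumerates every equally likely outcome.

```python
from itertools import product
from collections import Counter

# Exhaustive check on Z/NZ: X_0, S_0 independent uniform on Z/NZ,
# U_1, U_2, U_3 i.i.d. uniform on {-1, 1} -- N*N*2^3 equally likely atoms.
N = 5
paths = []
for x0, s0 in product(range(N), repeat=2):
    for us in product([-1, 1], repeat=3):
        x, s = x0, s0
        ss = [s0]
        for u in us:
            x = (x + u) % N   # X_{n+1} = X_n + U_{n+1}  (mod N)
            s = (s + x) % N   # S_{n+1} = S_n + X_{n+1}  (mod N)
            ss.append(s)
        paths.append(ss)      # ss = (S_0, S_1, S_2, S_3)

# One-step transitions are uniform: P(S_3 = j | S_2 = i) = 1/N for all i, j.
pair = Counter((p[2], p[3]) for p in paths)
marg = Counter(p[2] for p in paths)
print(all(N * pair[(i, j)] == marg[i] for i in range(N) for j in range(N)))

# ...yet no path ever repeats an increment, matching
# P(S_{n+1} = j | S_{n-1} = 2i - j, S_n = i) = 0.
print(any((p[2] - p[1]) % N == (p[3] - p[2]) % N for p in paths))
```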


Equation (2) is satisfied by any process whose one-step transition probabilities are time-homogeneous: just define $p_{ij}=P[X_{n+1}=j\mid X_n=i]$. The point of a Markov chain is that in the equation $$ P[X_{n+1}=j|X_0=i_0,\ldots,X_n=i_n]=p_{i_nj}, $$ the probability depends only on the most recent state: $$ P[X_{n+1}=j|X_0=i_0,\ldots,X_n=i_n]=P[X_{n+1}=j|X_n=i_n]. $$