Partition Theorem and Markov Chains


Suppose a Markov chain has $s$ states, $S = \{1, 2, \ldots, s\}$, with probability transition matrix (PTM) $P = (p_{ij})$. That is, $p_{ij} = P[X_{n+1} = j \mid X_n = i]$. Use the Partition Theorem to verify that if $X_n \sim \nu$, then $X_{n+1} \sim \nu P$. Here $\nu = (\nu_1, \ldots, \nu_s)$ is a probability distribution on $S$.
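To see concretely what the claim $X_{n+1} \sim \nu P$ means, here is a minimal numerical sketch with a hypothetical 2-state chain (the matrix and distribution below are made-up numbers, chosen only for illustration). Partitioning on the value of $X_n$ gives $P[X_{n+1} = j] = \sum_i \nu_i \, p_{ij}$, which is exactly the vector-matrix product $\nu P$:

```python
import numpy as np

# Hypothetical 2-state PTM: row i holds P[X_{n+1} = j | X_n = i],
# so each row sums to 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Hypothetical distribution of X_n.
nu = np.array([0.2, 0.8])

# Partition Theorem, partitioning on the value of X_n:
# P[X_{n+1} = j] = sum_i P[X_n = i] * P[X_{n+1} = j | X_n = i]
#                = sum_i nu_i * p_{ij},
# i.e. the j-th entry of the row vector nu @ P.
dist_next = nu @ P
print(dist_next)  # distribution of X_{n+1}; still sums to 1
```

This only illustrates the statement on one example; the exercise itself asks for the general verification via the Partition Theorem.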

What is this question asking me to do?