Markov chains steady-state distribution


Ok so we are given a Markov chain $X_n$ with transition matrix $P=(P_{ij})$ and a steady-state distribution $(\pi_1,\pi_2,\pi_3,\ldots,\pi_n)$ of the chain. We are asked to prove that for every $i$:

$\sum_{j\neq i} \pi_i P_{ij} = \sum_{j\neq i} \pi_j P_{ji}$

Can somebody explain to me what this means, and why it is (obviously?) true for every steady-state distribution? Plus, how do we prove it?


BEST ANSWER

I think you need the condition that the Markov Chain is reversible.


Def: Let $X$ be an irreducible Markov chain such that $X_n$ has the stationary distribution $\pi$ for all $n$. The chain is called reversible if the transition matrices of $X$ and its time-reversal $Y$ are the same, which is to say that $$\pi_i P_{ij} = \pi_j P_{ji} \quad \forall i,j$$

Thm: Let $P$ be the transition matrix of an irreducible chain $X$, and suppose that there exists a distribution $\pi$ such that $$\pi_i P_{ij} = \pi_j P_{ji} \quad \forall i,j$$ Then $\pi$ is a stationary distribution of the chain. Furthermore, $X$ is reversible in equilibrium.


Suppose that $\pi$ satisfies these conditions; then

$$\sum_{i} \pi_i P_{ij} = \sum_{i} \pi_j P_{ji} = \pi_j \sum_{i} P_{ji} = \pi_j$$
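The theorem can be checked numerically. Below is a minimal sketch (my own illustration, not from the answer) on a hypothetical 3-state birth-death chain: solving the detailed-balance relations along the chain produces a distribution $\pi$ that is then verified to be stationary.

```python
# Hypothetical 3-state birth-death chain; rows sum to 1 (row-stochastic).
P = [
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
]

# Solve the detailed-balance relations pi_i P_ij = pi_j P_ji along the
# chain, then normalize so the entries sum to 1.
pi = [1.0, 0.0, 0.0]
pi[1] = pi[0] * P[0][1] / P[1][0]
pi[2] = pi[1] * P[1][2] / P[2][1]
total = sum(pi)
pi = [x / total for x in pi]  # gives pi = [0.25, 0.5, 0.25]

# Detailed balance holds for every pair (i, j) ...
for i in range(3):
    for j in range(3):
        assert abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < 1e-12

# ... and therefore pi is stationary: (pi P)_j = pi_j for every j,
# which is exactly the displayed computation above.
for j in range(3):
    assert abs(sum(pi[i] * P[i][j] for i in range(3)) - pi[j]) < 1e-12
```

Birth-death chains (transitions only between neighboring states) always admit such a detailed-balance solution, which is why one is a convenient test case here.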


The other answer is incorrect in saying that you need detailed balance. Intuitively, the equation in the OP says that, under the stationary distribution, the probability flowing into state $i$ equals the probability flowing out of state $i$. Indeed, the difference between the two sides is
$$\sum_{j\neq i}\pi_j P_{ji} - \sum_{j\neq i}\pi_i P_{ij} = \big((\pi P)_i - \pi_i P_{ii}\big) - \pi_i(1 - P_{ii}) = (\pi P)_i - \pi_i,$$
which is zero by the definition of $\pi$. (In my notation, $P$ is the transition probability matrix and is taken to be row-stochastic, so $\sum_j P_{ij}=1$ for every $i$ and $\pi$ is a row vector.)
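To see concretely that detailed balance is not needed, here is a sketch (my own example, not from the answer) using a hypothetical doubly stochastic 3-state chain whose stationary distribution is uniform but in which probability circulates $0 \to 1 \to 2 \to 0$: detailed balance fails, yet the in-flow/out-flow identity from the question holds at every state.

```python
# Hypothetical doubly stochastic chain (rows AND columns sum to 1),
# so the uniform distribution is stationary; the chain is a biased cycle.
P = [
    [0.1, 0.6, 0.3],
    [0.3, 0.1, 0.6],
    [0.6, 0.3, 0.1],
]
pi = [1 / 3, 1 / 3, 1 / 3]

# pi is stationary: (pi P)_j = pi_j for every j.
for j in range(3):
    assert abs(sum(pi[i] * P[i][j] for i in range(3)) - pi[j]) < 1e-12

# Detailed balance FAILS: pi_0 P_01 = 0.2 but pi_1 P_10 = 0.1,
# so the chain is not reversible.
assert abs(pi[0] * P[0][1] - pi[1] * P[1][0]) > 0.05

# ... yet total flow in equals total flow out at every state i,
# which is the identity asked about in the question.
for i in range(3):
    flow_in = sum(pi[j] * P[j][i] for j in range(3) if j != i)
    flow_out = sum(pi[i] * P[i][j] for j in range(3) if j != i)
    assert abs(flow_in - flow_out) < 1e-12
```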