Markov Chain: Steady State Distribution.


A total of $M$ balls are divided between two urns A and B. A ball is chosen uniformly at random. If it is chosen from urn A then it is placed in urn B with probability $b$ and otherwise it is returned to urn A. Similarly, if the ball is chosen from urn B then it is placed in urn A with probability $a$ and otherwise it is returned to urn B. Let $X_n$ denote the number of balls in urn A after the $n$th trial. Determine the steady-state distribution by guess-and-verify.

Attempt:

The transition probabilities are (a ball drawn from B, probability $1-i/M$, moves to A with probability $a$; a ball drawn from A, probability $i/M$, moves to B with probability $b$):
$$P_{i,i+1}=\left(1-\frac{i}{M}\right)a,\qquad P_{i,i-1}=\frac{i}{M}\,b,\qquad P_{i,i}=1-P_{i,i-1}-P_{i,i+1}.$$
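Not part of the original question, but as a numerical sanity check one can build this transition matrix in plain Python (with illustrative values of $M$, $a$, $b$), power-iterate to the stationary distribution, and compare it against the guess below normalized by its own sum:

```python
from math import comb

# Illustrative parameters (any 0 < a, b < 1, with b != 1 - a, will do).
M, a, b = 6, 0.3, 0.5

# Transition matrix for X_n = number of balls in urn A:
#   draw from B (prob (M-i)/M), move to A w.p. a  -> state i+1
#   draw from A (prob i/M),     move to B w.p. b  -> state i-1
P = [[0.0] * (M + 1) for _ in range(M + 1)]
for i in range(M + 1):
    up = (1 - i / M) * a
    down = (i / M) * b
    if i < M:
        P[i][i + 1] = up
    if i > 0:
        P[i][i - 1] = down
    P[i][i] = 1 - up - down      # boundary terms vanish at i = 0 and i = M

# Power iteration: pi <- pi P, starting from the uniform distribution.
pi = [1 / (M + 1)] * (M + 1)
for _ in range(5000):
    pi = [sum(pi[i] * P[i][j] for i in range(M + 1)) for j in range(M + 1)]

# The guess C(M,i) a^i b^(M-i), normalized by its own sum.
guess = [comb(M, i) * a**i * b**(M - i) for i in range(M + 1)]
s = sum(guess)
guess = [g / s for g in guess]

print(max(abs(p - g) for p, g in zip(pi, guess)))  # tiny: guess matches
```

The chain has self-loops ($P_{i,i}>0$), so it is aperiodic and the power iteration converges to the unique stationary distribution.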

I have trouble guessing the steady-state distribution. My attempt: at steady state each ball should be in urn A or B independently of the others, so I guess $\Pr(X_n=i)=\binom{M}{i}a^i b^{M-i}$.

But the problem is that $b\neq 1-a$ in general, so this guess need not satisfy the requirement $\sum_{i}\pi_i=1$.

(Surprisingly, it does satisfy the detailed-balance condition $\pi_iP_{i,j}=\pi_jP_{j,i}$.)
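Both observations can be checked numerically (again with illustrative values of $M$, $a$, $b$, not from the original post). Note that by the binomial theorem $\sum_i \binom{M}{i}a^i b^{M-i} = (a+b)^M$, so the unnormalized guess sums to $(a+b)^M$ rather than $1$:

```python
from math import comb

M, a, b = 6, 0.3, 0.5  # illustrative values with b != 1 - a

# Unnormalized guess pi_i = C(M,i) a^i b^(M-i).
pi = [comb(M, i) * a**i * b**(M - i) for i in range(M + 1)]

# Detailed balance with the chain's transitions:
#   pi_i * P_{i,i+1} == pi_{i+1} * P_{i+1,i}
balanced = all(
    abs(pi[i] * (1 - i / M) * a - pi[i + 1] * ((i + 1) / M) * b) < 1e-12
    for i in range(M)
)
print(balanced)  # True

# Binomial theorem: the guess sums to (a+b)^M, not to 1.
print(abs(sum(pi) - (a + b) ** M) < 1e-12)  # True
```

Since detailed balance already holds term by term, any positive rescaling of the guess still satisfies it, which is why the normalization issue and the detailed-balance check can coexist.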