Proving a limit of a Markov process


Let $\{X_{t}\}$ be a continuous-time Markov process with state space $S = \{1, 2\}$. Suppose that $1 - p_{ii}(t) = q_{i}\,t + o(t)$ as $t \rightarrow 0$ for each $i \in S$, where $0 < q_{i} < \infty$.

$1)$ Prove that $\{X_{t}\}$ has a unique stationary distribution and find it.

$2)$ Let $(r_{1}(0), r_{2}(0))$ be the initial distribution of the process, $r_{i}(0) = P\{X_{0} = i\}$, $i \in S$, and let $r_{i}(t) = P\{X_{t} = i\}$ for $t \geq 0$. Prove that $$\lim_{t\to\infty} r_{j}(t) = \overline{r}_{j}, \qquad j \in S,$$ where $(\overline{r}_{1}, \overline{r}_{2})$ is the stationary distribution from part $(1)$.

I think I am doing $(1)$ correctly: I set up the stationarity equations $\overline{r}\,Q = 0$ together with the normalization $\overline{r}_{1} + \overline{r}_{2} = 1$, which is a linear system in two unknowns with a unique solution. So we can conclude the process has a unique stationary distribution.
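For what it's worth, here is a quick numerical sanity check of part $(1)$. It assumes the generator is $Q = \begin{pmatrix} -q_1 & q_1 \\ q_2 & -q_2 \end{pmatrix}$ (which follows from the rate condition above), and uses illustrative values $q_1 = 2$, $q_2 = 3$ that are not part of the problem:

```python
# Sanity check: the balance equation r1_bar * q1 = r2_bar * q2 plus
# normalization gives the stationary distribution of a two-state chain.
# The rates below are illustrative values, not from the problem statement.
q1, q2 = 2.0, 3.0

r1_bar = q2 / (q1 + q2)
r2_bar = q1 / (q1 + q2)

# Verify that (r1_bar, r2_bar) annihilates the generator
# Q = [[-q1, q1], [q2, -q2]], i.e. r_bar @ Q = 0:
row = (r1_bar * (-q1) + r2_bar * q2,
       r1_bar * q1 + r2_bar * (-q2))
assert abs(row[0]) < 1e-12 and abs(row[1]) < 1e-12

print(r1_bar, r2_bar)  # 0.6 0.4
```

So with these rates the unique stationary distribution is $\overline{r} = \left(\frac{q_2}{q_1+q_2}, \frac{q_1}{q_1+q_2}\right)$.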

I have no idea how to approach $(2)$, though. Can someone please help me?