Consider a two-dimensional Markov chain; let's call the first dimension the "Level" and the second dimension the "Phase". The state space consists of the pairs $(\ell, p)$ with $\ell \geq 0$ and $0 \leq p \leq h$. The chain moves in the feasible directions with the rates shown in the figure below. Note that $\lambda_1$, $\lambda_2$, and $s_1$ do not depend on the level, whereas $s_2$ increases as $\ell$ increases. Let's call this System 1.
Assume we truncate System 1 at Level $\ell'$ and create System 2. That is, the state space of System 2 is $(\ell, p)$ where $ 0 \leq \ell \leq \ell'$ and $0 \leq p \leq h$.
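Since the figure with the feasible directions is not reproduced here, the sketch below assumes one concrete structure consistent with the description: $\lambda_2$ raises the level, $s_2(\ell) = \ell \mu_2$ lowers it (so it grows with $\ell$), and $\lambda_1$ and $s_1$ move the phase up and down within $0 \leq p \leq h$; upward transitions at the top level are simply dropped in the truncated system. The function name `build_generator`, the parameter `mu2`, and all numeric values are illustrative choices of mine, not part of the original model.

```python
import numpy as np

def build_generator(l_max, h, lam1, lam2, s1, mu2):
    """Generator of the chain truncated at level l_max, states (l, p) with 0 <= p <= h."""
    n = (l_max + 1) * (h + 1)
    idx = lambda l, p: l * (h + 1) + p
    Q = np.zeros((n, n))
    for l in range(l_max + 1):
        for p in range(h + 1):
            i = idx(l, p)
            if l < l_max:                  # level up at rate lambda_2 (dropped at the boundary)
                Q[i, idx(l + 1, p)] += lam2
            if l > 0:                      # level down at rate s_2(l) = l * mu2, increasing in l
                Q[i, idx(l - 1, p)] += l * mu2
            if p < h:                      # phase up at rate lambda_1
                Q[i, idx(l, p + 1)] += lam1
            if p > 0:                      # phase down at rate s_1
                Q[i, idx(l, p - 1)] += s1
            Q[i, i] = -Q[i].sum()          # diagonal = minus the total outgoing rate
    return Q

# System 2 with l' = 5; a much larger truncation (l_max = 200) stands in for System 1.
Q_sys2 = build_generator(l_max=5,   h=3, lam1=1.0, lam2=0.8, s1=1.2, mu2=0.5)
Q_sys1 = build_generator(l_max=200, h=3, lam1=1.0, lam2=0.8, s1=1.2, mu2=0.5)
```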
For $a' \geq a$, define $\Pr((a,b) \rightarrow (a',b'))$ as the probability that, starting from $(a,b)$, the system visits $(a',b')$ without ever visiting any state at level $a-1$ or below.
Can we claim that $\Pr((a,b) \rightarrow (a',b'))$ in System 2 is greater than or equal to the corresponding probability in System 1, for any pair of states $(a,b)$ and $(a',b')$ with $a' \leq \ell'$ (so that both states exist in System 2)?
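As a numerical sanity check (not a proof), the sketch below continues the code above and reuses `Q_sys1`, `Q_sys2`, and the same state ordering. The hypothetical helper `taboo_visit_prob` computes $\Pr((a,b) \rightarrow (a',b'))$ by treating $(a',b')$ and every state at level $a-1$ or below as absorbing and solving the resulting first-step (linear) equations; the truncation at level 200 only stands in for the infinite System 1, and the chosen states and rates are arbitrary examples.

```python
import numpy as np

def taboo_visit_prob(Q, h, a, b, a2, b2):
    """P(visit (a2, b2) without ever entering a level <= a-1 | start in (a, b))."""
    idx = lambda l, p: l * (h + 1) + p             # same state ordering as build_generator
    if (a, b) == (a2, b2):
        return 1.0
    target = idx(a2, b2)
    absorbing = {target} | {idx(l, p) for l in range(a) for p in range(h + 1)}
    transient = [i for i in range(Q.shape[0]) if i not in absorbing]
    A = Q[np.ix_(transient, transient)]            # generator restricted to transient states
    rhs = -Q[np.ix_(transient, [target])].ravel()  # one-step rates into the target state
    probs = np.linalg.solve(A, rhs)                # first-step (absorption) equations
    return probs[transient.index(idx(a, b))]

# Example: (a, b) = (1, 1), (a', b') = (4, 2), with l' = 5 for System 2.
p_sys2 = taboo_visit_prob(Q_sys2, h=3, a=1, b=1, a2=4, b2=2)
p_sys1 = taboo_visit_prob(Q_sys1, h=3, a=1, b=1, a2=4, b2=2)
print(p_sys2, p_sys1, p_sys2 >= p_sys1)
```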
