Time between Visits


This is a problem from [1] that I am struggling quite a bit with:

Given an irreducible and positive recurrent DTMC, let $m_{ij}$ denote the mean number of steps to get from state $i$ to state $j$. Either prove or disprove the following: $$m_{jj} \leq m_{ji} + m_{ij}.$$

What I tried so far:

I proved that for any irreducible DTMC, the probability of ever hitting state $j$ starting from state $i$ satisfies $f_{ij}=1$. The hitting time is therefore a proper random variable and $m_{ij} = \mathbb{E}[\text{hitting time of } j \text{ from } i]$ (we can probably apply some renewal-theory result to the $i$-$j$ cycle). I also proved that the inequality holds for a two-state DTMC (the Gilbert-Elliott model), but I could neither prove it nor find a counterexample in the general case.
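As a numerical sanity check (not a proof), the mean first-passage times solve the linear system $m_{ij} = 1 + \sum_{k\neq j} p_{ik}\, m_{kj}$, so the conjecture can be tested on concrete chains. A minimal sketch in Python/NumPy; `mean_hitting_times` is my own helper name, not from the book:

```python
import numpy as np

def mean_hitting_times(P):
    """Given a row-stochastic matrix P of an irreducible DTMC, return M with
    M[i, j] = m_ij (mean first-passage time from i to j, i != j) and
    M[j, j] = m_jj (mean return time to j), via m_ij = 1 + sum_{k != j} P[i,k] m_kj."""
    n = P.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        idx = [i for i in range(n) if i != j]    # states other than the target j
        Q = P[np.ix_(idx, idx)]                  # transitions that avoid j
        m = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
        M[idx, j] = m
        M[j, j] = 1.0 + P[j, idx] @ m            # first-step analysis from j itself
    return M

# Two-state Gilbert-Elliott chain: m_01 = 1/p, m_10 = 1/q, m_00 = (p+q)/q.
p, q = 0.3, 0.5
M = mean_hitting_times(np.array([[1 - p, p], [q, 1 - q]]))
print(M[0, 1], M[1, 0], M[0, 0])

# Spot-check m_jj <= m_ji + m_ij on a random irreducible chain.
rng = np.random.default_rng(0)
A = rng.random((5, 5)) + 0.01                    # strictly positive => irreducible
P5 = A / A.sum(axis=1, keepdims=True)
M5 = mean_hitting_times(P5)
ok = all(M5[j, j] <= M5[j, i] + M5[i, j] + 1e-9
         for i in range(5) for j in range(5) if i != j)
print(ok)
```

Random spot checks like this can only rule the inequality in, never prove it, but a single violating chain would settle the "disprove" direction.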

I also know that

$$m_{ii} \leq 1+(1-p_{ii})m_{ji} \leq m_{ij} + m_{ji}$$

for $$j=\arg\max_{k\neq i} m_{ki},$$

but again, nothing for a general $j$.
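For the record, that chain of inequalities follows from one-step (first-transition) analysis at $i$, assuming $j$ attains the maximum:

```latex
m_{ii} \;=\; 1 + \sum_{k\neq i} p_{ik}\, m_{ki}
       \;\le\; 1 + (1-p_{ii})\max_{k\neq i} m_{ki}
       \;=\; 1 + (1-p_{ii})\, m_{ji}
       \;\le\; m_{ij} + m_{ji},
```

where the last step uses $m_{ij}\ge 1$ and $(1-p_{ii})\,m_{ji} \le m_{ji}$.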

[1] Harchol-Balter, Mor. *Performance Modeling and Design of Computer Systems: Queueing Theory in Action*. Cambridge: Cambridge University Press, 2013. ISBN 978-1-107-02750-3 (hbk), 978-1-139-60396-6 (ebook). ZBL1282.68007.

On BEST ANSWER

A start: Let $R_k$ be the first return time to state $k$. Then, starting in state $j$, $$ R_j=\min(R_i,R_j)+1_{\{R_i<R_j\}}\cdot (R_j-R_i). $$ On the event $\{R_i<R_j\}$, the difference $R_j-R_i$ has the distribution of the first return to $j$ for the chain started in $i$. That is, by the strong Markov property, $$ E^j[1_{\{R_i<R_j\}}\cdot (R_j-R_i)]= P^j[R_i<R_j]\cdot m_{ij}. $$ (The superscript on $E^j$ indicates the initial state.)
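The "start" above can be finished by taking $E^j$-expectations in the identity: $\min(R_i,R_j)\le R_i$ with $E^j[R_i]=m_{ji}$, and $P^j[R_i<R_j]\le 1$. A sketch of that closing step:

```latex
\begin{align*}
m_{jj} = E^j[R_j]
  &= E^j[\min(R_i,R_j)] + P^j[R_i<R_j]\, m_{ij} \\
  &\le E^j[R_i] + m_{ij} \\
  &= m_{ji} + m_{ij}.
\end{align*}
```

This proves the inequality in general, so no counterexample exists.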