Hitting times and minimal solution of a linear system


Consider a random walk on $\{ 1,\ldots,n\}$ which increases by one with probability $q_i$ and decreases by one with probability $p_i$.

When it reaches $1$ the chain is absorbed and remains there forever. At $n$ the chain is reflected back with probability $p_n$ and stays at $n$ with probability $q_n$.
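To make the setup concrete, the walk is easy to simulate; the sketch below (Python, with a hypothetical helper `hitting_time`, assuming $q_i = 1 - p_i$ at interior states) estimates the expected absorption time by Monte Carlo.

```python
import random

def hitting_time(start, p):
    """One sample of the absorption time tau_1 for the walk described above.

    p is a list with p[i] = probability of stepping down at state i
    (1-indexed; p[0] is a dummy).  q_i = 1 - p_i.  State 1 is absorbing;
    at state n the walk steps down with probability p[n], else stays at n."""
    n = len(p) - 1
    i, steps = start, 0
    while i != 1:
        down = random.random() < p[i]
        if i == n:
            i = i - 1 if down else i      # reflect back or hold at the top
        else:
            i = i - 1 if down else i + 1
        steps += 1
    return steps

# Hypothetical example: n = 3 with p_i = q_i = 1/2 above the absorbing state
random.seed(42)
p = [0.0, 1.0, 0.5, 0.5]
samples = 100_000
est = sum(hitting_time(2, p) for _ in range(samples)) / samples  # estimates E[tau_2]
```

This is only a sanity-check tool for comparing against whatever the linear-algebra approach produces.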

It is a theorem (Thm 1.3.5 of J. R. Norris, *Markov Chains*) that the expected time to absorption for the chain started at $j$ is given by the minimal non-negative solution of

$$ \mathbb{E}[\tau_1] = 0, \qquad \mathbb{E}[\tau_j] = 1 + p_j\,\mathbb{E}[\tau_{j-1}] + q_j\,\mathbb{E}[\tau_{j+1}], \quad j > 1. $$ This is a linear system of equations; in matrix form,

$$ \tau = \boldsymbol{1} + \widetilde{P}\tau $$

where $P$ is the transition matrix describing the walk, $\widetilde{P}$ is the same matrix with the row and column of the absorbing state deleted, $I_{n-1}$ is the $(n-1)\times(n-1)$ identity matrix, and $\boldsymbol{1}$ is a vector of ones.

Hence,

$$ \tau = (I_{n-1} - \widetilde{P})^{-1}\boldsymbol{1}. $$

Here, $I_{n-1} - \widetilde{P}$ appears to be rank deficient and so cannot be inverted. I suspect that this is why Norris adds the additional condition that the solution be minimal.

Three questions:

  • Is there an algebraic way to impose the minimality requirement?
  • Is $(I_{n-1} - \delta\widetilde{P})^{-1}$, with $0<\delta<1$ a good approximation of the minimal solution for $\delta$ close to one?
  • What about taking the limit $\lim_{\delta\to 1}(1-\delta)(I_{n-1} - \delta\widetilde{P})^{-1}$?
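The second and third questions can be probed numerically on a small instance (a sketch, assuming $n = 3$ and $p_i = q_i = \frac12$, so that $\widetilde{P}$ is the $2\times 2$ matrix over states $\{2,3\}$):

```python
import numpy as np

# States {2, 3} after deleting the absorbing state 1:
#   from 2: down to 1 (deleted) w.p. 1/2, up to 3 w.p. 1/2
#   from 3: down to 2 w.p. 1/2, stay at 3 w.p. 1/2
P_tilde = np.array([[0.0, 0.5],
                    [0.5, 0.5]])
ones = np.ones(2)

for delta in (0.9, 0.99, 0.999):
    # tau_delta = (I - delta * P_tilde)^{-1} 1
    tau_delta = np.linalg.solve(np.eye(2) - delta * P_tilde, ones)
    print(delta, tau_delta)
```

In this one example $\tau_\delta$ appears to approach a finite vector as $\delta \to 1$, while $(1-\delta)\tau_\delta$ shrinks toward zero; this is a single experiment, not a proof of either limit in general.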

1 Answer


We should be able to get a system of linear equations that uniquely specifies $\mathbb E[\tau_1], \dots, \mathbb E[\tau_n]$ if we just add in the additional constraint
$$ \mathbb E[\tau_n] = 1 + p_n\mathbb E[\tau_{n-1}] + q_n\mathbb E[\tau_n]. $$

On the other hand, I am very skeptical of the "minimal solution" claim. If we consider the system with states $\{1,2,3\}$ and $p_2 = p_3 = q_2 = q_3 = \frac12$, then the complete system of equations (with the constraint above) is
$$ \begin{cases} \mathbb E[\tau_1] = 0 \\ \mathbb E[\tau_2] = 1 + \frac12 \mathbb E[\tau_1] + \frac12 \mathbb E[\tau_3] \\ \mathbb E[\tau_3] = 1 + \frac12 \mathbb E[\tau_2] + \frac12 \mathbb E[\tau_3] \end{cases} $$
which has the unique solution $\mathbb E[\tau_1] = 0$, $\mathbb E[\tau_2] = 4$, $\mathbb E[\tau_3] = 6$.
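The unique solution of the three-state example can be checked with a direct linear solve (a minimal sketch using NumPy, with each equation rewritten as a row of $Ax = b$):

```python
import numpy as np

# Unknowns x = (E[tau_1], E[tau_2], E[tau_3]), one row per equation.
A = np.array([
    [ 1.0,  0.0,  0.0],   # E[tau_1] = 0
    [-0.5,  1.0, -0.5],   # E[tau_2] - (1/2)E[tau_1] - (1/2)E[tau_3] = 1
    [ 0.0, -0.5,  0.5],   # E[tau_3] - (1/2)E[tau_2] - (1/2)E[tau_3] = 1
])
b = np.array([0.0, 1.0, 1.0])
tau = np.linalg.solve(A, b)   # -> [0, 4, 6]
```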

On the other hand, if we drop the last constraint, the only solutions that have any right to be called minimal are $(0, 1, 0)$ (the componentwise-smallest non-negative solution) and $(0, \frac45, -\frac25)$ (the minimum-norm solution). Neither of these has any relation to the actual solution.
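The $(0, \frac45, -\frac25)$ vector is exactly what a least-squares solver returns for the underdetermined system, since `numpy.linalg.lstsq` produces the minimum-norm solution of a consistent system (a sketch):

```python
import numpy as np

# Drop the last equation: only E[tau_1] = 0 and the state-2 equation remain.
A = np.array([
    [ 1.0,  0.0,  0.0],   # E[tau_1] = 0
    [-0.5,  1.0, -0.5],   # E[tau_2] - (1/2)E[tau_1] - (1/2)E[tau_3] = 1
])
b = np.array([0.0, 1.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)   # minimum-norm solution
```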