Matrix Form for Hitting Probabilities in a Markov Chain


I've just read that the vector of hitting probabilities $h^A=\left(h^A_i=P(H^A<\infty|X_0=i):i \in S\right)$, where $H^A=\inf\{n\geq 0: X_n \in A\}$ is the hitting time of $A$, is the minimal non-negative solution to the linear system:

$h^A_i=1, i \in A$ and $h^A_i=\sum_{j \in S} p_{ij}h^A_j, i \notin A$

Is there a way to write this in matrix form? And how can we be sure that such a solution exists for this linear system?

Any help would be appreciated.

Best answer:

Matrix form: order the states so that those of $A$ come first. The equations $h^A_i=1$ for $i\in A$, together with $\sum_{j\in S} p_{ij}h^A_j - h^A_i = 0$ for $i\notin A$, give the augmented matrix

$$ \left[ \begin{array}{cccccc|c} 1&0&0&0&0&0&1\\ 0&1&0&0&0&0&1\\ 0&0&1&0&0&0&1\\ &&&\cdots &&& 1\\ p_{i,0} & p_{i,1} & \cdots & p_{i,i}-1 & p_{i,i+1} & \cdots & 0 \\ p_{i+1,0} & p_{i+1,1} & \cdots & p_{i+1,i} & p_{i+1,i+1}-1 & \cdots & 0 \\ &&\cdots &&&& 0\\ p_{n,0} & p_{n,1} & \cdots & \cdots & p_{n,n-1} & p_{n,n}-1 & 0 \\ \end{array} \right] $$
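As a concrete numerical sketch (the four-state chain below is my own illustration, not from the question): when some states can never reach $A$ (here an absorbing state outside $A$), the full system is singular, so the minimal solution sets $h=0$ on those states and solves a reduced system $(I-Q)h = r$ on the remaining states, where $Q$ restricts $P$ to those states and $r$ collects the one-step probabilities of entering $A$.

```python
import numpy as np

# Illustrative chain on states {0, 1, 2, 3}; A = {0}; state 3 is absorbing
# but not in A, so the chain can fail to hit A.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # 0: in A
    [0.5, 0.0, 0.5, 0.0],   # 1
    [0.0, 0.5, 0.0, 0.5],   # 2
    [0.0, 0.0, 0.0, 1.0],   # 3: absorbing, never reaches A
])
A = [0]
# The minimal solution sets h = 0 on states that can never reach A (state 3)
# and h = 1 on A; the rest is a nonsingular linear system on states {1, 2}.
rest = [1, 2]

Q = P[np.ix_(rest, rest)]           # transitions among the remaining states
r = P[np.ix_(rest, A)].sum(axis=1)  # one-step probability of entering A
h_rest = np.linalg.solve(np.eye(len(rest)) - Q, r)
print(h_rest)  # ≈ [2/3, 1/3]
```

Solving the reduced system sidesteps the singularity caused by state 3, for which the equation $h_3 = h_3$ carries no information and only minimality pins down $h_3 = 0$.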

To see that the $h_i^A$'s are a solution:

First, if $i\in A$, then $X_0=i$ implies $H^A=0$, so $h_i^A=1$.

Suppose now that $i\notin A$. Conditioning on the first step:

\begin{align} h_i^A &= \sum_{j\in S}P(X_1=j\mid X_0=i)\, P(H^A\lt\infty\mid X_1=j,X_0=i) \\ &= \sum_{j\in S}p_{ij}\, P(H^A\lt\infty\mid X_0=j) \qquad\text{by the Markov property, since $X_0=i\notin A$ forces $H^A\geq 1$} \\ &= \sum_{j\in S}p_{ij}\, h_j^A \qquad\qquad\qquad\qquad\text{(1)} \end{align}
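The relation (1) can be sanity-checked by simulation. Here is a minimal sketch (the chain, with $A=\{0\}$ and an absorbing state 3 outside $A$, is my own example), estimating $h_1^A$ by Monte Carlo:

```python
import random

# Illustrative chain on states {0, 1, 2, 3}; A = {0}; state 3 absorbing.
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]

def hits_A(start, max_steps=10_000):
    """Simulate one path; return True if it enters A = {0}."""
    x = start
    for _ in range(max_steps):
        if x == 0:
            return True    # hit A
        if x == 3:
            return False   # absorbed outside A, can never reach A
        x = random.choices(range(4), weights=P[x])[0]
    return False

random.seed(0)
est = sum(hits_A(1) for _ in range(20_000)) / 20_000
print(est)  # close to h_1^A = 2/3
```

For this chain the exact value $h_1^A = 2/3$ follows from (1): $h_1 = \tfrac12 + \tfrac12 h_2$ and $h_2 = \tfrac12 h_1$ with $h_3=0$.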

If you want to see why this solution is minimal, take an arbitrary non-negative solution $s_1,s_2,\ldots,s_n$.

For $i\in A$ we must have $s_i=1=h_i^A$, so minimality holds trivially for such $i$.

Suppose now that $i\notin A$. Then equation (1) for $s_i$ can be split according to whether the first step lands in $A$, and the substitution iterated by replacing each remaining $s_j$ with its own equation:

\begin{align} s_i &= \sum_{j\in A}p_{ij} + \sum_{j\notin A}p_{ij} s_j \qquad\text{using $s_j=1$ for $j\in A$} \\ &= P(H^A\leq 1\mid X_0=i) + \sum_{j\notin A}p_{ij} s_j \\ &= P(H^A\leq 1\mid X_0=i) + \sum_{j_1\notin A}p_{ij_1} \left(\sum_{j_2\in A}p_{j_1j_2} + \sum_{j_2\notin A}p_{j_1 j_2} s_{j_2}\right) \\ &= P(H^A\leq 2\mid X_0=i) + \sum_{j_1\notin A}p_{ij_1} \left(\sum_{j_2\notin A}p_{j_1 j_2} s_{j_2}\right) \\ & \;\;\vdots \end{align}

Continuing in this way, we have, for arbitrarily large $M$,

\begin{align} s_i &= P(H^A\leq M\mid X_0=i) \;+\; \text{a non-negative remainder, since the $s_j$ are non-negative} \\ \therefore\quad s_i &\geq \lim_{M\to\infty} P(H^A\leq M\mid X_0=i) = P(H^A\lt\infty\mid X_0=i) = h_i^A, \qquad\text{which proves minimality.} \end{align}
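The iteration in the proof is also a practical algorithm: starting from $0$ off $A$, the $M$-th iterate equals $P(H^A\leq M\mid X_0=i)$, and the iterates increase to the minimal solution. A sketch on the same illustrative four-state chain as above (my own example), where the constant vector $1$ is also a solution but the iteration still finds the minimal one:

```python
import numpy as np

# Illustrative chain: states {0, 1, 2, 3}, A = {0}, state 3 absorbing.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])
A = {0}
n = P.shape[0]

# Fixed-point iteration mirroring the proof: after M sweeps,
# h_i = P(H^A <= M | X_0 = i). Starting from 0 off A, the iterates
# increase monotonically to the MINIMAL non-negative solution,
# even though e.g. the all-ones vector also solves the system here.
h = np.array([1.0 if i in A else 0.0 for i in range(n)])
for _ in range(200):
    h_new = np.array([1.0 if i in A else P[i] @ h for i in range(n)])
    if np.max(np.abs(h_new - h)) < 1e-12:
        break
    h = h_new
print(h)  # ≈ [1, 2/3, 1/3, 0]; note h_3 = 0, the minimal choice
```

This also answers the existence question directly: the monotone limit of these iterates always exists and satisfies the system, whether or not the matrix above is invertible.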