Absorbing state for a collection of random walks


This is a follow-up to this question; I have learned a few things since I posed it.

Consider a collection of random walks $X_i$ which take finite integer values. These evolve as time-inhomogeneous Markov Chains.

Consider $Y=\sum X_i$; this is clearly a (more complex) random walk: the "Super Walk"!

If we make every state of $Y$ with $Y<y$ absorbing, I feel this could be modelled by creating an absorbing state in each $X_i$ and transitioning to it based on the status of the Super Walk.

How would the transition matrices work for this?

To give a concrete (and simple) example:

Let $X_1$ and $X_2$ be two independent copies of a chain with three states $0$, $1$ and $2$ and transition matrix:

$$P_X=\begin{bmatrix} 0.5 &0.5&0\\ 0.25&0.5&0.25\\ 0&0.5&0.5\\ \end{bmatrix}$$

$Y$ then has nine states $0,\dots,8$ (indexing the pairs $(X_1,X_2)$), and even for this simple example its transition matrix is rather complex, viz:

$$P_Y=\begin{bmatrix} 0.25&0.25&0&0.25&0.25&0&0&0&0\\ 0.125&0.25&0.125&0.125&0.25&0.125&0&0&0\\ 0&0.25&0.25&0&0.25&0.25&0&0&0\\ 0.125&0.125&0&0.25&0.25&0&0.125&0.125&0\\ 0.0625&0.125&0.0625&0.125&0.25&0.125&0.0625&0.125&0.0625\\ 0&0.125&0.125&0&0.25&0.25&0&0.125&0.125\\ 0&0&0&0.25&0.25&0&0.25&0.25&0\\ 0&0&0&0.125&0.25&0.125&0.125&0.25&0.125\\ 0&0&0&0&0.25&0.25&0&0.25&0.25\\ \end{bmatrix}$$
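Since the two walks are independent, this joint transition matrix can be built as the Kronecker product $P_X \otimes P_X$. A minimal sketch, assuming `numpy` is available:

```python
import numpy as np

# Single-walk transition matrix on states {0, 1, 2}
P_X = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])

# Joint chain on pairs (X1, X2): independence means its transition
# matrix is the Kronecker product P_X ⊗ P_X, a 9x9 matrix.
P_Y = np.kron(P_X, P_X)

print(P_Y.shape)        # (9, 9)
print(P_Y.sum(axis=1))  # each row sums to 1
```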

If we now make the states with $Y<2$ absorbing, the transition matrix becomes:

$$P_{Y'}=\begin{bmatrix} 1&0&0&0&0&0&0&0&0\\ 0&1&0&0&0&0&0&0&0\\ 0&0.25&0.25&0&0.25&0.25&0&0&0\\ 0&0&0&1&0&0&0&0&0\\ 0.0625&0.125&0.0625&0.125&0.25&0.125&0.0625&0.125&0.0625\\ 0&0.125&0.125&0&0.25&0.25&0&0.125&0.125\\ 0&0&0&0.25&0.25&0&0.25&0.25&0\\ 0&0&0&0.125&0.25&0.125&0.125&0.25&0.125\\ 0&0&0&0&0.25&0.25&0&0.25&0.25\\ \end{bmatrix}$$
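Mechanically, the absorbing modification overwrites the row of every joint state $(i,j)$ with $i+j<2$ by the corresponding identity row. A minimal sketch, assuming `numpy` and indexing the pair $(i,j)$ as $3i+j$:

```python
import numpy as np

P_X = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])
P_Y = np.kron(P_X, P_X)  # joint chain on pairs (i, j), index k = 3*i + j

# Make every joint state with X1 + X2 < 2 absorbing by
# replacing its row with the identity row.
P_Y_abs = P_Y.copy()
for i in range(3):
    for j in range(3):
        if i + j < 2:
            k = 3 * i + j
            P_Y_abs[k] = np.eye(9)[k]

print(P_Y_abs[0])  # state (0,0) is now absorbing
```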

How can I determine the transition matrix for each $X_i$ (with the first state being the new absorbing one)?

$$P_{X'}=\begin{bmatrix} 1&0&0&0\\ ?&?&?&?\\ ?&?&?&?\\ ?&?&?&?\\ \end{bmatrix}$$

Thanks to @Did, whose answer has made me realize that while the "Super Walk" is a Markov chain, these revised $X_i$ are not Markov chains, because their transitions do not depend only on their own state. I now see that I will have to treat the "Super Walk" as the fundamental entity.

Best answer:

> Consider $Y=\sum X_i$; this is clearly a (more complex) random walk

Well, no...

In general, and in particular in the example you suggest, the fact that the processes $X_1=(X_1(t))_t$ and $X_2=(X_2(t))_t$ are independent real-valued Markov chains does not imply that the process $Y=(Y(t))_t$ defined by $Y(t)=X_1(t)+X_2(t)$ is a Markov chain.

In your case, consider the event $[Y(t)=2]$. Depending on whether $[X_1(t)=X_2(t)=1]$ or $[X_1(t)=2,X_2(t)=0]$, the possible states for $Y(t+1)$ are $\{0,1,2,3,4\}$ or $\{1,2,3\}$. Hence, $Y=(Y(t))_t$ is not a Markov chain.
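This counterexample can be checked numerically by computing the law of $Y(t+1)$ from each of the two joint states compatible with $Y(t)=2$. A small sketch, assuming `numpy` and the $P_X$ from the question:

```python
import numpy as np

P_X = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])

def next_sum_dist(i, j):
    """Distribution of X1(t+1) + X2(t+1) given X1(t)=i, X2(t)=j."""
    dist = np.zeros(5)  # possible sums: 0..4
    for a in range(3):
        for b in range(3):
            dist[a + b] += P_X[i, a] * P_X[j, b]
    return dist

# Both joint states have Y(t) = 2, yet the next-step law of Y differs:
print(next_sum_dist(1, 1))  # support {0, 1, 2, 3, 4}
print(next_sum_dist(2, 0))  # support {1, 2, 3}
```

The two distributions have different supports, so knowing $Y(t)=2$ alone does not determine the law of $Y(t+1)$.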