Explanation of probabilities in discrete Markov chain process


Say we have a discrete-time Markov chain denoted by $\left\{X_t\right\}$, $t=0,1,2,\dots$.

This Markov chain has 4 states, the 4th being an absorbing state. The one-step (marginal) transition probability matrix at time $t$ is denoted by $M_t$, $t=0,1,2,\dots$

Now define the cumulative transition probability matrices as $C_0=M_0$, $C_1 = M_0 M_1$, $C_2 = M_0 M_1 M_2$, and so on.

Finally, we define the last column of each $C_t$ as $c_{t,last-col}$. Note that these last columns hold the probabilities of ending up in the absorbing state from each initial state.
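The construction above can be sketched numerically. This is a minimal illustration, assuming (for simplicity) a time-homogeneous chain with $M_t = M$ for all $t$; the specific matrix $M$ below is made up, with states indexed $0..3$ and state $3$ absorbing.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of row lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Made-up one-step transition matrix; state 3 is absorbing.
M = [
    [0.5, 0.2, 0.2, 0.1],
    [0.1, 0.5, 0.2, 0.2],
    [0.2, 0.2, 0.3, 0.3],
    [0.0, 0.0, 0.0, 1.0],  # absorbing: stays in state 3 forever
]

# Cumulative transition matrices: C_0 = M_0, C_t = C_{t-1} M_t.
C = [M]
for t in range(1, 5):
    C.append(matmul(C[-1], M))

# Last column of each C_t: probability of having been absorbed by
# that point, one entry per starting state.
c_last = [[row[3] for row in Ct] for Ct in C]
print(c_last[0])  # → [0.1, 0.2, 0.3, 1.0]
```

Since each $C_t$ is a product of stochastic matrices, the entries of these last columns are non-decreasing in $t$: once absorbed, the chain stays absorbed.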

My question is now, what is the significance of the expression $\left(c_{t,last-col}-c_{t-1,last-col}\right), t>1$?

Is this expression related to the conditional probability of being absorbed at time $t$, given that the chain survived (i.e., was not absorbed) until $t-1$? If so, how?

Any insight will be very helpful.

Best answer:

As you say, $c_{t,last-col}$ is a column vector whose $k$th entry is $P[\text{absorbed within } t \text{ steps} \mid \text{start in state } k]$.

Therefore, $\left(c_{t,last-col}-c_{t-1,last-col}\right)$ is another column vector, whose entries give $P[\text{absorbed within } t \text{ steps}] - P[\text{absorbed within } t-1 \text{ steps}] = P[\text{absorbed on step } t \text{ exactly}]$. In other words, the $k$th entry of $\left(c_{t,last-col}-c_{t-1,last-col}\right)$ is the probability that the Markov chain arrives in the absorbing state for the first time on step $t$, given that it began in state $k$ at time $0$. Note that this is the unconditional first-passage probability. The conditional probability you describe, of being absorbed at time $t$ given survival until $t-1$, is obtained by dividing this $k$th entry by $1$ minus the $k$th entry of $c_{t-1,last-col}$ (the probability of not yet being absorbed by time $t-1$ when starting from state $k$).
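The identity above can be checked numerically. The sketch below uses a made-up time-homogeneous 4-state chain (state $3$ absorbing, $M_t = M$ for all $t$) and verifies that the difference of last columns equals the first-passage probability computed directly: to be absorbed for the first time at the step corresponding to $C_2$, the chain must occupy some transient state $j$ after the $C_1$ stage and then jump to the absorbing state.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of row lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Made-up one-step transition matrix; state 3 is absorbing.
M = [
    [0.5, 0.2, 0.2, 0.1],
    [0.1, 0.5, 0.2, 0.2],
    [0.2, 0.2, 0.3, 0.3],
    [0.0, 0.0, 0.0, 1.0],
]

C1 = matmul(M, M)    # C_1 = M_0 M_1
C2 = matmul(C1, M)   # C_2 = M_0 M_1 M_2

# Difference of last columns: P[first absorption exactly on this step],
# one entry per starting state k.
diff = [C2[k][3] - C1[k][3] for k in range(4)]

# Same quantity computed directly: be in a transient state j after the
# C_1 stage, then jump from j into the absorbing state.
direct = [sum(C1[k][j] * M[j][3] for j in range(3)) for k in range(4)]

assert all(abs(d - e) < 1e-12 for d, e in zip(diff, direct))
```

The assertion passes because the absorbing row of $M$ contributes $C_1[k][3]\cdot 1$ to $C_2[k][3]$, which cancels exactly in the subtraction, leaving only the transient-to-absorbing terms. Starting from the absorbing state itself, the difference is $0$, as expected.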