Dealing with the probability of a state occurring in Markov Chain


In a Markov chain, suppose we are dealing with an event such as:

$X_n$ eventually reaches $C$, where $C$ is a subset of the set of all possible states,

where $X_n$ is the state at the $n^{th}$ step. A book mentions that if the initial state $i \in C$, then it is certain that $X_n$ eventually reaches $C$.
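For reference, the claim can be stated precisely with the standard hitting-time notation (the symbols $T_C$ and $h_i$ below are shorthand introduced here, not from the book):

```latex
% Hitting time of C: the first time (possibly 0) at which the chain is in C
T_C = \min\{\, n \ge 0 : X_n \in C \,\}

% Hitting probability starting from state i
h_i = \Pr\left(T_C < \infty \mid X_0 = i\right)

% The book's claim: if i \in C, then X_0 \in C already, so T_C = 0 and h_i = 1
```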

I am not able to understand why this is so.

On BEST ANSWER

The book indeed allows the hitting time to be some time $n \ge 0$, which includes the initial time $n = 0$. If the chain starts at $X_0 = i \in C$, then $C$ is already hit at time $0$, before any transition is made, so the hitting probability equals $1$. The book even uses the word "trivial" for this case: given that the initial state is in $C$, the hitting probability is trivially $1$.
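A small simulation makes the $n = 0$ convention concrete. The sketch below (the three-state chain, the target set `C = {2}`, and the helper name `hitting_time` are all illustrative choices, not from the question) computes the first time $n \ge 0$ at which a sampled path is in $C$; note that the membership check happens *before* any step is taken, so starting inside $C$ always yields a hitting time of $0$:

```python
import random

def hitting_time(P, start, C, max_steps=10_000, rng=None):
    """Return min{n >= 0 : X_n in C} along one simulated path,
    or None if C is not reached within max_steps steps.

    P maps each state to a list of (next_state, probability) pairs.
    The check at n = 0 is the key point: if the start state is
    already in C, the hitting time is 0 and no step is taken.
    """
    rng = rng or random.Random(0)
    state = start
    for n in range(max_steps + 1):
        if state in C:          # checked before stepping, so n = 0 counts
            return n
        r = rng.random()        # sample the next state from P[state]
        acc = 0.0
        for nxt, p in P[state]:
            acc += p
            if r < acc:
                state = nxt
                break
    return None

# Hypothetical three-state chain in which state 2 is absorbing
P = {
    0: [(0, 0.5), (1, 0.5)],
    1: [(0, 0.3), (2, 0.7)],
    2: [(2, 1.0)],
}
C = {2}

print(hitting_time(P, start=2, C=C))  # → 0: starting inside C, C is hit at time 0
print(hitting_time(P, start=0, C=C))  # some n >= 1: the chain must travel to C
```

Starting from state 2 the function returns 0 without consulting the transition matrix at all, which is exactly the "trivial" case the book describes.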

Answer by @Michael in the comments.