Terminology in random walk on a graph


Consider a finite directed graph with $n>1$ nodes and $2$ edges leaving each node.

In a statistical experiment: starting from node $0$, we choose the next node by a fair coin toss, until reaching node $1$. The outcome of the experiment is the number of steps taken, or $0$ if we reach a node from which $1$ cannot be reached.
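For concreteness, the experiment can be sketched in Python. The graph below is a hypothetical example (not from the question); nodes $4$ and $5$ form a trap that cannot reach $1$, and reachability is precomputed so the simulation terminates:

```python
import random
from collections import deque

# Hypothetical example graph: each node maps to its 2 successors,
# one of which is chosen by a fair coin toss.
succ = {
    0: (2, 3),
    1: (1, 1),   # successors of 1 don't matter: the walk stops there
    2: (1, 3),
    3: (2, 4),
    4: (4, 5),   # nodes 4 and 5 form a trap that cannot reach 1
    5: (4, 5),
}

def can_reach_1(succ):
    """Set of nodes from which node 1 is reachable (BFS on reversed edges)."""
    rev = {u: [] for u in succ}
    for u, (a, b) in succ.items():
        rev[a].append(u)
        rev[b].append(u)
    seen, queue = {1}, deque([1])
    while queue:
        v = queue.popleft()
        for u in rev[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return seen

def run_experiment(succ, rng=random):
    """One trial: number of steps from node 0 to node 1, or 0 on entering a trap."""
    good = can_reach_1(succ)
    node, steps = 0, 0
    while node != 1:
        if node not in good:
            return 0  # can no longer reach 1
        node = rng.choice(succ[node])  # fair coin toss
        steps += 1
    return steps
```

Since $n>1$, reaching node $1$ always takes at least one step, so the outcome $0$ is unambiguous.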

What is the name for this experiment? For the distribution $p$ of its outcome? For the property that any node reachable from $0$ can itself reach $1$, giving $p(0)=0$? Or, more basically, for the property that any node can be reached from any other?

Best answer

I'm not sure there's a special word for the experiment, but:

  • The probability of eventually reaching a target state (or set of states) from the initial state is called a hitting probability.
  • The random variable that gives the number of steps is called the hitting time. However, by convention, we say that the hitting time is $\infty$ rather than $0$ if the target state is never reached.
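Hitting probabilities satisfy the one-step equations $h_i = \frac{1}{2}(h_a + h_b)$ for the fair-coin walk, with $h_1 = 1$; the hitting probability is the minimal non-negative solution, which value iteration from $h \equiv 0$ converges to. A sketch on the same hypothetical graph as above:

```python
# Hypothetical chain: node -> its two equally likely successors.
succ = {0: (2, 3), 1: (1, 1), 2: (1, 3), 3: (2, 4), 4: (4, 5), 5: (4, 5)}

def hitting_probabilities(succ, target=1, sweeps=200):
    """h[i] = P(ever hit `target`, starting from i).

    Value iteration from h = 0 converges to the minimal non-negative
    solution of h_i = (h_a + h_b) / 2, which is the hitting probability.
    """
    h = {u: 0.0 for u in succ}
    h[target] = 1.0
    for _ in range(sweeps):
        for u, (a, b) in succ.items():
            if u != target:
                h[u] = (h[a] + h[b]) / 2
    return h
```

On this graph the exact values are $h_0 = \frac{1}{2}$, $h_2 = \frac{2}{3}$, $h_3 = \frac{1}{3}$, and $h_4 = h_5 = 0$ for the trap.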

I don't know a specific word for the case where the hitting probability is $1$, but here is some related terminology:

  • When it's possible to get from $i$ to $j$, we say that $j$ is accessible from $i$.
  • When $i$ is accessible from $j$ and vice versa, we say that $i$ and $j$ communicate. This is an equivalence relation, and its equivalence classes are called communicating classes.
  • When any two states communicate, the Markov chain is irreducible.
  • When it's not possible to leave a state, we call that state absorbing. Similarly, a communicating class is called closed if it's not possible to leave it.
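Communicating classes are exactly the strongly connected components of the transition graph, so they can be computed with any SCC algorithm. A stdlib-only sketch using Kosaraju's algorithm, on a hypothetical adjacency list:

```python
# Hypothetical digraph as adjacency lists.
adj = {0: [2, 3], 1: [0], 2: [1, 3], 3: [2, 4], 4: [5], 5: [4]}

def communicating_classes(adj):
    """Strongly connected components = communicating classes (Kosaraju)."""
    # Pass 1: DFS on the original graph, recording finish order.
    order, seen = [], set()
    def dfs(u):
        seen.add(u)
        for v in adj[u]:
            if v not in seen:
                dfs(v)
        order.append(u)
    for u in adj:
        if u not in seen:
            dfs(u)
    # Pass 2: collect components on the reversed graph, in
    # decreasing finish order.
    rev = {u: [] for u in adj}
    for u in adj:
        for v in adj[u]:
            rev[v].append(u)
    classes, assigned = [], set()
    for u in reversed(order):
        if u in assigned:
            continue
        comp, stack = set(), [u]
        while stack:
            x = stack.pop()
            if x in assigned:
                continue
            assigned.add(x)
            comp.add(x)
            stack.extend(rev[x])
        classes.append(comp)
    return classes

def is_closed(comp, adj):
    """A communicating class is closed iff no edge leaves it."""
    return all(v in comp for u in comp for v in adj[u])
```

The chain is irreducible exactly when `communicating_classes` returns a single class.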

If we're looking for the hitting probability, it's common to transform the Markov chain as follows:

  1. Make state 1 (the state we're trying to get to) an absorbing state.
  2. Make all states from which 1 is not accessible absorbing states. (We can even combine them into one absorbing state.)

Then we can find out which absorbing state we get to first. (A Markov chain in which every state can reach an absorbing state is called an absorbing Markov chain.)
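The two transformation steps can be sketched as follows. The input graph is a hypothetical example (note node $1$ initially has ordinary outgoing edges, so step 1 matters), and `'DEAD'` is a made-up label for the merged absorbing state:

```python
from collections import deque

# Hypothetical chain: each node's two fair-coin successors.
succ = {0: (2, 3), 1: (0, 3), 2: (1, 3), 3: (2, 4), 4: (4, 5), 5: (4, 5)}

def make_absorbing(succ, target=1):
    """Step 1: make `target` absorbing.
    Step 2: merge every state that cannot reach `target` into one
    absorbing state 'DEAD'."""
    # Find states that can reach target, ignoring target's own out-edges
    # (it will be absorbing), via BFS on reversed edges.
    rev = {u: set() for u in succ}
    for u, (a, b) in succ.items():
        if u != target:
            rev[a].add(u)
            rev[b].add(u)
    good, queue = {target}, deque([target])
    while queue:
        v = queue.popleft()
        for u in rev[v]:
            if u not in good:
                good.add(u)
                queue.append(u)
    # Rebuild the chain: two absorbing states, edges into the bad set
    # redirected to 'DEAD'.
    new = {target: (target, target), 'DEAD': ('DEAD', 'DEAD')}
    for u, (a, b) in succ.items():
        if u in good and u != target:
            new[u] = (a if a in good else 'DEAD',
                      b if b in good else 'DEAD')
    return new
```

In the transformed chain, the hitting probability of state $1$ is the probability of being absorbed there rather than at `'DEAD'`, which answers "which absorbing state do we reach first?".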