Consider a finite directed graph with $n>1$ nodes and $2$ edges leaving each node.
In a statistical experiment: starting from node $0$, we choose the next node by a fair coin flip, until we reach node $1$. The outcome of the experiment is the number of steps taken, or $0$ if the walk reaches a node from which $1$ cannot be reached.
What is the name for this experiment? For the distribution $p$ of its outcome? For the property that any node reachable from $0$ can reach $1$, giving $p(0)=0$? Or, more basically, for the property that any node can be reached from any other?
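For concreteness, the experiment can be simulated directly. Below is a minimal sketch using a hypothetical 4-node graph (the dictionary `succ`, mapping each node to its two successors, is an invented example; node $3$ only points to itself, so it cannot reach node $1$):

```python
import random
from collections import deque

# Hypothetical example graph: each node has exactly two outgoing edges.
# Node 3 only points to itself, so it cannot reach node 1.
succ = {0: (2, 3), 1: (0, 2), 2: (1, 0), 3: (3, 3)}

def nodes_reaching(target):
    """Return the set of nodes from which `target` is reachable (reverse BFS)."""
    rev = {v: set() for v in succ}
    for u, (a, b) in succ.items():
        rev[a].add(u)
        rev[b].add(u)
    seen, queue = {target}, deque([target])
    while queue:
        v = queue.popleft()
        for u in rev[v] - seen:
            seen.add(u)
            queue.append(u)
    return seen

GOOD = nodes_reaching(1)  # a walk that leaves this set can never hit node 1

def run_experiment():
    """One trial: coin-flip walk from node 0 until node 1 is hit.
    Outcome: the number of steps, or 0 on entering a node that cannot reach 1."""
    node, steps = 0, 0
    while node != 1:
        node = succ[node][random.randrange(2)]  # fair coin between the two edges
        steps += 1
        if node not in GOOD:
            return 0
    return steps

outcomes = [run_experiment() for _ in range(1000)]
```

The empirical distribution of `outcomes` then approximates the distribution $p$ asked about.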
I'm not sure there's a special word for the experiment itself, but here is some related terminology: the walk is a *Markov chain* on the graph, the outcome is the *hitting time* (or first-passage time) of node $1$, and its distribution is the hitting-time distribution. The property that any node can be reached from any other is *irreducibility* of the chain (equivalently, the graph is strongly connected). I don't know a specific word for the case where the hitting probability is $1$.
If we're looking for the hitting probability, it's common to transform the Markov chain as follows: make node $1$ absorbing, and collapse every node that cannot reach $1$ into a single absorbing "failure" state.
Then we can find out which absorbing state we get to first. (A Markov chain in which every state can reach an absorbing state is called an absorbing Markov chain.)
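This transformation can be carried out numerically. The sketch below (with an invented 4-node example graph; all names are illustrative) makes node $1$ and the unreachable-from-$1$ nodes absorbing, then solves the standard linear system $h = Ph$ with boundary values $h=1$ at node $1$ and $h=0$ at the failure states to get the hitting probability from each transient state:

```python
import numpy as np

# Hypothetical 4-node example: two out-edges per node; node 3
# only points to itself, so it cannot reach node 1.
succ = {0: (2, 3), 1: (0, 2), 2: (1, 0), 3: (3, 3)}
n = len(succ)

# Nodes that can reach 1, by fixed-point iteration of reachability.
can_reach_1 = {1}
changed = True
while changed:
    changed = False
    for u, (a, b) in succ.items():
        if u not in can_reach_1 and (a in can_reach_1 or b in can_reach_1):
            can_reach_1.add(u)
            changed = True

# Transition matrix of the transformed (absorbing) chain:
# node 1 is absorbing, and so is every node that cannot reach 1.
P = np.zeros((n, n))
for u, (a, b) in succ.items():
    if u == 1 or u not in can_reach_1:
        P[u, u] = 1.0        # absorbing state
    else:
        P[u, a] += 0.5       # fair coin: each edge taken with prob 1/2
        P[u, b] += 0.5

# For transient states T, the hitting probabilities h solve
# (I - P_TT) h = P_{T,1}, since failure states contribute 0.
transient = [u for u in succ if u != 1 and u in can_reach_1]
A = np.eye(len(transient)) - P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [1])].ravel()
h = np.linalg.solve(A, b)
hit = dict(zip(transient, h))  # node -> probability of ever hitting 1
```

In this example the system gives $h_0 = \tfrac12 h_2$ and $h_2 = \tfrac12 + \tfrac12 h_0$, so the walk from node $0$ hits node $1$ with probability $1/3$; the property $p(0)=0$ from the question corresponds exactly to $h_0 = 1$.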