A Markov chain $X_0,X_1,X_2,\ldots$ has the transition probability matrix
$$P= \begin{bmatrix} 1 & 0 & 0 \\ \alpha & \beta & \gamma \\ 0 & 0 & 1 \end{bmatrix} $$
where the first row corresponds to $X_n = 0$, the second to $X_n = 1$, and the third to $X_n = 2$ (so $\alpha, \beta, \gamma \geq 0$ and $\alpha + \beta + \gamma = 1$).
We define $T = \min\{n \geq 0 : X_n = 0 \ \text{or} \ X_n = 2\}$, the first time the chain is absorbed in $\{0,2\}$, and I would like to find $v = E[T \mid X_0 = 1]$.
This question appears in the textbook *An Introduction to Stochastic Modeling* by Pinsky and Karlin, and their explanation is: "Weighting these contingencies by their respective probabilities, we obtain for $v$":
$$ v = 1+\alpha\cdot (0)+\beta\cdot (v)+\gamma\cdot (0) = 1+\beta v $$
My question is two-fold:
1) What is the explicit notation for what is going on above? When they "weight" the probabilities, are they doing $E[T|X_0=1] = \sum_{k=0}^2 E[T|X_0=1, X_1=k]$?
2) What is a good way to see why there is the $1$ added above? I know the usual explanation is that the absorption time $T$ is always at least $1$, but how does it come out in the math?
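(As a numerical sanity check, not from the book: the equation above gives $v = 1/(1-\beta)$, and a quick Monte Carlo simulation agrees. A minimal sketch in Python; the values $\alpha=0.3$, $\beta=0.5$, $\gamma=0.2$ are arbitrary choices of mine.)

```python
import random

# Illustrative parameters (my choice; any alpha + beta + gamma = 1 works).
alpha, beta, gamma = 0.3, 0.5, 0.2

def sample_T():
    """Simulate the chain from X_0 = 1 until it hits {0, 2}; return that time."""
    t, state = 0, 1
    while state == 1:
        u = random.random()
        # From state 1: go to 0 w.p. alpha, stay at 1 w.p. beta, go to 2 w.p. gamma.
        state = 0 if u < alpha else (1 if u < alpha + beta else 2)
        t += 1
    return t

n = 200_000
print(sum(sample_T() for _ in range(n)) / n)  # ~2.0 == 1 / (1 - beta)
```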
The equation should weight each conditional expectation by its transition probability:
$$\begin{aligned} E(T\mid X_0=1)&=\sum_{k=0}^2 E(T\mid X_0=1,\,X_1=k)\,P(X_1=k\mid X_0=1) \\ &=E(T\mid X_0=1,X_1=0)\,\alpha+E(T\mid X_0=1,X_1=1)\,\beta+E(T\mid X_0=1,X_1=2)\,\gamma. \end{aligned}$$
The first and last expectations on the right equal $1$, since in those cases we have hit $\{0,2\}$ in exactly one time step.

For the middle expectation, think of $\{X_1,X_2,\dots\}$ as a new Markov chain starting at $1$. By the Markov property, the transition probabilities from $X_1$ to $X_2$ to $X_3$ (and so on) depend only on the new starting point $X_1$, not on the original starting point $X_0$. Since $X_1$ is still $1$, the new chain $\{X_1,X_2,\dots\}$ can be thought of as a restart of the original one $\{X_0,X_1,\dots\}$, and so $E(T\mid X_0=1,X_1=1)=1+E(T\mid X_0=1)$. The $1$ on the right accounts for the one time unit we have already expended to 'reset' the Markov chain.

Putting everything together, and using $\alpha+\beta+\gamma=1$, gives $$E(T\mid X_0=1)=1\cdot\alpha+\bigl(1+E(T\mid X_0=1)\bigr)\beta+1\cdot\gamma=1+\beta\,E(T\mid X_0=1),$$ and solving yields $v=E(T\mid X_0=1)=\frac{1}{1-\beta}$ (provided $\beta<1$).
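As a side note, this first-step analysis is a one-equation instance of a general recipe: if $Q$ is the restriction of $P$ to the transient states, the vector of expected absorption times solves $(I-Q)v=\mathbf{1}$. A minimal sketch in Python/NumPy (the numerical values $\alpha=0.3$, $\beta=0.5$, $\gamma=0.2$ are illustrative assumptions, not from the book):

```python
import numpy as np

alpha, beta, gamma = 0.3, 0.5, 0.2   # illustrative values, alpha + beta + gamma = 1
P = np.array([[1.0,   0.0,  0.0 ],
              [alpha, beta, gamma],
              [0.0,   0.0,  1.0 ]])

transient = [1]                      # states 0 and 2 are absorbing
Q = P[np.ix_(transient, transient)]  # here Q = [[beta]]

# First-step analysis in matrix form: v = 1 + Q v, i.e. (I - Q) v = 1.
v = np.linalg.solve(np.eye(len(transient)) - Q, np.ones(len(transient)))
print(v)                             # [2.] == 1 / (1 - beta)
```

Here the system is $1\times 1$, so it reduces to exactly the scalar equation $v = 1 + \beta v$ above, but the same code handles chains with several transient states.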