Let $X=(X_n)_{n\geqslant 0}$ be a Markov chain on a countable state space $S$, and let $T_A=\inf\{n\geqslant 0: X_n\in A\}$ be the hitting time of a set $A\subseteq S$.
My text uses this in a proof:
$$\mathbb E[T_A \mid X_0=k] = \sum_{l\in S} \mathbb P(X_1=l \mid X_0=k)\bigl(1+\mathbb E[T_A \mid X_0=l]\bigr)$$
and I don't understand why it holds.
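For what it's worth, I did check the identity numerically on a small chain (the transition matrix below is just a made-up example), by solving the usual linear system $h(k)=1+\sum_{l} p_{kl}\,h(l)$ for $k\notin A$ with $h(k)=0$ on $A$, so I believe the formula; I just don't see the proof:

```python
import numpy as np

# Made-up 4-state chain on S = {0, 1, 2, 3}; target set A = {0} (absorbing here).
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.3, 0.2, 0.5, 0.0],
    [0.0, 0.4, 0.1, 0.5],
    [0.0, 0.0, 0.6, 0.4],
])
A = {0}
notA = [k for k in range(4) if k not in A]

# Expected hitting times h(k) = E[T_A | X_0 = k] solve
# (I - P restricted to S \ A) h = 1, with h = 0 on A.
M = np.eye(len(notA)) - P[np.ix_(notA, notA)]
h = np.zeros(4)
h[notA] = np.linalg.solve(M, np.ones(len(notA)))

# First-step identity: h(k) = sum_l P[k,l] * (1 + h(l)) for k not in A.
for k in notA:
    rhs = sum(P[k, l] * (1 + h[l]) for l in range(4))
    assert np.isclose(h[k], rhs)
```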
I have tried using an indicator $\mathbf 1_C$ of an event $C$, but it wasn't very helpful:
\begin{align*} \mathbb E[\mathbf 1_C \mid X_0=k] & = \mathbb P(C \mid X_0=k) \\ & = \sum_{l\in S} \mathbb P(X_1=l \mid X_0=k)\,\mathbb P(C \mid X_1=l) \\ & = \ ? \end{align*}
How could I finish the argument?
Note that, for every state $\ell$, $$ E(T_A\,\mathbf 1_{X_1=\ell}\mid X_0=k)=E(T_A\mid X_1=\ell,\,X_0=k)\,P(X_1=\ell\mid X_0=k), $$ and that, if $T_A\geqslant1$ almost surely, that is, if $k\notin A$, then the Markov property at time $1$ yields $$ E(T_A\mid X_1=\ell,\,X_0=k)=1+E(T_A\mid X_0=\ell). $$ Now sum the first identity over $\ell\in S$ (the events $\{X_1=\ell\}$, $\ell\in S$, partition the sample space, so the left-hand sides add up to $E(T_A\mid X_0=k)$) and substitute the second identity into each term: this is exactly the formula in your text.
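Written out, chaining the two identities (for $k\notin A$, so that $T_A\geqslant 1$ and the Markov property at time $1$ applies) gives:

```latex
\begin{align*}
E(T_A\mid X_0=k)
&= \sum_{\ell\in S} E(T_A\,\mathbf 1_{X_1=\ell}\mid X_0=k) \\
&= \sum_{\ell\in S} E(T_A\mid X_1=\ell,\,X_0=k)\,P(X_1=\ell\mid X_0=k) \\
&= \sum_{\ell\in S} P(X_1=\ell\mid X_0=k)\bigl(1+E(T_A\mid X_0=\ell)\bigr).
\end{align*}
```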