So the problem is: let $T_{HT}$ be the (random) number of times we need to flip a coin before we see a Heads followed by a Tails. Consider the Markov chain whose state is the last three flips, with state space $\{HHH,HHT,HTH,HTT,THH,THT,TTH,TTT\}$ and transition matrix:
$$ \begin{bmatrix}0.5 &0.5&0&0&0&0&0&0 \\0 & 0&0.5&0.5&0&0&0&0\\0&0&0&0&0.5&0.5&0&0\\0&0&0&0&0&0&0.5&0.5\\0.5 &0.5&0&0&0&0&0&0 \\0 & 0&0.5&0.5&0&0&0&0\\0&0&0&0&0.5&0.5&0&0\\0&0&0&0&0&0&0.5&0.5\end{bmatrix}$$
What is the expected waiting time for $HHH$?
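For reference, the matrix above doesn't have to be typed by hand: it follows from the sliding-window rule (drop the oldest flip, append the new one). A short sketch in Python (the names `states` and `P` are my own):

```python
from itertools import product

# States in the order used above: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT
states = ["".join(s) for s in product("HT", repeat=3)]
index = {s: i for i, s in enumerate(states)}

# Transition rule: drop the oldest flip, append the new flip (each with prob 0.5).
P = [[0.0] * 8 for _ in range(8)]
for s in states:
    for new in "HT":
        P[index[s]][index[s[1:] + new]] += 0.5

# Sanity check: every row is a probability distribution.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```

Row 1 of `P` comes out as $(0.5, 0.5, 0, \dots)$, matching the matrix in the question.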
The answer says:
Let $g(XXX)$ be the average time for the Markov chain to go from state $XXX$ to state $HHH$, where each $X$ can be either $H$ or $T$. Then,
$g(HHH)=0$
$g(HHT)=g(THT)=0.5g(HTH)+0.5g(HTT)+1$
$g(HTH)=g(TTH)=0.5g(THH)+0.5g(THT)+1$
$g(HTT)=g(TTT)=0.5g(TTH)+0.5g(TTT)+1$
$g(THH)=0.5g(HHT)+1$
Then,
$g(HHT)=g(THT)=14$
$g(HTH)=g(TTH)=12$
$ g(HTT)=g(TTT)=14 $
$ g(THH)=8 $
So the average number of tosses to get $HHH$ is the $3$ flips needed to enter the chain, plus the average of $g$ over the $8$ equally likely starting states: $3+\tfrac{1}{8}(0+8+4\cdot 14+2\cdot 12)=14$.
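The linear system can be solved exactly to confirm these values; below is a sketch using exact rational arithmetic (the unknown names `a`, `b`, `c`, `d` are mine, not from the answer):

```python
from fractions import Fraction as F

# Unknowns: a = g(HHT) = g(THT), b = g(HTH) = g(TTH),
# c = g(HTT) = g(TTT), d = g(THH).  g(HHH) = 0 is already substituted.
# Each row is [coeff_a, coeff_b, coeff_c, coeff_d, constant].
A = [
    [F(1), F(-1, 2), F(-1, 2), F(0), F(1)],  # a = 0.5b + 0.5c + 1
    [F(-1, 2), F(1), F(0), F(-1, 2), F(1)],  # b = 0.5d + 0.5a + 1
    [F(0), F(-1, 2), F(1, 2), F(0), F(1)],   # c = 0.5b + 0.5c + 1
    [F(-1, 2), F(0), F(0), F(1), F(1)],      # d = 0.5a + 1
]

# Gauss-Jordan elimination with exact fractions.
n = 4
for i in range(n):
    p = next(r for r in range(i, n) if A[r][i] != 0)  # find a nonzero pivot
    A[i], A[p] = A[p], A[i]
    piv = A[i][i]
    A[i] = [x / piv for x in A[i]]
    for r in range(n):
        if r != i and A[r][i] != 0:
            f = A[r][i]
            A[r] = [A[r][j] - f * A[i][j] for j in range(n + 1)]

a, b, c, d = (A[r][n] for r in range(n))
# a = 14, b = 12, c = 14, d = 8, as in the answer.

# 3 initial flips plus the average of g over the 8 equally likely windows.
mean = 3 + (0 + 4 * a + 2 * b + d) / 8
```

Running this reproduces $g(HHT)=14$, $g(HTH)=12$, $g(HTT)=14$, $g(THH)=8$, and the overall expected waiting time $14$.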
My question is: where do the equations come from? For example, why is $0.5g(HTH)+0.5g(HTT)+1$ the average time to go from $HHT$ to $HHH$?
For each state, the expected time to reach $HHH$ is a well-defined number, and first-step analysis gives a system of equations that determines all of them at once. In your example: to go from $HHT$ to $HHH$, the first step must take you to either $HTH$ or $HTT$. With probability $0.5$ you move to $HTH$, and from there the expected time to reach $HHH$ is $g(HTH)$; the same reasoning applies to $HTT$. Averaging the two and adding $1$ for the step you have just taken gives $g(HHT)=0.5g(HTH)+0.5g(HTT)+1$.
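As a sanity check on this reasoning, one can estimate $g(HHT)$ by simulation: repeatedly run the window chain from $HHT$ until it hits $HHH$ and average the step counts. A minimal Monte Carlo sketch (the names `hitting_time` and `estimate` are mine):

```python
import random

random.seed(1)  # fixed seed so the estimate is reproducible

def hitting_time(start="HHT", target="HHH"):
    """Flip a fair coin until the last three flips read `target`,
    starting from the window `start`; return the number of flips."""
    window, steps = start, 0
    while window != target:
        window = window[1:] + random.choice("HT")  # one step of the chain
        steps += 1
    return steps

trials = 200_000
estimate = sum(hitting_time() for _ in range(trials)) / trials
# The exact value from the system of equations is g(HHT) = 14.
```

With this many trials the estimate lands very close to $14$, consistent with the first-step equations.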