My question is as follows:
Consider a time-homogeneous Markov chain $X = (X_0, X_1, \dots)$ with values in $\{1, 2, 3\}$ and one-step transition probabilities given by the following matrix:
$$ \begin{bmatrix} 1/4 & 1/2 & 1/4 \\ 1/3 & 0 & 2/3 \\ 1/2 & 0 & 1/2 \\ \end{bmatrix} $$
(The last entry of the third row must be $1/2$, not $1/3$, so that the row sums to $1$.)
Assume $X_0 = 1$. Let $T_{i1} := \min\{n \ge 1 : X_n = 1\}$ be the first time the chain reaches 1, given that it starts from state $i$, $i \in \{1, 2, 3\}$. By using first-step decomposition or by any other means, find the quantity
$\mu_{11} := E[T_{11} \mid X_0 = 1]$, i.e. the expected first time that the chain returns to 1, having started at 1.
My definition of first-step decomposition is $f_{ij}^* = p_{ij} + \sum_{l \ne j} p_{il} f_{lj}^*$, where $f_{ij}^* = P(T_{ij} < \infty \mid X_0 = i)$. Recursively I seem to have found that $f_{11}^* = f_{21}^* = f_{31}^* = 1$, which makes sense, though I am not sure whether this is even useful. I thought I could use these probabilities in the usual definition of expectation, as for a discrete random variable, but again I am not sure.
Any help would be much appreciated, thank you!
What you've found is the probability that the chain eventually returns to state $1$. It's good that this probability is $1$ for every starting state (we'd be in trouble if it weren't), but it's not the quantity we're looking for.
There's a similar first-step decomposition for hitting times. If $\mu_{ij}$ is the expected number of steps it takes to go from state $i$ to state $j$, then $$ \mu_{ij} = 1 + \sum_{k \ne j} p_{ik} \mu_{kj}. $$ The idea is that we take a step from $i$; if we land in state $j$ we're done, and if we land in state $k \ne j$, then we need $\mu_{kj}$ more steps on average; the $1$ accounts for the step we've just taken. Taking $i = 1$, $j = 1$ in this formula, together with the equations for $\mu_{21}$ and $\mu_{31}$, gives a small linear system whose solution is $\mu_{11}$.
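As a sanity check, the linear system above can be solved numerically. The sketch below assumes the third row of the transition matrix is $(1/2,\, 0,\, 1/2)$ (so that it sums to $1$; the matrix as typed in the question sums to $5/6$), and writes the equations $\mu_{i1} = 1 + \sum_{k \ne 1} p_{ik}\,\mu_{k1}$ in matrix form:

```python
import numpy as np

# Transition matrix; the third row is assumed to be (1/2, 0, 1/2)
# so that it sums to 1 (as typed in the question it sums to 5/6).
P = np.array([[1/4, 1/2, 1/4],
              [1/3, 0.0, 2/3],
              [1/2, 0.0, 1/2]])

# First-step decomposition for mean hitting times of state 1 (index 0):
#   mu_i1 = 1 + sum_{k != 1} p_ik * mu_k1,  i = 1, 2, 3.
# Zeroing the target column of P gives Q, and the system reads
#   (I - Q) mu = 1  (a vector of ones).
Q = P.copy()
Q[:, 0] = 0.0
mu = np.linalg.solve(np.eye(3) - Q, np.ones(3))

print(mu[0])  # mu_11 = 8/3 ≈ 2.6667
```

By hand: $\mu_{31} = 1 + \tfrac12\mu_{31}$ gives $\mu_{31} = 2$, then $\mu_{21} = 1 + \tfrac23\mu_{31} = \tfrac73$, and finally $\mu_{11} = 1 + \tfrac12\mu_{21} + \tfrac14\mu_{31} = \tfrac83$, which agrees with $\mu_{11} = 1/\pi_1$ for the stationary distribution.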