Let $$P = \begin{pmatrix} 0 & \frac12 &\frac12 \\ \frac13 & \frac12 & \frac16 \\ 0 & 1 & 0\end{pmatrix}$$ be the transition matrix of a Markov chain with state space $\{1,2,3\}$. Assuming that $X_0 = 1$ almost surely, compute the probability that the chain returns to state $1$ before it hits state $3$.
My question: let's define $u(i)$ as the probability of visiting state $1$ before state $3$, starting at $i$. Following other solutions (like Probability of reaching a state before a different state), I set up the following system of equations:
$$ u(1) = 1 \\ u(2) = p_{21}u(1) + p_{22}u(2) + p_{23}u(3) \\ u(3) = 0 $$
I need $u(1)$, but... it is already equal to $1$, so what is wrong with my thinking?
Your $u(i)$ is the probability of being at state $1$ before being at state $3$, where the starting position at time $0$ counts. This is not quite what you want: as you say, $u(1)$ is just $1$, yet clearly it is possible to reach $3$ before returning to $1$ even when you start at $1$.
To fix this, notice that $u$ gives exactly what you want if you apply it to the state after one step. So condition on the outcome of the first step and use the law of total probability:
$$P(\tau_1 < \tau_3 \mid X_0=1)=\sum_j P(\tau_1 < \tau_3 \mid X_0=1,X_1=j) P(X_1=j \mid X_0=1)$$
where $\tau_i$ is the first time $t>0$ at which $X_t=i$. By the Markov property, the first factor in the sum is just $u(j)$. Concretely, your equation for $u(2)$ reads $u(2) = \frac13 + \frac12 u(2)$, so $u(2) = \frac23$, and therefore $$P(\tau_1 < \tau_3 \mid X_0=1) = p_{11}u(1) + p_{12}u(2) + p_{13}u(3) = 0 \cdot 1 + \frac12 \cdot \frac23 + \frac12 \cdot 0 = \frac13.$$
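For a quick sanity check, here is a small Python sketch (NumPy assumed available) that solves the one-step equation for $u(2)$, applies the first-step conditioning, and confirms the result by Monte Carlo simulation:

```python
import numpy as np

# Transition matrix from the question; rows/columns correspond to states 1, 2, 3.
P = np.array([
    [0.0, 1/2, 1/2],
    [1/3, 1/2, 1/6],
    [0.0, 1.0, 0.0],
])

# Boundary values: u(1) = 1, u(3) = 0.
u1, u3 = 1.0, 0.0

# Solve u(2) = p21*u(1) + p22*u(2) + p23*u(3) for u(2).
u2 = (P[1, 0] * u1 + P[1, 2] * u3) / (1 - P[1, 1])  # 2/3

# First-step conditioning from state 1:
answer = P[0, 0] * u1 + P[0, 1] * u2 + P[0, 2] * u3
print(answer)  # 1/3

# Monte Carlo check: run the chain from state 1 until it hits 1 or 3.
rng = np.random.default_rng(0)
n_trials = 20_000
hits = 0
for _ in range(n_trials):
    state = 0  # state 1 (index 0)
    while True:
        state = rng.choice(3, p=P[state])
        if state == 0:       # returned to state 1 first
            hits += 1
            break
        if state == 2:       # hit state 3 first
            break
print(hits / n_trials)  # close to 1/3
```

The simulated frequency should agree with the analytic value $\frac13$ up to sampling noise.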