Let $(X_n)_{n\geq 0}$ be a symmetric random walk on $\mathbf{Z}$. Let $T := \min \{n\geq 0 : X_n = 0 \text{ or } X_n = N\}$, with $N\in\mathbf{N}$, and $f(n) := \mathbf{P}(X_T = N | X_0 = n)$ for $0 < n < N$. I'm trying to prove that $f(n) = (f(n + 1) + f(n - 1))/2$. The intuition is that we arrive at $n - 1$ or $n + 1$ with equal probability, and then the process "starts again", so that $f(n + 1) = \mathbf{P}(X_T = N | X_0 = n + 1)$ should be equal to $\mathbf{P}(X_T = N | X_1 = n + 1)$. So we have
\begin{align} f(n) ={} &\mathbf{P}(X_T = N | X_0 = n, X_1 = n - 1)\,\mathbf{P}(X_1 = n - 1 | X_0 = n) \\ &+ \mathbf{P}(X_T = N | X_0 = n, X_1 = n + 1)\,\mathbf{P}(X_1 = n + 1 | X_0 = n) \end{align}
and thus, using the Markov property, \begin{equation} 2f(n) = \mathbf{P}(X_T = N | X_1 = n - 1) + \mathbf{P}(X_T = N | X_1 = n + 1). \end{equation} Now, let $Y_n := X_{n + 1}$ for $n\geq 0$, and $S := \min \{n\geq 0 : Y_n = 0 \text{ or } Y_n = N\}$. Then $S = T - 1$ and $X_T = X_{S + 1} = Y_S$, so
\begin{equation} \mathbf{P}(X_T = N | X_1 = n + 1) = \mathbf{P}(Y_S = N | Y_0 = n + 1). \end{equation}
But I don't know how to prove $\mathbf{P}(Y_S = N | Y_0 = n + 1) = \mathbf{P}(X_T = N | X_0 = n + 1)$.
Also, I might be seeing this all wrong: even if the one-step transition probabilities of $Y$ are equal to those of $X$, the $T$-step transition probabilities of $X$ aren't the same as its $S = (T - 1)$-step transition probabilities, so after taking the first step the probabilities might change.
But then, how do you get $f(n) = (f(n - 1) + f(n + 1))/2$? I'm really confused.
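(Not part of the proof, but here is a quick Monte Carlo sanity check of the recurrence you're after; the choices $N = 10$, the number of trials, and the seed are arbitrary. It estimates $f(n)$ by simulating walks to absorption and compares $f(n)$ with $(f(n-1)+f(n+1))/2$.)

```python
import random

def estimate_f(n, N, trials=20_000, rng=random.Random(0)):
    """Estimate f(n) = P(X_T = N | X_0 = n) by simulating the walk to absorption."""
    hits = 0
    for _ in range(trials):
        x = n
        while 0 < x < N:          # stop when the walk hits 0 or N
            x += rng.choice((-1, 1))
        hits += (x == N)
    return hits / trials

N = 10
f = {n: estimate_f(n, N) for n in range(N + 1)}  # f(0) = 0 and f(N) = 1 exactly
for n in range(1, N):
    print(n, round(f[n], 3), round((f[n - 1] + f[n + 1]) / 2, 3))
```

The two printed columns agree up to Monte Carlo noise, consistent with $f(n) = (f(n-1)+f(n+1))/2$.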
Note that $Y$ and $X$ are Markov chains with the same transition probabilities. That means that when one conditions on $Y$ and $X$ starting from the same state, both evolve according to the same law, for example because $$\mathbb{P}(X_{n+1}=i_{n+1},\dots,X_{1}=i_{1}|X_{0}=i_{0}) = p_{i_{0},i_{1}}\cdots p_{i_{n},i_{n+1}} = \mathbb{P}(Y_{n+1}=i_{n+1},\dots,Y_{1}=i_{1}|Y_{0}=i_{0}).$$
In particular, this proves what you wanted: since $S$ and $T$ are first hitting times of the same subset of states for Markov chains with the same law, $(Y_0, Y_S)$ and $(X_0, X_T)$ are identically distributed, and then we are just asserting that two identically distributed random variables give the same probabilities: $$\mathbb{P}(X_T = N | X_0 = n+1) = \mathbb{P}(Y_S = N | Y_0 = n+1).$$ You can think in terms of paths as above for further intuition, or think more simply of random variables for the fundamental reason why this is the case.
Extra clarification: if $X$ and $Y$, or $S$ and $T$, appeared together inside the same event, there would indeed be a difference (consider something like $\{X_{1}=2, Y_{3}=5, T=8, S=7\}$). But here the two events are distinct and each involves only one of the pairs, so what we are doing can be thought of as just a renaming.
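To see the "renaming" concretely, here is a small simulation (illustration only; $N = 10$, $n = 4$, the trial count, and the seed are arbitrary choices). It estimates $\mathbf{P}(X_T = N | X_0 = n, X_1 = n + 1)$, which is $\mathbf{P}(Y_S = N | Y_0 = n + 1)$, and compares it with a fresh chain started at $n + 1$.

```python
import random

rng = random.Random(1)
N, n, trials = 10, 4, 50_000

def walk_until_absorbed(start):
    """Run the walk from `start` until it hits 0 or N; return the endpoint."""
    x = start
    while 0 < x < N:
        x += rng.choice((-1, 1))
    return x

# Shifted chain: start at n, keep only paths whose first step goes to n + 1;
# for those paths Y_0 = X_1 = n + 1 and Y_S = X_T.
hits_shifted, kept = 0, 0
for _ in range(trials):
    first = n + rng.choice((-1, 1))
    if first != n + 1:
        continue
    kept += 1
    hits_shifted += (walk_until_absorbed(first) == N)

# Fresh chain started directly at n + 1.
hits_fresh = sum(walk_until_absorbed(n + 1) == N for _ in range(trials))

print(hits_shifted / kept, hits_fresh / trials)  # both estimates agree up to noise
```

The two estimates agree up to Monte Carlo error, matching the renaming argument above.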