Let $P$ be the transition matrix of a Markov chain.
Let $p_{ij}^{n}=P\left[ X_{n}=j \mid X_{0}=i\right]$ be the $n$-step transition probability of a Markov chain from an initial state $i$ to a final state $j$.
If we define the time of the first return to state $i$ by a Markov chain started at $X_0 = i$ as
$T_{ii}=\min\left\{ n \geq 1 : X_{n}=i \right\},$
then the probability that the chain returns to state $i$ for the first time at step $n$ $\left( n \geq 1 \right)$ is
$f_{ii}^{n}=P\left[ T_{ii}=n \right]$.
$\sum_{n=1}^{\infty}f_{ii}^{n}$ seems to be the probability that a Markov chain started in state $i$ ever returns to state $i$ for the first time, summed over all possible step counts $n$.
$\sum_{n=1}^{\infty}p_{ii}^{n}$ seems to be the cumulative probability that a Markov chain started in state $i$ is back in state $i$, again summed over all possible step counts $n$.
Am I correct?
If not, how can I understand the difference between $\sum_{n=1}^{\infty}f_{ii}^{n}$ and $\sum_{n=1}^{\infty}p_{ii}^{n}$? What is the subtle difference between the two?
The interpretation of $\sum p^n_{ii}$ is the expected total number of visits of the chain to state $i$ at times $n \geq 1$.* The interpretation of $\sum f^n_{ii}$, on the other hand, is what you described in your question: the probability of ever returning to $i$.
To illustrate the difference between the two, consider a finite, irreducible Markov chain. Every state in such a chain is recurrent and will be visited infinitely many times, so $\sum p^n_{ii} = \infty$. However, $\sum f^n_{ii}$ is a probability (of eventual return), so it can never exceed $1$; in fact it equals $1$ exactly when state $i$ is recurrent.
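To see the distinction numerically, here is a quick Monte Carlo sketch (the example chain, the function name `estimate_sums`, and its parameters are my own illustration, not from the answer above). Take the two-state chain in which state $0$ moves to $0$ or $1$ with probability $1/2$ each and state $1$ is absorbing; then state $0$ is transient with $\sum_n f^n_{00} = 1/2$ (either you return at step $1$ or never) while $\sum_n p^n_{00} = \sum_n (1/2)^n = 1$.

```python
import random

def estimate_sums(n_trials=100_000, max_steps=200, seed=0):
    """Monte Carlo estimates of sum_n f^n_00 and sum_n p^n_00 for the
    chain: state 0 -> 0 or 1 w.p. 1/2 each, state 1 absorbing."""
    rng = random.Random(seed)
    total_visits = 0   # accumulates visits to state 0 at times n >= 1
    returned = 0       # counts trials in which the chain ever revisits 0
    for _ in range(n_trials):
        state, visits = 0, 0
        for _ in range(max_steps):
            if state == 0:                       # state 1 is absorbing
                state = 0 if rng.random() < 0.5 else 1
            if state == 0:
                visits += 1
        total_visits += visits
        returned += visits >= 1
    # fraction that ever returned, and mean number of returns
    return returned / n_trials, total_visits / n_trials

f_sum, p_sum = estimate_sums()
print(f"sum_n f^n_00 estimate: {f_sum:.3f}  (exact value: 1/2)")
print(f"sum_n p^n_00 estimate: {p_sum:.3f}  (exact value: 1)")
```

The two estimates differ because a trial that returns three times contributes $1$ to the first sum but $3$ to the second.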
*Proof: \begin{align*} \sum_{n=1}^{\infty} p^n_{ii} &= \sum_{n=1}^{\infty} \mathbb E_i [1_{X_n = i}] \\ &= \mathbb E_i \left[ \sum_{n=1}^\infty 1_{X_n = i} \right], \end{align*} which is exactly the expected number of visits to state $i$ at times $n \geq 1$.
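A standard consequence worth adding (this is the classical identity relating the two sums, not part of the proof above): for a transient state $i$, write $f = \sum_{n=1}^{\infty} f^n_{ii} < 1$. By the strong Markov property, each return restarts the chain, so the total number of returns $N$ is geometric, $P[N = k] = f^k(1-f)$, and

\begin{align*} \sum_{n=1}^{\infty} p^n_{ii} = \mathbb E_i[N] = \sum_{k=1}^{\infty} k\, f^k (1-f) = \frac{f}{1-f}. \end{align*}

In particular, the first sum is finite if and only if the second is strictly less than $1$, which makes the transient/recurrent dichotomy explicit.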