In Durrett's book, Chapter 5, Theorem 4.6, it is stated that if $p$ is irreducible and has a stationary distribution, then $\pi_i=\frac{1}{E_iT_i}$, where $T_i = \inf\{n \ge 1 : X_n = i\}$ is the first time the Markov chain returns to state $i$, and $E_iT_i$ is the expected return time to state $i$ given that the chain starts at $i$.
The proof of Theorem 4.7 also shows that if $p$ is irreducible and $i$ is positive recurrent, then $\pi_j=\frac{\sum_{n=0}^{\infty}P_i(X_n=j, T_i \gt n)}{E_iT_i}$ defines a stationary distribution.
Does that mean $\frac{\sum_{n=0}^{\infty}P_i(X_n=j, T_i \gt n)}{E_iT_i} = \frac{1}{E_jT_j}$? How can we prove that?
And how can we prove that $\sum_j\frac{1}{E_jT_j} = 1$ for an irreducible Markov chain with stationary distribution $\pi$?
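Before giving a proof, here is a quick numerical sanity check of both claims (the $3\times 3$ transition matrix is my own toy example, not from Durrett): compute $\pi$ as the left eigenvector of $P$ for eigenvalue $1$, compute each $E_iT_i$ by first-step analysis, and compare.

```python
import numpy as np

# A small irreducible chain (toy example of my own, not from Durrett).
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])
n = P.shape[0]

# Stationary distribution: left eigenvector pi P = pi, normalized to sum 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

def expected_return_time(P, i):
    """E_i T_i by first-step analysis: h_j = E_j[hitting time of i]
    solves (I - Q) h = 1, where Q is P restricted to states != i;
    then E_i T_i = 1 + sum_j P[i, j] * h_j, with h_i = 0."""
    n = P.shape[0]
    idx = [j for j in range(n) if j != i]
    Q = P[np.ix_(idx, idx)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    h_full = np.zeros(n)
    h_full[idx] = h
    return 1.0 + P[i] @ h_full

m = np.array([expected_return_time(P, i) for i in range(n)])
print(pi)               # stationary distribution
print(1.0 / m)          # should match pi, i.e. pi_i = 1/E_i T_i
print((1.0 / m).sum())  # should be 1
```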
I've found the answer myself; it's actually quite easy to prove.
In Durrett's book it is proved that if $p$ is irreducible and recurrent, then the stationary measure $\mu_a$, defined by $\mu_a(x)=\sum_{n=0}^{\infty}P_a(X_n=x, T_a>n)$, is unique up to a constant multiple; moreover, the proof of that theorem shows that the constant is $\nu(a)$, i.e. any stationary measure $\nu$ satisfies $\nu(x)=\nu(a)\mu_a(x)$.
Applying this with $\nu=\mu_i$ and $a=j$ gives $\mu_i(k)=\mu_i(j)\mu_j(k)$; here the constant is $\mu_i(j)$, since $\mu_j(j)=1$.
If we sum both sides over $k$,
$$ \sum_{k}\mu_i(j)\mu_j(k)=\sum_{k}\mu_i(k) $$
Since $\sum_{k}\mu_j(k)=\sum_{n=0}^{\infty}\sum_{k}P_j(X_n=k, T_j>n)=\sum_{n=0}^{\infty}P_j(T_j>n)=E_j(T_j)$, and likewise $\sum_{k}\mu_i(k)=E_i(T_i)$, we get
$$ \mu_i(j)E_j(T_j)=E_i(T_i) $$
Rearranging,
$$ \frac{\mu_i(j)}{E_i(T_i)} = \frac{1}{E_j(T_j)} $$
which answers the first question. Summing over $j$ and using $\sum_j \mu_i(j)=E_i(T_i)$ also gives $\sum_j\frac{1}{E_jT_j}=1$, which answers the second. We are done.
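The identity can also be checked numerically on a toy chain (again a $3\times 3$ matrix of my own invention): for $j\ne i$, the sum $\mu_i(j)=\sum_{n\ge 1}P_i(X_n=j,T_i>n)$ equals $P[i,\cdot]\,(I-Q)^{-1}$ over the states $\ne i$, where $Q$ is $P$ restricted to those states, and only the $n=0$ term contributes at $j=i$.

```python
import numpy as np

# Toy irreducible chain (my own example, not from Durrett).
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

def mu(P, i):
    """mu_i(j) = sum_{n>=0} P_i(X_n = j, T_i > n):
    expected number of visits to j strictly before the first return to i."""
    n = P.shape[0]
    idx = [j for j in range(n) if j != i]
    Q = P[np.ix_(idx, idx)]               # transitions that avoid state i
    m = np.zeros(n)
    # For j != i: sum_{n>=1} (P[i, idx] @ Q^{n-1})_j = P[i, idx] @ (I - Q)^{-1}
    m[idx] = P[i, idx] @ np.linalg.inv(np.eye(n - 1) - Q)
    m[i] = 1.0                            # only the n = 0 term contributes at j = i
    return m

E = [mu(P, i).sum() for i in range(3)]    # E_i T_i = sum_j mu_i(j)
# mu_i(j) / E_i T_i should equal 1 / E_j T_j for every pair (i, j):
ratios = np.array([[mu(P, i)[j] / E[i] for j in range(3)] for i in range(3)])
print(ratios)  # every row should equal (1/E_0, 1/E_1, 1/E_2)
```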
I explained it in my video today: https://youtu.be/lZrzjxQ3YF8