The following quantity $\tilde\pi$ is defined in the textbook *Markov Chains and Mixing Times* by David A. Levin, Yuval Peres, and Elizabeth L. Wilmer. Here $\tau_z^+ = \min \left\{t\geq 1 : X_t = z\right\}$.
Let $z \in \mathcal{X}$ be an arbitrary state of the Markov chain. We will closely examine the average time the chain spends at each state in between visits to $z$. To this end, we define $$ \begin{aligned} \tilde{\pi}(y) &:=\mathbf{E}_{z}(\text { number of visits to } y \text { before returning to } z) \\ &=\sum_{t=0}^{\infty} \mathbf{P}_{z}\left\{X_{t}=y, \tau_{z}^{+}>t\right\} \end{aligned} $$
I am having trouble understanding why the second equality above is true. I tried to use the fact that for non-negative integer-valued random variables, $\mathbb{E}T = \sum_{t\geq0} \mathbb{P}(T>t)$, but I couldn't make the proof work. Any help is appreciated.
Let $N$ denote the number of visits to $y$ before returning to $z$. Let $I_t$ be the random variable that is $1$ if $X_{t}=y$ and $\tau_{z}^{+}>t$, and zero otherwise. Then by definition, $N = \sum_{t\geq 0} I_t$. Taking expectations and swapping the expectation with the sum (justified by monotone convergence, or Tonelli, since the terms are non-negative) gives $$\mathbf{E}_z N = \sum_{t=0}^{\infty} \mathbf{E}_z I_t = \sum_{t=0}^{\infty} \mathbf{P}_z\left\{X_t = y, \tau_z^+ > t\right\},$$ which is exactly the second equality.
This is a ubiquitous trick in probability: express a random variable that counts something as a sum of 0/1 random variables (also called indicators), so that its expectation becomes a sum of probabilities.
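If it helps to see the identity numerically, here is a small sketch (the 3-state transition matrix and the choice $z=0$, $y=1$ are my own toy example, not from the book). It estimates the left side $\mathbf{E}_z N$ by simulating excursions from $z$, and computes the right side $\sum_t \mathbf{P}_z\{X_t=y,\tau_z^+>t\}$ exactly by summing taboo probabilities, obtained by zeroing out the transitions into $z$:

```python
import random

# Toy 3-state chain (states 0, 1, 2); rows sum to 1. Purely illustrative.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]
z, y = 0, 1  # count visits to y between visits to z

def visits_before_return(rng):
    """One excursion from z: count visits to y at times 0 <= t < tau_z^+."""
    count = 1 if y == z else 0  # the t = 0 term contributes only if y == z
    state = z
    while True:
        state = rng.choices(range(3), weights=P[state])[0]
        if state == z:          # returned to z: tau_z^+ reached, stop counting
            return count
        if state == y:
            count += 1

rng = random.Random(42)
n_runs = 200_000
mc_estimate = sum(visits_before_return(rng) for _ in range(n_runs)) / n_runs

# Exact value of sum_t P_z{X_t = y, tau_z^+ > t}: zero out the column of z,
# so that (Q^t)[z][y] is the probability of being at y at time t without
# having revisited z in between ("taboo" probability).
Q = [[P[i][j] if j != z else 0.0 for j in range(3)] for i in range(3)]
row = [1.0 if j == z else 0.0 for j in range(3)]  # distribution at t = 0
exact = row[y]                                    # t = 0 term
for _ in range(200):  # Q is substochastic, so the tail is negligible here
    row = [sum(row[i] * Q[i][j] for i in range(3)) for j in range(3)]
    exact += row[y]

print(f"Monte Carlo estimate of E_z[N]: {mc_estimate:.4f}")
print(f"Sum of taboo probabilities:     {exact:.4f}")
```

The two numbers agree up to Monte Carlo error, as the indicator argument predicts. (For this chain one can also check against the book's later result $\tilde\pi(y) = \pi(y)/\pi(z)$, which here works out to $4/3$.)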