Let the following discrete-time Markov chain (DTMC) be defined with its transition probabilities as follows: $$ \begin{aligned} &p_{0,0}=\frac{b_{0}}{b_{0}+b_{1}}, \quad p_{i, i+1}=\frac{b_{i+1}}{b_{i}+b_{i+1}} \text { for } i \geq 0, \quad \text { and } \quad p_{i, i-1}=\frac{b_{i}}{b_{i}+b_{i+1}} \text { for } i \geq 1,\\ &\text { where }\left\{\boldsymbol{b}_{\boldsymbol{i}}\right\} \text { is a strictly positive sequence such that } \sum_{i=0}^{\infty} \boldsymbol{b}_{i}<\infty . \end{aligned} $$ Let $\boldsymbol{\mu}_{i, j}$ be the mean time needed to go from state $\boldsymbol{i}$ to state $\boldsymbol{j}$, i.e., the expected number of steps until the chain enters state $\boldsymbol{j}$ for the first time, starting from state $\boldsymbol{i}$.
How can I obtain $\boldsymbol{\mu}_{i, i}$ and $\boldsymbol{\mu}_{i, i+1}$ in terms of the $\boldsymbol{b}_{i}$?
My approach was to use the definition of the mean first-passage time, $\mu_{ij}=\sum_{n=1}^{\infty} n f_{ij}^{(n)}$, where $f_{ij}^{(n)}$ is the probability of first passage from state $i$ to state $j$ in exactly $n$ steps. My issue is that I couldn't compute this probability in terms of the $b_i$: the state space is infinite, and the set of paths that first reach state $j$ in $n$ steps is complicated to enumerate. Maybe this is not the way to go, but it was the only approach I had in mind.
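To have something to check candidate formulas against, I can at least estimate $\mu_{i,j}$ by simulating the chain. A minimal sketch in Python, using the concrete summable sequence $b_i = 2^{-i}$ (my own choice for testing, not part of the problem statement):

```python
import random

# Assumed test sequence (my choice): b_i = 2^{-i}, which is strictly
# positive with sum_{i>=0} b_i = 2 < infinity.
def b(i):
    return 2.0 ** (-i)

def step(i, rng):
    """One transition of the chain from state i."""
    if i == 0:
        # p_{0,0} = b_0/(b_0+b_1); otherwise move up to state 1
        return 0 if rng.random() < b(0) / (b(0) + b(1)) else 1
    # p_{i,i-1} = b_i/(b_i+b_{i+1}); otherwise move up to i+1
    return i - 1 if rng.random() < b(i) / (b(i) + b(i + 1)) else i + 1

def mean_first_passage(i, j, n_runs=100_000, seed=0):
    """Monte Carlo estimate of mu_{i,j}: the mean number of steps until
    the chain first enters state j, starting from state i.  At least one
    step is taken, so i == j gives the mean recurrence time mu_{i,i}."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        state, steps = i, 0
        while True:
            state = step(state, rng)
            steps += 1
            if state == j:
                break
        total += steps
    return total / n_runs
```

For this particular sequence, a first-step analysis gives $\mu_{0,1} = (b_0+b_1)/b_1 = 3$ (the number of attempts to leave $0$ upward is geometric), and `mean_first_passage(0, 1)` agrees with that to within Monte Carlo error, so at least the simulator matches the transition rules.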
Any suggestions?