Consider a gambler betting on the outcome of a sequence of independent fair
coin tosses. If the coin comes up heads, she adds one dollar to her purse; if the coin lands tails up, she loses one dollar. If she ever reaches a fortune of $n$ dollars, she will stop playing. If her purse is ever empty, then she must stop betting.
The question is to find the expected time until the gambler stops betting, i.e. until she reaches either $0$ or $n$ dollars. We assume that she starts with $k$ ($0<k<n$) dollars.
Let $f_k$ be the expected time to reach $n$ or $0$ dollars starting from $k$ dollars. Therefore $f_k = \frac{1}{2}\cdot(1 + f_{k+1}) + \frac{1}{2}\cdot(1 + f_{k-1})$ .
Intuitively, this equation is clear to me: with probability $\frac{1}{2}$ the fortune increases or decreases by $1$, and we add $1$ because each toss takes one unit of time.
Can someone help me derive this equation rigorously, without appealing to intuition?
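As a quick sanity check of the recurrence (not a substitute for the rigorous proof asked for), one can estimate $f_k$ by plain Monte Carlo simulation of the fair-coin walk; the function name `simulate_stop_time` below is my own.

```python
import random

def simulate_stop_time(k, n, trials=100_000, rng=random.Random(0)):
    """Estimate f_k = E[time to reach 0 or n] for a fair-coin walk
    started at k dollars (Monte Carlo sanity check, not a proof)."""
    total = 0
    for _ in range(trials):
        fortune, steps = k, 0
        while 0 < fortune < n:
            # each toss: +1 dollar with prob 1/2, -1 dollar otherwise
            fortune += 1 if rng.random() < 0.5 else -1
            steps += 1
        total += steps
    return total / trials

# The well-known closed form for the fair game is f_k = k*(n-k),
# so the estimate for k=3, n=10 should be close to 21.
print(simulate_stop_time(3, 10))
```

The estimates satisfy the recurrence $f_k \approx 1 + \frac{1}{2}f_{k+1} + \frac{1}{2}f_{k-1}$ up to sampling noise.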
Let $(\xi_j)_{j \in \mathbb{N}}$ be a sequence of independent random variables such that $\mathbb{P}(\xi_j=1) = \mathbb{P}(\xi_j=-1)=1/2$ for all $j \in \mathbb{N}$. Denote by
$$S_m^{(\xi)} := \xi + \sum_{j=1}^m \xi_j$$
the fortune of the gambler (with initial fortune $\xi$) after $m$ coin tosses. If we define
$$\tau^{(k)} := \inf\{m \geq 0; S_m^{(k)} \in \{0,n\}\},$$
then the quantity you are interested in is
$$f_k = \mathbb{E}(\tau^{(k)}).$$
Let us introduce another (shifted) random walk by
$$T_m^{(\eta)} := \eta + \sum_{j=2}^m \xi_j$$
(here $\eta$ is some given random variable). For any $1 \leq k \leq n-1$ we have $S_0^{(k)} = k \notin \{0,n\}$, and therefore
$$\begin{align*} \tau^{(k)} &= \inf\{m \geq 1; (k+\xi_1)+\xi_2+\ldots+\xi_m \in \{0,n\}\} \\ &= \inf\{m \geq 1; T_m^{(k+\xi_1)} \in \{0,n\}\} \\ &= 1+ \inf\{m \geq 0; T_{m+1}^{(k+\xi_1)} \in \{0,n\}\}. \end{align*}$$
Thus,
$$\begin{align*} \tau^{(k)} &= 1+ 1_{\{\xi_1=1\}} \underbrace{\inf\{m \geq 0; T_{m+1}^{(k+1)} \in \{0,n\}\}}_{=: \sigma_{k+1}} \\ &\quad + 1_{\{\xi_1=-1\}} \underbrace{\inf\{m \geq 0; T_{m+1}^{(k-1)} \in \{0,n\}\}}_{=: \sigma_{k-1}}.\end{align*}$$
Since $(S_m^{(k+1)})_{m \geq 0}$ equals in distribution $(T_{m+1}^{(k+1)})_{m \geq 0}$, it follows that $\sigma_{k+1}$ equals $\tau^{(k+1)}$ in distribution. Similarly, $\sigma_{k-1}$ equals $\tau^{(k-1)}$ in distribution. In particular, $$f_{k+1} = \mathbb{E}(\sigma_{k+1}), \qquad f_{k-1} = \mathbb{E}(\sigma_{k-1}).$$ On the other hand, $\sigma_{k-1}$ and $\sigma_{k+1}$ are independent of $\xi_1$. Conditioning on $\xi_1$, we find
$$\begin{align*}f_k = \mathbb{E}(\tau^{(k)}) &= \mathbb{E}\bigg( \mathbb{E}(\tau^{(k)} \mid \xi_1) \bigg) \\ &= \mathbb{E}\bigg( \mathbb{E}(1+ 1_{\{\xi_1=1\}} \sigma_{k+1} + 1_{\{\xi_1=-1\}} \sigma_{k-1} \mid \xi_1) \bigg) \\ &= \mathbb{E}\bigg(1+ 1_{\{\xi_1=1\}} \underbrace{\mathbb{E}(\sigma_{k+1} \mid \xi_1)}_{\mathbb{E}(\sigma_{k+1}) = f_{k+1}} + 1_{\{\xi_1=-1\}} \underbrace{\mathbb{E}(\sigma_{k-1} \mid \xi_1)}_{\mathbb{E}(\sigma_{k-1}) = f_{k-1}} \bigg) \\ &= 1+ \frac{1}{2} f_{k+1} + \frac{1}{2} f_{k-1}. \end{align*}$$
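The recurrence $f_k = 1 + \frac{1}{2}f_{k+1} + \frac{1}{2}f_{k-1}$ with boundary conditions $f_0 = f_n = 0$ is a tridiagonal linear system, and one can solve it numerically to confirm it reproduces the known closed form $f_k = k(n-k)$. Below is a sketch using the Thomas algorithm; the function name `expected_stop_times` is my own.

```python
def expected_stop_times(n):
    """Solve f_{k-1} - 2 f_k + f_{k+1} = -2 for k = 1..n-1,
    with f_0 = f_n = 0, via the Thomas (tridiagonal) algorithm."""
    m = n - 1  # number of unknowns f_1, ..., f_{n-1}
    a, b, c, d = [1.0]*m, [-2.0]*m, [1.0]*m, [-2.0]*m
    # forward elimination
    for i in range(1, m):
        w = a[i] / b[i-1]
        b[i] -= w * c[i-1]
        d[i] -= w * d[i-1]
    # back substitution (f[0] and f[n] stay 0 by the boundary conditions)
    f = [0.0] * (n + 1)
    f[m] = d[m-1] / b[m-1]
    for i in range(m - 2, -1, -1):
        f[i+1] = (d[i] - c[i] * f[i+2]) / b[i]
    return f

# For n = 10, the solution agrees with the closed form f_k = k*(10-k).
print(expected_stop_times(10))
```

This only verifies the recurrence's consequences; the derivation above is what justifies the recurrence itself.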