In a book on economics, I have read that $\sum_{t = 1}^{\infty} \frac{r\Delta D}{(1 + r)^t} = \frac{r\Delta D}{r} = \Delta D$.
Here $t$ indexes the time period, $r$ is the interest rate, and $\Delta D$ is the change in debt.
Why can this equation be solved like this? Let’s say $r = 2$. Then, $\sum_{t = 1}^{\infty} \frac{2 \Delta D}{(1 + 2)^t}$ should result in something like $\frac{2\Delta D}{3}$, shouldn’t it?
What am I getting wrong here? Why does $\sum_{t = 1}^{\infty} \frac{r}{(1 + r)^t} = \frac{r}{r}$?
$$r > 0 \implies \frac{1}{1+r} < 1$$
so the series is a convergent geometric progression (GP). For $|x| < 1$,
$$\sum_{i=0}^{\infty}x^i = \frac{1}{1-x}, \qquad \text{hence} \qquad \sum_{t=1}^{\infty}x^t = \frac{x}{1-x}.$$
With $x = \frac{1}{1+r}$,
$$\sum_{t=1}^{\infty}\frac{1}{(1+r)^t} = \frac{\frac{1}{1+r}}{1-\frac{1}{1+r}} = \frac{1}{r},$$
so
$$\sum_{t=1}^{\infty}\frac{r\Delta D}{(1+r)^t} = r\Delta D \cdot \frac{1}{r} = \Delta D.$$
The mistake in your example is stopping at the first term: for $r = 2$, the first term is indeed $\frac{2\Delta D}{3}$, but the infinite sum $\frac{2\Delta D}{3} + \frac{2\Delta D}{9} + \frac{2\Delta D}{27} + \cdots$ converges to $\Delta D$.
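A quick numerical sanity check, taking the question's $r = 2$ and (as an arbitrary assumption) $\Delta D = 1$, shows the partial sums approaching $\Delta D$ rather than $\frac{2\Delta D}{3}$:

```python
# Partial sum of r*dD / (1+r)^t for t = 1..100.
# Assumptions: dD = 1.0 (any value works the same way), r = 2 as in the question.
r = 2.0
dD = 1.0
partial = sum(r * dD / (1 + r) ** t for t in range(1, 101))
print(partial)  # very close to dD = 1.0, not 2*dD/3
```

By $t = 100$ the remaining tail is on the order of $3^{-100}$, far below floating-point precision, so the printed value is numerically indistinguishable from $\Delta D$.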