I found the following assertion in an economics book:
For $r$ and $g$ small enough, $\frac{1+r}{1+g}\approx 1+r-g$
(where $r$ is the interest rate and $g$ is the growth rate of the economy)
I would like to know why this is true. I've tried to work it out myself, but I don't know where to start. What kind of approximation is this?
This is just a first-order Taylor expansion of the function $f(r,g) = \frac{1+r}{1+g}$ at the point $(r,g) = (0,0)$.
The partial derivatives of $f$ are $$\frac{\partial f}{\partial r} = \frac{1}{1+g}$$ and $$\frac{\partial f}{\partial g} = -\frac{1+r}{(1+g)^2}.$$
Evaluated at $(0,0)$, these give $\frac{\partial f}{\partial r}(0,0) = 1$ and $\frac{\partial f}{\partial g}(0,0) = -1$, and $f(0,0) = 1$. Hence, the first-order Taylor expansion at $(0,0)$ is $$f(r,g) \approx f(0,0) + r\,\frac{\partial f}{\partial r}(0,0) + g\,\frac{\partial f}{\partial g}(0,0) = 1 + r - g.$$
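As a quick sanity check (the specific values $r = 0.03$, $g = 0.02$ are my own illustrative choices), a few lines of Python show that the error is second-order small. Expanding directly, the exact error is $\frac{1+r}{1+g} - (1+r-g) = \frac{g(g-r)}{1+g}$, which involves only products of the small quantities:

```python
r, g = 0.03, 0.02  # a small interest rate and growth rate

exact = (1 + r) / (1 + g)
approx = 1 + r - g
error = exact - approx

# The error equals g*(g - r)/(1 + g): a product of small terms,
# so it is negligible compared to r and g themselves.
print(f"exact  = {exact:.6f}")
print(f"approx = {approx:.6f}")
print(f"error  = {error:.2e}")
```

With these values the approximation is accurate to about four decimal places, and the agreement improves as $r$ and $g$ shrink.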