Taylor series up to first order

I have an expression of the form:

$$\frac{1}{1+f(x)}$$ which I want to simplify using a Taylor series up to linear order. I know that in general:

$$\frac{1}{1+x}=1-x+x^2-x^3+\dots$$ provided $|x|<1$, or is the condition $|x|\ll 1$?

I want to know whether this expansion is also applicable to the case above, i.e., whether $$\frac{1}{1+f(x)}=[1+f(x)]^{-1}=1-f(x)+f(x)^2-f(x)^3+\dots$$ holds under the condition $|f(x)|<1$ (or $|f(x)|\ll 1$?) for any $f(x)$. A linear-order approximation would then yield $$\frac{1}{1+f(x)}\approx 1-f(x),$$ which should be a reasonable approximation when $|f(x)|\ll 1$.
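To convince myself numerically, here is a minimal sketch checking the linear approximation for one concrete (hypothetical) choice of $f$, namely $f(x)=0.01\sin x$, so that $|f(x)|\le 0.01\ll 1$. If the expansion is valid, the error of the linear truncation should be on the order of $f(x)^2\approx 10^{-4}$:

```python
import numpy as np

# Hypothetical test function with |f(x)| <= 0.01 << 1.
def f(x):
    return 0.01 * np.sin(x)

x = np.linspace(0.0, 2.0 * np.pi, 100)
exact = 1.0 / (1.0 + f(x))   # the original expression
linear = 1.0 - f(x)          # first-order Taylor approximation

# If the expansion holds, the worst-case error should be ~ f(x)^2 ~ 1e-4.
print(np.max(np.abs(exact - linear)))
```

Running this gives a maximum error of about $10^{-4}$, consistent with the neglected quadratic term, but I would like to confirm the general statement.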