Taylor approximation and monotonicity of the function


Consider a real function $f(x)$. If a Taylor approximation of the function is, say, decreasing, can I conclude that the function itself is decreasing near the point where the approximation is made?

As an example

$$f(x)=\log\left(1+\frac{1}{x}\right) \sim_{x \to \infty} \frac{1}{x}.$$ Since $\frac{1}{x}$ is decreasing, can I conclude that $f(x)$ is also decreasing as $x \to \infty$?
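The asymptotic equivalence itself is easy to check numerically; a minimal sketch in Python (the sample points are arbitrary):

```python
import math

# Check that log(1 + 1/x) / (1/x) -> 1 as x -> infinity,
# i.e. the ratio of f(x) to its asymptotic approximation tends to 1.
for x in (10.0, 100.0, 1000.0, 10000.0):
    ratio = math.log(1 + 1/x) / (1/x)
    print(x, ratio)  # ratio approaches 1 as x grows
```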

Is there any theorem that links the Taylor approximation with the monotonicity of the function?

Best answer:

Consider $f(x) = x + 2x^2\sin(1/x)$ for $x\ne 0$, with $f(0)=0$. Then $f'(0) = 1$, so the Taylor polynomial of degree one for $f$ at $0$ is just $x$, which is increasing. But $f$ is not increasing in any neighborhood of $0$: compute $f'(x) = 1 + 4x\sin(1/x) - 2\cos(1/x)$ for $x \ne 0$ and verify that $f'(x) < 0$ along a sequence tending to $0$.
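A quick numerical sketch of that last verification, using the sequence $x_n = 1/(2\pi n)$, chosen so that $\sin(1/x_n)=0$ and $\cos(1/x_n)=1$:

```python
import math

# f(x) = x + 2*x**2*sin(1/x), so f'(x) = 1 + 4*x*sin(1/x) - 2*cos(1/x) for x != 0.
def f_prime(x):
    return 1 + 4 * x * math.sin(1/x) - 2 * math.cos(1/x)

# Along x_n = 1/(2*pi*n): sin(1/x_n) = 0 and cos(1/x_n) = 1,
# so f'(x_n) = 1 - 2 = -1 < 0, even though f'(0) = 1.
for n in (1, 10, 100, 1000):
    x_n = 1 / (2 * math.pi * n)
    print(n, f_prime(x_n))  # equals -1 up to floating-point rounding
```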

A more ambitious example is $f(x) = x + e^{-1/x^2}\sin\left(e^{1/x^4}\right)$ for $x \ne 0$, with $f(0)=0$. Here $f\in C^\infty(\mathbb R)$, $f(0) = 0$, $f'(0)=1$, but all higher derivatives of $f$ at $0$ vanish. Thus every Taylor polynomial of $f$ at $0$ of degree at least $1$ is just the polynomial $x$. Yet, as above, $f'(x) < 0$ along a sequence tending to $0$.