Please suppose you have an unknown function r(x), defined on the interval [-5, 5].
You know that:
r(0) = 1;
r'(0) = -1;
r''(0) = 1.
Please estimate the value of r(x) at the point x = 1/10 (= 0.1), explaining the method or the theorems you have used to obtain the estimate.
Thank you in advance for your kind help.
Please suppose that, in a certain classroom, the concept of Taylor series has not yet been introduced, so you are not allowed to use Taylor series to solve this problem.
Since Taylor series are off-limits, we can build the estimate from two more elementary tools: the linear approximation and the Fundamental Theorem of Calculus.

Near $x = 0$, the derivative $r'$ can itself be linearly approximated using its value and its derivative at $0$:
$$ r'(t) \approx r'(0) + t\,r''(0) = -1 + t. $$
By the Fundamental Theorem of Calculus, $r(x) = r(0) + \int_0^x r'(t)\,dt$, so substituting the linear approximation gives
$$ r(x) \approx 1 + \int_0^x (-1 + t)\,dt = 1 - x + \frac{x^2}{2}. $$
At $x = 0.1$:
$$ r(0.1) \approx 1 - 0.1 + \frac{0.01}{2} = \frac{181}{200} = 0.905. $$
(This is, of course, the same quadratic one would obtain from a degree-2 Taylor polynomial, but derived here without invoking that theorem.)
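As a quick sanity check, the quadratic built from the three given values can be evaluated numerically. This is only a sketch: the true r is unknown, so the code evaluates the approximating polynomial, not r itself.

```python
# Quadratic approximation of r near 0, built only from the given data:
# p(x) = r(0) + r'(0)*x + r''(0)*x^2 / 2
R0, R1, R2 = 1.0, -1.0, 1.0  # given: r(0), r'(0), r''(0)

def p(x: float) -> float:
    """Evaluate the quadratic approximation of r at x."""
    return R0 + R1 * x + R2 * x**2 / 2

print(p(0.1))  # approximately 0.905 = 181/200
```

Evaluating at x = 0.1 reproduces the hand computation above, up to floating-point rounding.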