Finding $\sin 0.01$ to a first-order approximation (in the sense of a Taylor series expansion around $0$)


I am trying to understand exactly what I need to calculate here:

To a first-order approximation (in the sense of a Taylor series expansion around 0), what is $\sin 0.01$?

If I understood it correctly, I have to compute the first-order Taylor series of the function $f(x) = \sin(x)$ and evaluate it at $x = 0.01$.

I get the following:

$$f(x) = \sin(a) + \cos(x)(x-a)$$

and if I plug in $x = 0$ and $a = 0.01$ I just get $0.01$ as the answer again.


2 Answers

BEST ANSWER

You're exactly right (the answer is correct)! You should expect this because of the so-called small-angle approximation, $\sin x \approx x$ for $x \approx 0$. Since $0.01 \approx 0$, we have $\sin(0.01) \approx 0.01$.

Note, however, that the first-order approximation is $$ T_1(x) = f(a) + f'(a)(x-a) $$ where $a = 0$ and $x$ is a variable (which we will set to $0.01$). You have the labels $x$ and $a$ swapped in your equation. With the correct labeling, $T_1(x) = 0 + 1\cdot(x-0) = x$, so that $\sin(0.01) \approx T_1(0.01) = 0.01$.
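The approximation is easy to check numerically; here is a minimal sketch in Python, where the helper `t1` is just the first-order polynomial $T_1(x) = x$ derived above:

```python
import math

def t1(x):
    # First-order Taylor polynomial of sin around a = 0: T1(x) = x
    return x

x = 0.01
approx = t1(x)        # 0.01
exact = math.sin(x)   # 0.00999983...

# The two agree to about 7 decimal places at x = 0.01.
print(approx, exact, abs(exact - approx))
```

The printed difference is on the order of $10^{-7}$, which is consistent with the neglected higher-order terms of the expansion.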

ANSWER

We have

$$f(x) = \sin(0) + \cos(0)(x-0)+o(x-0)$$

therefore the first-order approximation is

$$p(x) = 0 + 1\cdot x=x$$
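To quantify how good this approximation is (a remark not in the original answer): since $\sin''(0) = 0$, the $x^2$ term of the expansion vanishes, so $T_1$ coincides with the second-order polynomial and the Lagrange remainder of order three bounds the error:

```latex
% f'''(x) = -\cos x, so the third-order Lagrange remainder gives
\[
  |\sin x - x|
  = \left| \frac{-\cos\xi}{3!}\, x^3 \right|
  \le \frac{|x|^3}{6},
  \qquad \xi \text{ between } 0 \text{ and } x,
\]
\[
  |\sin(0.01) - 0.01| \le \frac{(0.01)^3}{6} \approx 1.7 \times 10^{-7}.
\]
```

So the first-order value $0.01$ is already accurate to about seven decimal places.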