MIT old Calculus Exam Question about estimating $\sin(\pi+\frac 1 {100})$ to two decimal places


In Exam 2 on this page, the first question asks to approximate $\sin(\pi + \frac 1 {100})$ to two decimal places. The solution they give seems incomplete: they simply say $\sin(\pi+\frac 1 {100}) \approx \sin(\pi)+\cos(\pi)\cdot\frac 1 {100} = 0 - 1\cdot\frac 1{100} = -0.01$. Nowhere did they prove that this approximation is good enough to guarantee two decimal places of correctness. Am I missing something?


There is 1 answer below.


Apparently they used the angle addition identity $$\sin(x+y) = \sin x \cos y + \cos x \sin y.$$ This gives $$\sin(\pi + 0.01) = \sin \pi \cos (0.01) + \cos \pi \sin (0.01) = - \sin (0.01).$$
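As a quick sanity check (not part of the original solution), the exact reduction $\sin(\pi + x) = -\sin x$ can be verified numerically:

```python
import math

x = 0.01
lhs = math.sin(math.pi + x)
# Angle addition identity: sin(pi + x) = sin(pi)cos(x) + cos(pi)sin(x)
rhs = math.sin(math.pi) * math.cos(x) + math.cos(math.pi) * math.sin(x)

# Both sides equal -sin(x) up to floating-point rounding
print(lhs, rhs, -math.sin(x))
```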

This much is algebraically exact; no approximation has been made yet. To evaluate $\sin(0.01)$, the linear approximation $$\sin x = x + O(x^3)$$ is used. More precisely, the series expansion about $x = 0$, $$\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots,$$ is alternating with terms decreasing in magnitude for $|x| < 1$, so the truncation error is bounded by the first omitted term: for $|x| \le 10^{-2}$ it is at most $\frac{(10^{-2})^3}{3!} < 10^{-6}$. This is the missing justification. Hence $$\sin(0.01) = 0.01 + \epsilon, \qquad |\epsilon| < 10^{-6},$$ and therefore $\sin(\pi + 0.01) = -0.01 - \epsilon$, which is $-0.01$ to two decimal places.
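The error bound above can be checked numerically; this is a small sketch (not part of the exam solution) comparing the linear approximation $-x$ against the library sine:

```python
import math

x = 0.01
# "Exact" value via the library sine; equals -sin(x) by the addition formula
exact = math.sin(math.pi + x)
# Linear approximation sin(x) ~ x gives sin(pi + x) ~ -x
approx = -x

# Alternating-series remainder bounds the error by x**3 / 3!
error = abs(exact - approx)
bound = x**3 / math.factorial(3)

print(f"exact  = {exact:.10f}")
print(f"approx = {approx}")
print(f"error  = {error:.2e}  (bound {bound:.2e})")
print(f"rounded to two decimals: {round(exact, 2)}")
```

The observed error is on the order of $10^{-7}$, comfortably inside the $10^{-6}$ bound, so rounding to two decimal places is safe.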