I'm learning how to differentiate a parabola, $y = x^2$, and to establish that $\frac{dy}{dx} = 2x$.
The textbook I'm using tries to verify that $\frac{dy}{dx} = 2x$. It states: "Suppose $x=100$, and therefore $y=10,000$. Then let $x$ grow till it becomes $101$ (that is, $dx=1$). Then the enlarged $y = 101^2 = 10,201$. But if we agree that we may ignore small quantities of the second order, $1$ may be rejected as compared with $10,000$; so we may round off the enlarged $y$ to $10,200$, therefore $$\frac{dy}{dx} = \frac{200}{1} = 200.$$"
My question is: why can you round off the $1$ in $10,201$ to make $10,200$? In what situations is it acceptable to do this? And if you can regularly do this, how can calculus be accurate? Or is calculus just an approximation?
I suggest looking at more values: $$ (100 + 1)^2 = 100^2 + 2 \cdot 100 \cdot 1 + 1^2 = 10201 \\ \Longrightarrow \frac{(100 + 1)^2 - 100^2}{1} = 201 $$ $$ (100 + 0.1)^2 = 100^2 + 2 \cdot 100 \cdot 0.1 + 0.1^2 = 10020.01 \\ \Longrightarrow \frac{(100 + 0.1)^2 - 100^2}{0.1} = 200.1 $$ $$ (100 + 0.01)^2 = 100^2 + 2 \cdot 100 \cdot 0.01 + 0.01^2 = 10002.0001 \\ \Longrightarrow \frac{(100 + 0.01)^2 - 100^2}{0.01} = 200.01 $$ The less we grow $x$, the less the second-order term influences the difference quotient, and the closer the quotient gets to $200$. The argument in the book is only meant to exemplify this fact, very unrigorously.
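If it helps to see this numerically, here is a minimal Python sketch (my own illustration, not from the book) that evaluates the same difference quotient $\frac{(100+h)^2 - 100^2}{h}$ for ever-smaller increments $h$:

```python
# Difference quotient of y = x^2 at x = 100 for shrinking increments h.
# As h -> 0, the quotient approaches the derivative 2 * 100 = 200.
x = 100
for h in [1, 0.1, 0.01, 0.001, 0.0001]:
    quotient = ((x + h) ** 2 - x ** 2) / h
    print(f"h = {h:<8} quotient = {quotient}")
```

The printed quotients ($201$, $200.1$, $200.01$, ...) shrink toward $200$, matching the values worked out above: the second-order term $h^2$ contributes only $h$ to the quotient, and that contribution vanishes in the limit. (Expect tiny floating-point noise in the last digits of the output.)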