Why do smaller intervals give more accurate results?


I've been doing a lot of derivatives, but I'm stumped on this one question.

Why is it that smaller changes in $x$ correspond to a more accurate value of the slope at that point? Or rather, how do we know that smaller changes give us the slope at that point?

I have done quite a bit of calculus without really understanding this and I would appreciate it if anyone could help me understand the reasoning behind this.


BEST ANSWER

On the graph of some function $y=f(x)$ (e.g. you could take $f(x)=x^2$ to visualise), pick two points $x_0$ and $x_0+h$ and their corresponding $y$ values. So, you have points $A=(x_0,f(x_0))$ and $B=(x_0+h,f(x_0+h))$. Now draw a line through $A$ and $B$.

Recall that the derivative of a function is essentially the gradient of its tangent line at a point. Now, if $h$ is large, the line $AB$ doesn't look like a tangent line at all. But as you shrink $h$, the line gets closer and closer to becoming a tangent line at $A$. If you don't believe me, just draw the picture yourself (or use some software like GeoGebra) and see with your own eyes.
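You can also watch this happen numerically. Here is a short Python sketch (the function and point are my own choices for illustration) computing secant slopes of $f(x)=x^2$ at $x_0=1$, where the true tangent slope is $2$:

```python
# Secant slope of f(x) = x^2 between x0 and x0 + h.
# For this f the secant slope works out to exactly 2*x0 + h,
# so as h shrinks it approaches the tangent slope 2*x0.

def f(x):
    return x * x

def secant_slope(x0, h):
    return (f(x0 + h) - f(x0)) / h

x0 = 1.0  # true tangent slope here is 2
for h in [1.0, 0.5, 0.1, 0.01, 0.001]:
    # slope heads toward 2 as h shrinks
    print(f"h = {h:>6}:  secant slope = {secant_slope(x0, h):.4f}")
```

Each halving of $h$ halves the gap between the secant slope and the tangent slope, which is exactly the "smaller interval, more accurate slope" behaviour the question asks about.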

Another way to see this: it can be shown, mathematically, that any "sufficiently nice" function $f(x)$ looks like a straight line when you zoom in enough, and this straight line is the tangent line. In other words, the tangent line to a graph at a point is usually a very good approximation to the function near the point of tangency. I say "sufficiently nice" because some "not nice" functions, like $f(x)=|x|$, don't satisfy this condition: at $x=0$ there is a sharp corner, and no amount of zooming makes a corner look like a single straight line. Roughly speaking, you need the function to be smooth, without these sharp turns---this condition is known as "differentiability" and determines whether or not the derivative is defined. So, for a differentiable function, the points $A,B$ approximately lie on the same tangent line for small $h$, and thus the slope of $AB$ is approximately the derivative at $A$.
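To see the contrast with a non-differentiable function, compare the one-sided secant slopes of $f(x)=|x|$ at $x=0$ in a quick sketch (my own code, not part of the answer above): shrinking $h$ never reconciles the two sides.

```python
# One-sided secant slopes of f(x) = |x| at x = 0.
# From the right the slope is always +1, from the left always -1:
# shrinking h never makes the two sides agree, so there is no
# single tangent slope at the corner.

def secant_slope(f, x0, h):
    return (f(x0 + h) - f(x0)) / h

f = abs
for h in [0.1, 0.01, 0.001]:
    right = secant_slope(f, 0.0, h)   # +1.0 every time
    left = secant_slope(f, 0.0, -h)   # -1.0 every time
    print(f"h = {h}: right = {right}, left = {left}")
```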


To me, Taylor series gives the best explanation.

The formula is $f(x+h) = f(x) + hf'(x) + \frac{h^2}{2!}f''(x) + \cdots + \frac{h^n}{n!}f^{(n)}(x) + \cdots$

If we just look at the first two terms as an approximation we get

$f(x+h) \approx f(x)+hf'(x)$, with an error dominated by the next term, which is proportional to $h^2$. This means that the smaller $h$ gets, the more accurate the approximation is.
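That $h^2$ scaling is easy to check numerically. A sketch (function and point chosen by me for illustration) using $f(x)=e^x$, whose derivative is also $e^x$:

```python
import math

# Error of the linear approximation f(x+h) ≈ f(x) + h*f'(x)
# for f(x) = e^x at x = 1.  The Taylor remainder says the error
# is roughly (h^2 / 2!) * f''(x) = (h^2 / 2) * e, so error / h^2
# should settle near e/2 ≈ 1.359 as h shrinks.

x = 1.0
for h in [0.1, 0.01, 0.001, 0.0001]:
    exact = math.exp(x + h)
    linear = math.exp(x) + h * math.exp(x)
    err = abs(exact - linear)
    print(f"h = {h:>7}: error = {err:.3e}, error/h^2 = {err / h**2:.4f}")
```

Dividing $h$ by 10 divides the error by roughly 100, which is the signature of an $h^2$ error term.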


A differentiable function is precisely a function where, as you say, smaller changes in $x$ correspond to a more accurate value of the slope at that point.

It is easy to produce examples of functions that are not differentiable, i.e. where smaller changes in $x$ do not produce better approximations to the slope. Any point where a function is not continuous is also a point where the function is not differentiable.
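For a concrete case, here is a sketch (my own example) with a step function, which is discontinuous at $0$: the secant slope across the jump blows up instead of settling.

```python
# Secant slope across a jump discontinuity: f is a step function,
# 0 for x < 0 and 1 for x >= 0.  The slope of the secant from -h
# to +h is (1 - 0) / (2h) = 1/(2h), which grows without bound as
# h shrinks -- smaller h makes the "slope" worse, not better.

def step(x):
    return 1.0 if x >= 0 else 0.0

for h in [0.1, 0.01, 0.001]:
    slope = (step(h) - step(-h)) / (2 * h)
    print(f"h = {h}: secant slope = {slope}")
```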

More exotically, there are functions which are continuous everywhere but differentiable nowhere; the classic example is the Weierstrass function. Roughly, the function oscillates too wildly at every point to have a well-defined slope.
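As a rough numerical illustration (a truncated sum with parameters of my own choosing, not a proof), one can watch the secant slopes of a partial Weierstrass sum at $x=0$ refuse to settle as $h$ shrinks along powers of $1/13$:

```python
import math

# Truncated Weierstrass function W(x) = sum of a^n * cos(b^n * pi * x)
# with a = 0.5, b = 13, which satisfies the classical condition
# a*b > 1 + 3*pi/2.  Along h = 13^-k the secant slopes at x = 0
# grow without bound instead of converging to a single value --
# the graph oscillates at every scale.

A, B, N = 0.5, 13, 20  # illustrative truncation, chosen here

def w(x):
    return sum(A**n * math.cos(B**n * math.pi * x) for n in range(N + 1))

for k in range(1, 7):
    h = B ** -k
    slope = (w(h) - w(0.0)) / h
    print(f"h = 13^-{k}: secant slope = {slope:.1f}")
```

Because the (untruncated) function has wiggles at every scale, each smaller $h$ just lands on a new wiggle, so smaller intervals never produce a better slope estimate here.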