How can the second derivative of a function be used to find whether a point is a local minimum or local maximum?


In my A-level textbook, it states that if there is a stationary point at $x=a$ and $f''(a)>0$ then the point is a local minimum because "the gradient is increasing from a negative value to a positive value, so the stationary point is a minimum." I'm finding it difficult to understand what it means by "the gradient is increasing". Between a range of values such as $x=1$ and $x=2$, I can comprehend the concept that the gradient has increased, but it feels like the gradient at a single point has to be fixed. For example, the first derivative of $f(x)=x^2$ is $f'(x)=2x$, and the second derivative is $f''(x)=2$. At $x=1$, the tangent to the curve has gradient $f'(1)=2$, and at $x=2$, the gradient is $f'(2)=4$. Therefore, the gradient has increased between those points, but at a single point the gradient seems like it must be constant. Where have I gone wrong?
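To convince myself that the gradient at a single point really is one fixed number, I checked it numerically (a quick sketch of my own, using a central difference):

```python
# Numerically estimate the gradient of f(x) = x^2 at single points with a
# central difference, (f(x + h) - f(x - h)) / (2h).  The gradient AT a point
# is a fixed number (f'(1) = 2, f'(2) = 4); what increases is the gradient
# *function* f'(x) = 2x as x moves from 1 to 2.

def f(x):
    return x * x

def gradient(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

print(round(gradient(f, 1), 6))  # 2.0 -- fixed value at x = 1
print(round(gradient(f, 2), 6))  # 4.0 -- fixed value at x = 2
```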


BEST ANSWER

For $a$ to be a stationary point, $f'(a)=0$.

The second derivative of the function represents the gradient of the gradient, and therefore can be used to find whether the gradient is increasing or decreasing.

If $f''(a)>0$, then this says the gradient is increasing at $a$. Since $f'(a)=0$, an increasing gradient must be going

  • from a negative value just before $a$,
  • through zero at $a$,
  • to a positive value just after $a$.

When the gradient increases from a negative value to a positive value, it must have been zero at some point in between; here that point is $x=a$ itself.

Now, if the gradient goes from negative to positive, then the curve changes its nature from decreasing to increasing. This happens only in the immediate neighborhood of a local minimum, if you think about it.
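As a concrete illustration (the example function is my own choice, not from the question), here is the test applied to $f(x)=x^3-3x$, whose gradient $f'(x)=3x^2-3$ vanishes at $x=\pm 1$:

```python
# Classify the stationary points of the example f(x) = x^3 - 3x using the
# sign of the second derivative.  f'(x) = 3x^2 - 3 vanishes at x = -1 and
# x = 1; f''(x) = 6x is the "gradient of the gradient" at those points.

def f2(x):
    return 6 * x  # f''(x) for f(x) = x^3 - 3x

def classify(a):
    """Apply the second-derivative test at a stationary point a."""
    if f2(a) > 0:
        return "local minimum"   # gradient goes negative -> 0 -> positive
    if f2(a) < 0:
        return "local maximum"   # gradient goes positive -> 0 -> negative
    return "test inconclusive"   # f''(a) = 0 tells us nothing by itself

print(classify(1))    # local minimum
print(classify(-1))   # local maximum
```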

Picture tangent lines to the curve on either side of the minimum: the tangent just to the left (the blue line in the original figure) has a gradient less than $0$, and the tangent just to the right (the green line) has a gradient greater than $0$.

See what I mean?

Your textbook is a little incomplete. When they say the gradient has "increased", they mean the gradient's sign has changed in the neighborhood of $a$: it is negative at $a-h$ and positive at $a+h$ for arbitrarily small $h>0$.



I think this is a question that shows you're engaging with your material. That's good.

The last thing you need to understand why this works is the notion of continuity. Forget for the moment about second derivatives and think instead about any function $g(x)$ continuous on its domain. Then at any particular point $x=x_0$ we have that the values of $g(x)$ at points $x$ close to $x_0$ are close to $g(x_0).$ This is just what continuity means. When you really think about what this means if you set $g(x)=f''(x),$ then you get the answer to your problem.

The value of $f''$ at a particular point $x_0$ is enough to tell us whether $f'$ is increasing in an interval about $x_0$ because if $f''$ is continuous at that point, then its sign is constant in some interval about $x_0.$

So this test works if $f$ is twice continuously differentiable at its stationary points.
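Here is a small sampling check of that continuity point (an illustration of my own, not a proof): for $f(x)=\cos x$, the point $x_0=\pi$ is stationary with $f''(\pi)=1>0$, and because $f''$ is continuous it stays positive on a whole interval about $\pi$.

```python
import math

# For f(x) = cos(x): f'(pi) = -sin(pi) = 0 and f''(pi) = -cos(pi) = 1 > 0.
# Since f''(x) = -cos(x) is continuous, its sign is constant on an interval
# about pi, so f' is increasing there and pi is a local minimum.

def f2(x):
    return -math.cos(x)  # f'' for f(x) = cos(x)

x0 = math.pi
samples = [x0 + t / 10 for t in range(-5, 6)]  # x0 - 0.5 ... x0 + 0.5
print(all(f2(x) > 0 for x in samples))         # True
```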


As @Ted Shifrin probably intended to point out, we can be sure that this test works even if $f''$ is discontinuous at $x_0$ as follows.

Think of the test as follows. Suppose $f'(x_0)=0$ and $f''(x_0)>0$, where $f$ is merely twice differentiable at $x_0$ (no continuity of $f''$ assumed). By the definition of the derivative, $$f''(x_0)=\lim_{x\to x_0}\frac{f'(x)-f'(x_0)}{x-x_0}=\lim_{x\to x_0}\frac{f'(x)}{x-x_0}>0,$$ so the quotient $f'(x)/(x-x_0)$ is positive for all $x$ sufficiently close to $x_0$. Hence $f'(x)<0$ just to the left of $x_0$ and $f'(x)>0$ just to the right, which makes $x_0$ a local minimum.
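A numeric sanity check of this: when $f'(x_0)=0$ and $f''(x_0)>0$, the ratio $f'(x)/(x-x_0)$ tends to $f''(x_0)$ and so stays positive near $x_0$, which forces $f'$ to change sign from negative to positive there. The example $f(x)=x^2+x^3$ with $x_0=0$ is my own choice.

```python
# For f(x) = x^2 + x^3 we have f'(x) = 2x + 3x^2, f'(0) = 0, f''(0) = 2 > 0.
# The ratio f'(x) / (x - x0) equals 2 + 3x, which stays close to f''(0) = 2
# (and in particular positive) for x near x0 = 0.

def fprime(x):
    return 2 * x + 3 * x * x  # f'(x) for f(x) = x^2 + x^3

x0 = 0.0
for x in (-0.1, -0.01, 0.01, 0.1):
    ratio = fprime(x) / (x - x0)
    print(x, ratio)  # ratio stays near 2 and positive on both sides of x0
```

Note that $f'(-0.1)=-0.17<0$ while $f'(0.1)=0.23>0$: the positive ratio is exactly what forces this sign change.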

Another way to see this is by the intermediate value property of derivatives (Darboux's theorem): even when $f''$ is not continuous, if $f''$ exists in an interval about $x_0,$ then on that interval it assumes every value between any two values it attains. Thus if $f''(x_0)>0,$ it follows that there is some neighborhood of $x_0$ on which $f''$ is positive, and the desired result follows, again.