How to mathematically prove the optimality conditions for a univariate function?


Consider a univariate function $f(x)$. I know the graphical intuition behind why $f'(x)=0$ at an interior local extremum of a differentiable $f$. But how do you prove it mathematically?

I start with the assumption that $x^*$ is a minimum (the maximum case can be proved likewise), so $f(x^*+h) \geq f(x^*)$ for all $h$ such that $x^*+h \in \mathcal{N}(x^*)$, where $\mathcal{N}(x^*)$ is a neighborhood of $x^*$. Assuming $f$ is sufficiently smooth, Taylor expansion gives

$f(x^*) + hf'(x^*) + \frac{h^2}{2!} f''(x^*) + \ldots \geq f(x^*)$.

How do I proceed from here to prove that $f'(x^*) = 0$ and $f''(x^*) \geq 0$ ? This seems so simple but is eluding me.

NOTE: I do not want the proof for $\nabla f(\mathbf{x^*}) = 0$ and $\nabla^2 f(\mathbf{x^*}) \succeq 0$ for multivariate function.


There are 2 best solutions below

Answer 1:

Hint: if you want to take a different route, you can do this.

Start from where you said $f(x^* + h) \geq f(x^*)$; this can be rewritten as $$f(x^* + h) - f(x^*) \geq 0.$$

Now consider the case $h > 0$: dividing by $h$ preserves the inequality, so $$\frac{f(x^* + h) - f(x^*)}{h} \geq 0.$$

Do you see where this is going? Also consider the case where $h < 0$, and you should arrive at the conclusion you're looking for (without having to appeal to the second derivative).
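The hint above can also be checked numerically (an illustration only, not a proof); the example function $f(x) = (x-2)^2$ with minimum at $x^* = 2$ and the step sizes are my own choices:

```python
# Illustration of the hint (not a proof): for f(x) = (x - 2)^2, which has a
# minimum at x* = 2, the difference quotient (f(x* + h) - f(x*)) / h is
# nonnegative for h > 0 and nonpositive for h < 0, so the two one-sided
# limits squeeze f'(x*) to 0.

def f(x):
    return (x - 2.0) ** 2

x_star = 2.0

for h in [0.1, 0.01, 0.001]:
    q_right = (f(x_star + h) - f(x_star)) / h      # quotient from the right: >= 0
    q_left = (f(x_star - h) - f(x_star)) / (-h)    # quotient from the left:  <= 0
    print(f"h = {h}: right quotient = {q_right}, left quotient = {q_left}")
```

As $h$ shrinks, both one-sided quotients approach 0, which is exactly the conclusion the hint is steering you toward.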

Answer 2:

When $f$ is differentiable at $a$ we can write $$f(x)-f(a)=m(x)\>(x-a)\qquad\bigl(x\in{\rm dom}(f)\bigr)\ ,\tag{1}$$ where the function $$x\mapsto m(x):=\cases{{f(x)-f(a)\over \mathstrut x-a}\quad&$(x\ne a)$\cr f'(a)&$(x=a)$\cr}$$ is continuous at $a$. If $f'(a)\ne0$ then, by continuity of $m$ at $a$, $${\rm sgn}\bigl(m(x)\bigr)={\rm sgn}\bigl(f'(a)\bigr)\ne0$$ for all $x$ in a full neighborhood of $a$. Since $x-a$ changes sign at $a$, it follows from $(1)$ that $f(x)-f(a)$ takes both positive and negative values in every neighborhood of $a$. Under such circumstances $f$ cannot have a local extremum at $a$.

From this we conclude that $f'(a)=0$ is a necessary condition for a local extremum at $a$.
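To address the second part of the question: once $f'(x^*)=0$ is established, the condition $f''(x^*) \geq 0$ follows from the Taylor expansion in the question (a sketch, assuming $f$ is twice differentiable at $x^*$). With the first-order term gone,

$$0 \leq f(x^*+h) - f(x^*) = \frac{h^2}{2} f''(x^*) + o(h^2).$$

Dividing by $h^2/2 > 0$ and letting $h \to 0$ gives $f''(x^*) \geq 0$.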