Prove that $f(x)\ge g(x)$ on $[a,b]$ given conditions on second derivatives and values at endpoints.


$f(x)$ and $g(x)$ are defined on $[a,b] \subset \mathbb R$ such that: $$f(a) = g(a)$$$$ f(b) = g(b)$$

Prove that $f(x)\ge g(x)$ for $x \in (a,b)$, given that $$f''(x) < g''(x)$$ for $x \in (a,b)$. Note that $f$ and $g$ are not assumed to be differentiable at the endpoints.

For example, consider $f(x) = x$ and $g(x) = x^2$ on $[0,1]$: here $f''(x) = 0 < 2 = g''(x)$, and indeed $f(x)\ge g(x)$ for $x\in(0,1)$.

Intuitively this is clear because $x$ is linear while $x^2$ is convex (concave up).
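As a quick numerical sanity check of this example (not a substitute for the proof, just a sketch checking $h(x) = f(x) - g(x) > 0$ at sampled interior points):

```python
# Sanity check for f(x) = x, g(x) = x^2 on (0, 1):
# f(0) = g(0) = 0, f(1) = g(1) = 1, f''(x) = 0 < 2 = g''(x),
# and h(x) = f(x) - g(x) = x - x^2 should be positive on the interior.

def h(x):
    return x - x**2  # f(x) - g(x)

samples = [i / 1000 for i in range(1, 1000)]  # interior points of (0, 1)
assert all(h(x) > 0 for x in samples)
print("f(x) > g(x) at all sampled interior points")
```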

I need a rigorous proof of this as I'm using it as a proposition to prove something for an assignment.

Thanks.

Best answer:

We assume that $f$ and $g$ are continuous on $[a,b]$ and twice differentiable in $(a,b)$. Without continuity at $a$ and $b$ the claim is false: e.g. on $[0,1]$ take $g \equiv 0$ and $f(x) = -x(1-x) - 1$ for $x \in (0,1)$ with $f(0) = f(1) = 0$; then $f'' = -2 < 0 = g''$ on $(0,1)$, yet $f < g$ there.

Consider the function $h(x) := f(x) - g(x)$. Then $h(a) = h(b) = 0$, so by Rolle's theorem $h'(t) = 0$ for some $t \in (a,b)$. Moreover $h'' = f'' - g'' < 0$ in $(a,b)$ implies that $h'$ is strictly decreasing in $(a,b)$. Hence $h' > 0$ in $(a,t)$ and $h' < 0$ in $(t,b)$; that is, $h$ is strictly increasing on $(a,t]$ and strictly decreasing on $[t,b)$. Combined with $h(a) = h(b) = 0$ and the continuity of $h$ at the endpoints, this gives $h > 0$ in $(a,b)$.
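For the example $f(x) = x$, $g(x) = x^2$ this argument can be traced concretely (a sketch: here $h(x) = x - x^2$, $h'(x) = 1 - 2x$, $h''(x) = -2$, and the Rolle point is $t = 1/2$):

```python
# Trace the proof for h(x) = x - x^2 on [0, 1]:
# h(0) = h(1) = 0, h''(x) = -2 < 0, so h' is strictly decreasing
# and vanishes at the Rolle point t = 1/2.

def h_prime(x):
    return 1 - 2 * x  # derivative of h(x) = x - x^2

t = 0.5  # the point given by Rolle's theorem, where h'(t) = 0
assert h_prime(t) == 0
assert all(h_prime(x) > 0 for x in (0.1, 0.25, 0.4))  # h increasing left of t
assert all(h_prime(x) < 0 for x in (0.6, 0.75, 0.9))  # h decreasing right of t
print("sign pattern of h' matches the proof")
```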

So we conclude that $f(x) > g(x)$ for all $x \in (a,b)$, which in particular gives the required $f(x) \ge g(x)$.