Continuous function with non-negative second derivative in the weak sense is convex


I am currently working through a section of Peter Petersen's Riemannian Geometry in which he talks about weak second derivatives of functions. I am trying to work through the details of why a function on a Riemannian manifold $(M,g)$ with non-negative Hessian in the weak sense is convex.

I understand how one can reduce the problem to the case where $M$ is the real line with the Euclidean metric by precomposing with unit speed geodesics, so I have been able to reduce the problem to the following:

Proposition: Let $f$ be a continuous function defined on an open interval in $\mathbf{R}$. We say that $f''(p)\geq 0$ in the weak sense if for every $\varepsilon>0$ there exists a smooth function $f_\varepsilon$ defined in a neighborhood of $p$ such that

(1) $f(p)=f_\varepsilon(p)$

(2) $f\geq f_\varepsilon$

(3) $f_\varepsilon''(p)\geq -\varepsilon$.

If $f''\geq 0$ everywhere in the weak sense, then $f$ is convex.
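As a sanity check on the definition (an example of my own, not from Petersen), the prototypical non-smooth convex function $f(x)=|x|$ satisfies it: at any $p\neq 0$, $f$ is smooth near $p$ with $f''=0$, so we may take $f_\varepsilon=f$; at $p=0$ we may take $f_\varepsilon\equiv 0$, since
$$f(0)=0=f_\varepsilon(0),\qquad |x|\ge 0=f_\varepsilon(x),\qquad f_\varepsilon''\equiv 0\ge-\varepsilon.$$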

I have tried for a while to come up with a direction to move in from here. My ideas keep running into the problem that $f_\varepsilon$ is only guaranteed to have nice properties in a small neighborhood of $p$.

This makes the naive approach of trying to directly apply the definition of convexity more difficult (compared to when we assume our function is actually twice differentiable). Given $x_1$, $x_2$ and $t\in(0,1)$, for example, we cannot guarantee the function $f_\varepsilon$ defined around $p=tx_1+(1-t)x_2$ is even defined at $x_1$ or $x_2$.

I have a feeling I am over-thinking this problem. I would appreciate any input/hints anyone would be willing to provide as to how I might proceed from here.


There are 2 best solutions below


Here's an outline of what I believe to be a proof. Simpler ones may exist, but this is what I came up with. I haven't really been pedantic at all about details (as you'll see) but maybe this will give you some ideas.

Lemma. If $f'' \geq 0$ everywhere in the weak sense, then for every smooth, compactly supported function $\varphi$ on $\mathbb{R}$ with $\varphi \geq 0$, $$ \int f \varphi'' \geq 0. $$

Proof sketch. For any $\epsilon$, show that one can cover the support of $\varphi$ with finitely many disjoint half-open intervals $J_1, \dots, J_k$ such that there exists for each $1 \leq i \leq k$ a smooth function $f_i$ defined on $J_i$ and satisfying $f \geq f_i$ and $f_i'' > -\epsilon$ on $J_i$. Estimate the integral above by breaking it up into integrals over the $J_i$ and using integration by parts and the functions $f_i$.
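To make the estimate in the sketch concrete: on each interval $J_i=[a_i,b_i)$, two integrations by parts give
$$\int_{a_i}^{b_i} f_i\,\varphi'' \;=\; \Big[f_i\varphi'-f_i'\varphi\Big]_{a_i}^{b_i}+\int_{a_i}^{b_i} f_i''\,\varphi \;\ge\; \Big[f_i\varphi'-f_i'\varphi\Big]_{a_i}^{b_i}-\epsilon\int_{a_i}^{b_i}\varphi,$$
using $f_i''>-\epsilon$ and $\varphi\ge 0$. The bookkeeping of the boundary terms when summing over the $J_i$ is exactly the part the sketch leaves out.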

Proposition. If $f'' \geq 0$ everywhere in the weak sense, then $f$ is convex.

Proof sketch. By adding a linear function to $f$, it suffices to show that for $a,b \in \mathbb{R}$ with $f(a) = f(b) = 0$, we have $f(t) \leq 0$ for $a < t < b$.

Suppose otherwise. Then there is $c$ with $a < c < b$ such that $f(c) = C > 0$. By the intermediate value theorem there exist $c_1, c_2$ with $a < c_1 < c < c_2 < b$ such that $f(c_1) = f(c_2) = C/4$. By continuity, the values of $f$ stay within $\epsilon$ of $C$ on an $\eta$-neighborhood of $c$, and within $\epsilon$ of $C/4$ on $\eta$-neighborhoods of $c_1$ and $c_2$.

One now constructs an appropriate smooth function $\psi$ supported on the $\eta$-neighborhoods of these three points, positive near $c_1$ and $c_2$ and negative near $c$, and sufficiently well-controlled so that (1) There exists a second antiderivative $\varphi$ of $\psi$ which is smooth and of compact support, and (2) The integral $\int f \varphi''$ can be estimated (and proved to be negative). Together with the lemma, this proves the proposition.


I got stuck at exactly the same point. Here is a fairly simple argument to prove your Proposition.

First of all, you can reduce the problem to proving this:

Lemma: Let $f$ be a continuous function defined on an open interval in $\mathbf{R}$. Suppose for every $p$ there exists a convex function $g_p$ defined in a neighborhood of $p$ such that

(1) $f(p)=g_p(p)$

(2) $f\geq g_p$.

Then $f$ is convex.

To deduce the Proposition from the Lemma, fix any $\epsilon>0$ and set $h_\epsilon(x):=\epsilon x^2$. For every $p$ choose a function $f_\epsilon$ (defined on a neighborhood $U_p$) as in the definition of $f''\ge 0$ in the weak sense, and observe that $f+h_\epsilon\ge f_\epsilon+h_\epsilon$ on $U_p$, with equality at $p$.
Moreover $(f_\epsilon+h_\epsilon)''(p)\ge -\epsilon+2\epsilon>0$; since $(f_\epsilon+h_\epsilon)''$ is continuous, it stays positive on a possibly smaller neighborhood, so after shrinking $U_p$ we may take $g_p:=f_\epsilon+h_\epsilon$, which is convex on $U_p$.
Now the Lemma implies that $f+h_\epsilon$ is convex. As $f+h_\epsilon\to f$ (uniformly on compact sets) as $\epsilon\to 0$, we conclude that $f$ is convex too.
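Spelling out the last step: for fixed $x_1,x_2$ and $t\in(0,1)$, convexity of $f+h_\epsilon$ gives
$$(f+h_\epsilon)\big(tx_1+(1-t)x_2\big)\;\le\; t\,(f+h_\epsilon)(x_1)+(1-t)\,(f+h_\epsilon)(x_2),$$
and letting $\epsilon\to 0$ (pointwise convergence already suffices here) yields the same inequality for $f$.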

Proof of the Lemma: observe that $f$ is convex iff, for every linear function $\phi$ and every $\alpha\in\mathbb{R}$, the sublevel set $\{f-\phi\le \alpha\}$ is convex (tell me if this is not clear).
So pick any linear function $\phi$ and put $F:=f-\phi$. Let $x_1,x_2\in\{F\le\alpha\}$; we have to prove that $[x_1,x_2]\subseteq\{F\le\alpha\}$.
Assume by contradiction that $M:=\max_{[x_1,x_2]}F>\alpha$ and let $p\in(x_1,x_2)$ be the first point of the interval at which $F$ equals $M$ (i.e. $p:=\min\left(F^{-1}(M)\cap [x_1,x_2]\right)$). Now $F$ still satisfies the hypotheses of the Lemma, so we can find a convex $G:U\to\mathbb{R}$ ($U\subseteq (x_1,x_2)$ being a neighborhood of $p$) such that $F\ge G$ with equality at $p$.
So $G$ attains its maximum at $p$ (since $G\le F\le F(p)=G(p)$ on $U$); but convexity easily implies that $G\equiv M$ on all of $U$: indeed, the slope $s(x):=\frac{G(x)-G(p)}{x-p}$ is an increasing function on $U\setminus\{p\}$, yet $s(x)\ge 0$ if $x<p$ and $s(x)\le 0$ if $x>p$, since $G(p)$ is the maximum value. If you draw a picture everything becomes clear. Thus $s\equiv 0$, so $G\equiv M$. Finally, since $F\ge G$, we get $F\equiv M$ on $U$; but this contradicts the fact that $F<M$ on $[x_1,p)$. $\blacksquare$
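In symbols, the slope argument reads: for any $x<p<y$ in $U$, monotonicity of the difference quotients of the convex function $G$ through $p$, together with the maximality of $G(p)$, gives
$$0\;\le\; s(x)\;\le\; s(y)\;\le\; 0,$$
forcing $s\equiv 0$ on $U\setminus\{p\}$, i.e. $G\equiv G(p)=M$ on $U$.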