Is it true in this case that the sequence of the minima of convex functions converges to the minimum?


Let $u:\mathbb{R}\rightarrow(-\infty,+\infty]$ be a convex function, and suppose that $u$ attains its minimum. I define:

$$(\varphi_\epsilon*u)(x)=\int_{\mathbb{R}}\varphi_\epsilon(y)u(x-y)\,dy, $$ where $\varphi_{\epsilon}$ is the standard mollifier. Let's introduce the notation: $$u_i=\varphi_{1/i}*u,\quad\forall i\in\mathbb{N}. $$

I know that each function $u_i$ is convex, and that the sequence converges to $u$ pointwise on $\mathbb{R}$ and uniformly on compact subsets of $\mathbb{R}$.

If I denote by $y_i:=\min_{\mathbb{R}}u_i$, is it true that the sequence $(y_i)$ converges to $y=\min u$?

I think so, because I have uniform convergence on compact sets. However, I cannot prove it. How can I do it?

Thanks for the help!
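As a numerical sanity check (not a proof), one can discretize the convolution and watch the minima as $\epsilon\to 0$. The convex example $u(x)=|x|$, the bump profile, and the grid sizes below are my own choices:

```python
import numpy as np

def u(x):
    # Example convex function (my choice): u(x) = |x|, minimum 0 at x = 0.
    return np.abs(x)

def mollified_min(eps, xs=np.linspace(-2.0, 2.0, 2001)):
    """Approximate min over a grid of (phi_eps * u), with the convolution
    discretized as a weighted Riemann sum."""
    y = np.linspace(-eps, eps, 801)
    z = np.clip((y / eps) ** 2, 0.0, 1.0 - 1e-12)
    w = np.exp(-1.0 / (1.0 - z))     # standard bump profile, supported on (-eps, eps)
    w /= w.sum()                     # normalized quadrature weights (total mass 1)
    return min(float(np.dot(w, u(x - y))) for x in xs)

for eps in [0.5, 0.1, 0.02]:
    print(eps, mollified_min(eps))   # minima shrink toward min u = 0 as eps -> 0
```

Since $u\ge 0$ here, the discretized minima stay nonnegative, and because the mollifier's mass sits in $[-\epsilon,\epsilon]$ the minimum of $u_\epsilon$ is at most $\epsilon$ for this $u$.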

There are 2 solutions below.

Solution 1:

Neither convergence of the derivatives almost everywhere nor uniform convergence on compact sets implies convergence of the minima on its own. Both modes of convergence leave open the possibility that $u_i$ takes small values near infinity (if we know nothing else about $u_i$).

Here we know that $u_i\ge u$ by Jensen's inequality. So it remains to prove that for every $\delta>0$ we have $\min u_i\le \min u+\delta$ for large $i$. To do this, let $x_0$ be a minimum point of $u$, and take a neighborhood $V$ of $x_0$ on which $u\le u(x_0)+\delta$. When $i$ is large enough, the support of $\varphi_{1/i}$ is small enough that $u_i(x_0)$ involves only the values of $u$ within $V$. Hence $u_i(x_0)\le u(x_0)+\delta$.
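For completeness, the Jensen step can be spelled out: since $\varphi_{1/i}\ge 0$ integrates to $1$, Jensen's inequality for the convex function $u$ gives

$$u_i(x)=\int\varphi_{1/i}(y)\,u(x-y)\,dy\;\ge\;u\!\left(\int\varphi_{1/i}(y)\,(x-y)\,dy\right)=u\!\left(x-\int y\,\varphi_{1/i}(y)\,dy\right)=u(x),$$

where the last equality uses that the standard mollifier is even, so $\int y\,\varphi_{1/i}(y)\,dy=0$.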

Solution 2:

Assuming that your mollifier is positive with compact support, e.g. within $[-\epsilon,\epsilon]$, the result follows from the fact that $u_\epsilon(x)=(\varphi_\epsilon * u)(x)$ lies in the convex hull of the values of $u$ on the interval $[x-\epsilon,x+\epsilon]$.

Suppose the minimum is $m=u(x_0)$ and let $\delta>0$ be given. We want to find $\epsilon(\delta)$ such that the minimum of $u_\epsilon$ is attained in the $\delta$-neighborhood of $x_0$ whenever $0<\epsilon<\epsilon(\delta)$.

By convexity (and uniqueness of the minimum), $m_1=\min_{|x-x_0|\geq \delta/2}u(x)>m$. If $\epsilon<\delta/2$, then by the convex-hull observation above, for any $x$ with $|x-x_0|\geq \delta$ we have $u_\epsilon(x)\geq m_1$. Now choose $\epsilon_1$ so that $\max_{|x-x_0|\leq \epsilon_1} u(x) <m_1$, which is possible since $u$ is continuous at $x_0$. Then for $\epsilon<\min\{\delta/2,\epsilon_1\}$ we have $u_\epsilon(x_0)<m_1$, so the minimum of $u_\epsilon$ is attained within $\delta$ of $x_0$, and we are done.
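This localization of the minimizer can also be probed numerically. The asymmetric convex example $u(x)=\max(x,-2x)$ (unique minimum at $x_0=0$) and the discretization below are my own choices:

```python
import numpy as np

def u(x):
    # Asymmetric convex example (my choice): unique minimum u(0) = 0.
    return np.maximum(x, -2.0 * x)

def mollified_argmin(eps, xs=np.linspace(-1.0, 1.0, 2001)):
    """Grid argmin of (phi_eps * u), convolution as a weighted Riemann sum."""
    y = np.linspace(-eps, eps, 401)
    z = np.clip((y / eps) ** 2, 0.0, 1.0 - 1e-12)
    w = np.exp(-1.0 / (1.0 - z))     # bump profile supported on (-eps, eps)
    w /= w.sum()                     # normalized weights, total mass 1
    vals = np.array([np.dot(w, u(x - y)) for x in xs])
    return float(xs[np.argmin(vals)])

for eps in [0.2, 0.05, 0.01]:
    print(eps, mollified_argmin(eps))  # minimizer stays within eps of x0 = 0
```

For $|x-x_0|>\epsilon$ the averaged slope of this $u$ has a definite sign, so the minimizer of $u_\epsilon$ must lie in $[x_0-\epsilon,x_0+\epsilon]$, matching the argument above.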

Convexity is not really needed; it is enough to assume that $u$ is continuous, has a unique minimum, and $$ \liminf_{|x|\rightarrow \infty} \;u(x) > \min u. $$
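This weaker set of hypotheses can be illustrated numerically as well. The non-convex example $u(x)=x^2/(1+x^2)$ (continuous, unique minimum $u(0)=0$, and $\liminf_{|x|\to\infty}u(x)=1>\min u$) is my own choice:

```python
import numpy as np

def u(x):
    # Non-convex example (my choice): continuous, unique minimum u(0) = 0,
    # and u(x) -> 1 as |x| -> infinity, so liminf at infinity exceeds min u.
    return x**2 / (1.0 + x**2)

def mollified_min(eps, xs=np.linspace(-3.0, 3.0, 3001)):
    """Grid minimum of (phi_eps * u), convolution as a weighted Riemann sum."""
    y = np.linspace(-eps, eps, 401)
    z = np.clip((y / eps) ** 2, 0.0, 1.0 - 1e-12)
    w = np.exp(-1.0 / (1.0 - z))     # bump profile supported on (-eps, eps)
    w /= w.sum()                     # normalized weights, total mass 1
    return min(float(np.dot(w, u(x - y))) for x in xs)

for eps in [0.5, 0.1, 0.02]:
    print(eps, mollified_min(eps))   # minima decrease toward min u = 0
```

Since $u\ge 0$, each mollified minimum is nonnegative, and evaluating $u_\epsilon$ at the unique minimizer $x_0=0$ bounds it above by roughly $\epsilon^2$ for this example.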