Let $u:\mathbb{R}^n\rightarrow(-\infty,+\infty]$ be a convex function, and suppose that $u$ admits a point of minimum. I define:
$$(\varphi_\epsilon*u)(x)=\int_{\mathbb{R}^n}\varphi_\epsilon(y)u(x-y)dy, $$ where $\varphi_{\epsilon}$ is the standard mollifier. Let's introduce the notation: $$\tilde{u}_i=\varphi_{1/i}*u,\quad\forall i\in\mathbb{N}. $$
I know that each function $\tilde{u}_i$ is convex, and that the sequence converges to $u$ pointwise on $\mathbb{R}^n$ and uniformly on compact subsets of $\mathbb{R}^n$.
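These two facts can be checked numerically. Below is a hedged sketch with my own illustrative choices: $u(x)=|x|$ in one dimension, and the standard mollifier discretized on a grid and renormalized into probability weights (so the discrete convolution approximates $\varphi_\epsilon * u$). The discrete second differences of $\tilde{u}_i$ stay nonnegative (convexity), and the sup-norm error on a compact set shrinks with $\epsilon = 1/i$.

```python
import numpy as np

def mollify(u, eps, x_grid):
    """Discretized (phi_eps * u) on x_grid.

    phi_eps is the standard mollifier exp(-1/(1 - (y/eps)^2)) on (-eps, eps),
    renormalized here to sum to 1 so it acts as discrete probability weights.
    """
    y = np.linspace(-eps, eps, 401)
    z = y / eps
    with np.errstate(divide="ignore"):
        phi = np.where(np.abs(z) < 1.0, np.exp(-1.0 / (1.0 - z**2)), 0.0)
    w = phi / phi.sum()
    # (phi_eps * u)(x) ~= sum_k w_k * u(x - y_k)
    return np.array([np.dot(w, u(x - y)) for x in x_grid])

u = np.abs                             # convex, finite everywhere, min 0 at 0
xs = np.linspace(-1.0, 1.0, 401)       # the compact set [-1, 1]
for i in [2, 8, 32]:
    ui = mollify(u, 1.0 / i, xs)
    sup_err = np.max(np.abs(ui - u(xs)))     # sup norm on [-1, 1], at most 1/i
    second_diff = np.diff(ui, 2)             # nonnegative on a uniform grid for convex data
    print(f"i={i}: sup|u_i - u| = {sup_err:.4f}, min 2nd diff = {second_diff.min():.2e}")
```

Here the bound $\sup|\tilde{u}_i-u|\le 1/i$ follows from $\big||x-y|-|x|\big|\le|y|\le\epsilon$; for a general convex $u$ the uniform rate on a compact set depends on its local Lipschitz constant.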
If I denote by $y_i:=\min_{\mathbb{R}^n}\tilde{u}_i$, is it true that the sequence $(y_i)$ converges to $y=\min_{\mathbb{R}^n} u$?
I think so, because of the uniform convergence on compact sets, but I cannot prove it. How can I do it?
Thanks for the help!
Neither almost-everywhere convergence of the derivatives nor uniform convergence on compact sets implies the convergence of minima on its own. Both modes of convergence leave open the possibility that the $\tilde{u}_i$ take small values near infinity (if we don't know anything else about them).
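To make this concrete, here is a hedged numerical sketch with my own illustrative family $f_i(x)=\min\{x^2,\,(x-i)^2-1\}$ (these $f_i$ are not convex; they only illustrate that uniform convergence on compact sets by itself lets the minima escape to infinity): on any fixed compact set, $f_i$ eventually coincides with $x^2$, whose minimum is $0$, yet $\min f_i = -1$ for every $i$.

```python
import numpy as np

def f(i, x):
    # f_i(x) = min(x^2, (x - i)^2 - 1): agrees with x^2 on any fixed
    # compact set once i is large, but dips to -1 near x = i.
    return np.minimum(x**2, (x - i)**2 - 1.0)

# Uniform convergence to x^2 on the compact set [-10, 10]:
xs = np.linspace(-10.0, 10.0, 10001)
for i in [100, 1000]:
    sup_err = np.max(np.abs(f(i, xs) - xs**2))   # exactly 0.0 for these i
    print(f"i={i}: sup |f_i - x^2| on [-10,10] = {sup_err}")

# ...but the global minimum stays at -1, attained at x = i:
for i in [100, 1000]:
    grid = np.linspace(i - 2.0, i + 2.0, 10001)
    print(f"i={i}: min f_i = {f(i, grid).min():.4f}")
```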
Here we know that $\tilde{u}_i\ge u$: since $\varphi_{1/i}$ is a symmetric probability density, Jensen's inequality gives $$\tilde{u}_i(x)=\int\varphi_{1/i}(y)\,u(x-y)\,dy\ \ge\ u\!\left(\int\varphi_{1/i}(y)\,(x-y)\,dy\right)=u(x).$$ So it remains to prove that for every $\delta>0$ we have $\min \tilde{u}_i\le \min u+\delta$ for large $i$. To do this, let $x_0$ be a point of minimum of $u$, and take a neighborhood $V$ of $x_0$ on which $u\le u(x_0)+\delta$. When $i$ is large enough, the support of $\varphi_{1/i}$ is contained in a ball so small that $x_0-y\in V$ for every $y$ in it, which implies that $\tilde{u}_i(x_0)$ only involves the values of $u$ within $V$. Hence $\tilde{u}_i(x_0)\le u(x_0)+\delta$, and combining the two bounds, $\min u\le\min\tilde{u}_i\le\min u+\delta$ for large $i$, so $y_i\to y$.
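The two-sided squeeze can also be seen numerically. A hedged sketch, with my own choices of $u(x)=|x|$ (so $\min u = 0$ at $x_0=0$) and a discretized standard mollifier renormalized into probability weights: the minima of the mollified functions stay above $0$ (the Jensen bound) and decrease toward it, bounded by $\epsilon=1/i$ because $\tilde{u}_i(0)=\int\varphi_{1/i}(y)|y|\,dy\le 1/i$.

```python
import numpy as np

def mollified_min(u, eps, x_grid):
    """Numerically compute min over x_grid of (phi_eps * u)(x).

    phi_eps is the standard mollifier discretized on [-eps, eps] and
    renormalized to sum to 1 (discrete probability weights).
    """
    y = np.linspace(-eps, eps, 401)
    z = y / eps
    with np.errstate(divide="ignore"):
        phi = np.where(np.abs(z) < 1.0, np.exp(-1.0 / (1.0 - z**2)), 0.0)
    w = phi / phi.sum()
    vals = np.array([np.dot(w, u(x - y)) for x in x_grid])
    return vals.min()

u = np.abs                         # u(x) = |x|, min u = 0 attained at x0 = 0
x_grid = np.linspace(-2.0, 2.0, 801)
for i in [1, 4, 16, 64]:
    m = mollified_min(u, 1.0 / i, x_grid)
    print(f"eps = 1/{i}: min u_i = {m:.6f}")   # stays in (0, 1/i], shrinking with i
```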