Nonsmooth optimization approximation


Say I want to minimize a real-valued, nonsmooth function $f(x)$ (whose gradient is not defined at some minima).

Further let

$$ g(x, \epsilon) $$

be a smooth approximation of $f(x)$ with

$$ \lim_{\epsilon\rightarrow0} \,g(x,\epsilon) = f(x) \,. $$

Let the minimum value of $g$ over $x$ be a function of $\epsilon$, denoted $m(\epsilon)$, and let's assume this minimum actually exists.

Question: Is $\lim_{\epsilon \rightarrow 0} m(\epsilon)$ equal to the minimum of $f$?

Example: Let $f(x) = |x|$. Then a smooth approximation with the properties described above is given by

$$ g(x, \epsilon)=\sqrt{x^2 + \epsilon}\,. $$

The minimum of $g$ is attained at $x = 0$, so as a function of $\epsilon$ it is $m(\epsilon) = \sqrt{\epsilon}\,$.

Now taking the limit gives

$$ \lim_{\epsilon\rightarrow0}\,m(\epsilon) = \lim_{\epsilon\rightarrow0}\,\sqrt{\epsilon} = 0 $$

which is also the minimum of the original $f(x)$. The question is whether this is always the case.
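For concreteness, here is a small numerical sketch (my addition, not part of the question) that minimizes the surrogate $g(x,\epsilon)=\sqrt{x^2+\epsilon}$ with `scipy.optimize.minimize_scalar` for shrinking $\epsilon$ and shows $m(\epsilon)\rightarrow 0$:

```python
# Numerical sketch (illustrative only): minimize the smooth surrogate
# g(x, eps) = sqrt(x^2 + eps) for shrinking eps and watch m(eps) -> 0,
# the minimum of f(x) = |x|.
import numpy as np
from scipy.optimize import minimize_scalar

def g(x, eps):
    return np.sqrt(x**2 + eps)

for eps in (1.0, 1e-2, 1e-4, 1e-8):
    res = minimize_scalar(lambda x: g(x, eps))
    print(f"eps={eps:.0e}  argmin~{res.x:+.2e}  m(eps)={res.fun:.4e}")
# m(eps) = sqrt(eps) shrinks to 0, matching min_x |x| = 0.
```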

Best Answer

No, this is not true in general. Think of the function $g(x,\epsilon) = \epsilon x^3$. For any $\epsilon > 0$, this function has an infimum of $-\infty$, so $m(\epsilon) = -\infty$ (or is undefined, since this is only an infimum, not a minimum). But as $\epsilon$ goes to zero, this function converges pointwise to $f(x) = 0$, which has minimum $0$.
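A quick numerical illustration of the unboundedness (my addition):

```python
# Tiny check (illustrative only): for any fixed eps > 0, g(x, eps) = eps * x**3
# is unbounded below, so it has no minimum at all.
eps = 1e-3
for x in (-1e1, -1e3, -1e6):
    print(x, eps * x**3)   # -1.0, -1e6, -1e15: decreases without bound
```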

For the case where the minimum exists, it is still not true in general. Consider the function $f(x) = 0$. Now assume $h(x)$ is a smooth function with $h(1) = h(-1) = 0$, $h(0) = 1$, and $h(x)=0$ outside the interval $[-1,1]$. Now let's look at the functions $g(x,\epsilon) = -h\!\left(x-\frac{1}{\epsilon}\right)$. It is easy to see that $\lim_{\epsilon\rightarrow 0} g(x,\epsilon) = f(x)$ for every fixed $x$, since the bump is shifted out to $x = \frac{1}{\epsilon}$. But $m(\epsilon) = -1$ for every $\epsilon$, so it does not converge to $\min_x f(x) = 0$.
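To make this counterexample concrete, here is a sketch (my addition) using the standard smooth bump $h(x)=\exp\!\left(1-\frac{1}{1-x^2}\right)$ on $(-1,1)$ and $h(x)=0$ outside, which has the required properties; this particular $h$ is just one possible choice:

```python
# Illustrative sketch of the second counterexample: g(x, eps) = -h(x - 1/eps),
# where h is a smooth bump with h(0) = 1 and h = 0 outside (-1, 1).
import math
from scipy.optimize import minimize_scalar

def h(x):
    # smooth bump: equals 1 at x = 0, vanishes identically for |x| >= 1
    if abs(x) >= 1:
        return 0.0
    return math.exp(1.0 - 1.0 / (1.0 - x * x))

def g(x, eps):
    return -h(x - 1.0 / eps)

for eps in (1.0, 0.1, 0.01):
    # the dip sits at x = 1/eps, so search in a window around it
    res = minimize_scalar(lambda x: g(x, eps),
                          bounds=(1.0 / eps - 1, 1.0 / eps + 1), method="bounded")
    print(f"eps={eps}  argmin~{res.x:.3f}  m(eps)={res.fun:.3f}")
# m(eps) stays at -1 for every eps, while g(x, eps) -> 0 pointwise for each fixed x.
```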

One condition under which this would be true is when the functions $g(x,\epsilon)$ are greater than or equal to $f(x)$ for all $x$ and $\epsilon$. In that case, let's assume that $\min_x f(x) = f(x_0) = m$. Then clearly $m(\epsilon)=\min_x g(x,\epsilon) \geq m$. On the other hand, $m(\epsilon)\leq g(x_0,\epsilon)$. Combining these inequalities and taking the limit gives $m \leq\lim_{\epsilon\rightarrow 0}m(\epsilon)\leq \lim_{\epsilon\rightarrow 0}g(x_0,\epsilon) = f(x_0)=m$. So the limit of $m(\epsilon)$ is sandwiched and equals $m$.
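As a sanity check (my addition), the question's own example satisfies this majorant condition, since $\sqrt{x^2+\epsilon} \geq |x|$ for every $x$ and $\epsilon > 0$, so the sandwich argument applies there:

```python
# Verify numerically (illustrative) that the question's example is a majorant:
# g(x, eps) = sqrt(x^2 + eps) >= |x| = f(x), so the sandwich argument applies
# and m(eps) = sqrt(eps) -> 0 = min_x f(x).
import math

xs = [-10.0, -1.0, -0.1, 0.0, 0.1, 1.0, 10.0]
for eps in (1.0, 1e-2, 1e-6):
    assert all(math.sqrt(x * x + eps) >= abs(x) for x in xs)
    print(f"eps={eps:.0e}: m(eps) = sqrt(eps) = {math.sqrt(eps):.3e}")
```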

Another case in which the argument holds is when the convergence $g(\cdot,\epsilon) \rightarrow f$ is uniform in $x$.