Consider the piecewise objective function, for $x \in \mathbb{R}^n$,
$$f(x) = \begin{cases} \frac{1}{2} (g(x) - d)^2 & g(x) \geq d \\ 0 & g(x) < d \end{cases}$$
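As a concrete sketch of this objective (the specific choices $g(x) = \sum_i x_i^2$ and $d = 1$ below are purely illustrative stand-ins, not part of the question):

```python
# Piecewise objective: f(x) = 0.5*(g(x) - d)^2 when g(x) >= d, else 0.
# g and d are hypothetical placeholders: g(x) = sum of squares, d = 1.0.

def g(x):
    return sum(xi * xi for xi in x)

D = 1.0

def f(x):
    r = g(x) - D
    return 0.5 * r * r if r >= 0 else 0.0

print(f([2.0, 0.0]))  # g = 4, r = 3 -> 4.5
print(f([0.5, 0.0]))  # g = 0.25 < d -> 0.0
```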
where $d \in \mathbb{R}$ is a constant and $g : \mathbb{R}^n \to \mathbb{R}$. Let's assume, for simplicity, that $g$ is convex (in practice it is only guaranteed to be convex in some neighborhood of the optimum).
This function is $C^1$ but not $C^2$: for many choices of $g$, its Hessian jumps discontinuously across the boundary $g(x) = d$ (e.g., from $\nabla g \, \nabla g^\top$ on the active side to the zero matrix on the inactive side).
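The discontinuity is easy to see numerically in one dimension. Taking the illustrative case $g(x) = x$, $d = 0$ (so $f(x) = \tfrac{1}{2}x^2$ for $x \geq 0$ and $0$ otherwise), a finite-difference estimate of $f''$ jumps from $1$ to $0$ across the boundary:

```python
# 1D illustration of the C^1-but-not-C^2 behavior, with g(x) = x, d = 0,
# so f(x) = 0.5*x^2 for x >= 0 and f(x) = 0 otherwise.

def f(x):
    return 0.5 * x * x if x >= 0 else 0.0

def second_derivative(x, h=1e-4):
    # Central finite-difference estimate of f''(x).
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

print(second_derivative(0.1))   # ~1.0 on the active side
print(second_derivative(-0.1))  # ~0.0 on the inactive side
```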
This can cause numerical problems as I start to deal with sums of these functions, perhaps only over subsets of my total optimization coordinates.
So let's say I'd like to construct a new objective function, with the following properties:
- At $g(x) = d$, the values $f(x)$, $\nabla f(x)$, and $\nabla^2 f(x)$ approach the same limits from either side of the condition.
- $\min f(x) = 0$
- For $g(x) > d$, $f(x) > 0$
Ideally, I'd like to keep this as close to a quadratic function as possible for optimization (and implementation) purposes, but am willing to modify its structure as necessary. Let's assume convexity is only desired locally for some neighborhood around the optimum $g(x) = d$. Do such functions exist? If so, what is the formulation? If not, why not?