I am looking at a constrained minimization problem in which the objective function contains the Heaviside function, and as such is not twice continuously differentiable.
My question is: what effect would using a twice continuously differentiable approximation to the Heaviside function have on the accuracy and efficiency of the optimization?
The problem takes the form
$$\min_k \sum_{i=1}^{N} k_{i}\left [ H\left ( x_{i} \right ) -T\right ],$$ $$\text{s.t.} \sum_{i=1}^{N} k_{i}\left [ x_{i} -R\right ] \le 0.01,$$
where $H\left(x\right)$ is the Heaviside function and $x_i, k_i, T, R\in\mathbb{R}$.
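For concreteness, here is a minimal sketch of evaluating the exact (nonsmooth) objective and constraint; the function names, the toy data, and the convention $H(0)=\tfrac12$ are my own assumptions:

```python
def heaviside(x):
    """Exact Heaviside step; the value at 0 is taken as 0.5 (a common convention)."""
    if x > 0.0:
        return 1.0
    if x < 0.0:
        return 0.0
    return 0.5

def objective(k, x, T):
    """sum_i k_i * (H(x_i) - T) -- nonsmooth because of the step."""
    return sum(ki * (heaviside(xi) - T) for ki, xi in zip(k, x))

def constraint(k, x, R):
    """sum_i k_i * (x_i - R); feasible when this is <= 0.01."""
    return sum(ki * (xi - R) for ki, xi in zip(k, x))

# Toy data, illustrative only
k = [1.0, 1.0]
x = [1.0, -1.0]
print(objective(k, x, T=0.5))   # 1*(1-0.5) + 1*(0-0.5) = 0.0
print(constraint(k, x, R=0.0))  # 1*1 + 1*(-1) = 0.0
```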
Would the use of the approximate Heaviside
$$H\left ( x \right )\approx \frac{1}{2}+\frac{1}{2}\tanh(ax)=\frac{1}{1+e^{-2ax}},$$
where $a>0$ controls the sharpness of the transition (written $a$ rather than $k$ to avoid a clash with the decision variables $k_i$), help with the minimization and give a sufficiently accurate result, or would a Legendre polynomial expansion (similar to http://www.phys.ufl.edu/~fry/6346/legendrestep.pdf) be more successful?
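As a quick check of the $\tanh$ approximation itself, here is a sketch (the parameter name `a` and the tolerances are my choices) verifying that the two forms above agree and that a smooth analytic derivative is available to a gradient-based optimizer:

```python
import math

def smooth_heaviside(x, a):
    """0.5*(1 + tanh(a*x)); identical to the logistic form 1/(1 + exp(-2*a*x))."""
    return 0.5 * (1.0 + math.tanh(a * x))

def smooth_heaviside_d1(x, a):
    """First derivative: 0.5*a*sech^2(a*x) -- smooth everywhere, unlike H'."""
    t = math.tanh(a * x)
    return 0.5 * a * (1.0 - t * t)

a = 10.0

# The two algebraic forms in the question are the same function.
for x in (-0.3, 0.0, 0.7):
    logistic = 1.0 / (1.0 + math.exp(-2.0 * a * x))
    assert abs(smooth_heaviside(x, a) - logistic) < 1e-12

# The analytic derivative matches a central finite difference.
h = 1e-6
fd = (smooth_heaviside(0.1 + h, a) - smooth_heaviside(0.1 - h, a)) / (2.0 * h)
assert abs(fd - smooth_heaviside_d1(0.1, a)) < 1e-4

# The approximation sharpens as a grows, at the cost of steeper gradients.
print(smooth_heaviside(0.05, a=10.0))
print(smooth_heaviside(0.05, a=100.0))
```

The trade-off visible here is the usual one: larger $a$ tracks the step more closely but makes the objective increasingly stiff near $x=0$.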
This approach will need to be extended to multiple dimensions before implementation; for example, in three dimensions the minimization becomes
$$\sum_{i}\sum_{j}\sum_{l}k_{1,i}\,k_{2,j}\,k_{3,l}\left [ H\left ( x_{ijl}\right )-T \right ],$$
with the constraint
$$\sum_{i}\sum_{j}\sum_{l} k_{1,i}\,k_{2,j}\,k_{3,l}\left [x_{ijl} - R \right] \le 0.01.$$
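A sketch of the three-dimensional sums in pure-Python loops (the array layout and the toy data are my own assumptions), mainly to make explicit that the coefficients enter as the outer product $k_{1,i}\,k_{2,j}\,k_{3,l}$:

```python
def heaviside(x):
    """Exact step, with H(0) = 0.5 by convention."""
    return 1.0 if x > 0.0 else (0.0 if x < 0.0 else 0.5)

def objective_3d(k1, k2, k3, x, T):
    """sum_{i,j,l} k1[i]*k2[j]*k3[l] * (H(x[i][j][l]) - T)."""
    return sum(
        k1[i] * k2[j] * k3[l] * (heaviside(x[i][j][l]) - T)
        for i in range(len(k1))
        for j in range(len(k2))
        for l in range(len(k3))
    )

def constraint_3d(k1, k2, k3, x, R):
    """sum_{i,j,l} k1[i]*k2[j]*k3[l] * (x[i][j][l] - R); feasible when <= 0.01."""
    return sum(
        k1[i] * k2[j] * k3[l] * (x[i][j][l] - R)
        for i in range(len(k1))
        for j in range(len(k2))
        for l in range(len(k3))
    )

# Toy 1x1x2 example, illustrative only
k1, k2, k3 = [1.0], [1.0], [2.0, 3.0]
x = [[[0.5, -0.5]]]
print(objective_3d(k1, k2, k3, x, T=0.5))   # 2*(1-0.5) + 3*(0-0.5) = -0.5
print(constraint_3d(k1, k2, k3, x, R=0.0))  # 2*0.5 + 3*(-0.5) = -0.5
```

Replacing `heaviside` here with a smooth surrogate would leave the structure unchanged while making the whole objective twice continuously differentiable.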
The minimization itself is covered by the question https://scicomp.stackexchange.com/questions/10447/constrained-minimization-in-n-dimensions; this question concerns the effect of using an approximation to the step function in the algorithm (and, ultimately, the choice of algorithm to use).