Smooth, differentiable loss function 'bounding' $[0,1]$


I'd like to make a custom loss function for a neural network that penalizes values outside the range $[0,1]$, ideally sharply on either side.

It must be smooth and differentiable. I suspect a polynomial of appropriate degree could work, but I'd welcome suggestions from people more familiar with the subject matter.

Best answer:

To penalize $Y>1$, add $f(Y)=n\,Y^{n}$; to penalize $Y<0$, add $f(Y)=2^{-nY}$, for some large value of $n$. (Both terms grow rapidly just outside the allowed range, and larger $n$ makes the penalty wall steeper.)
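As a concrete sketch of the polynomial idea raised in the question: a single even-power term $(2y-1)^{2n}$ is smooth everywhere, stays bounded on $[0,1]$, and blows up sharply on both sides. The function name and the particular choice of polynomial are illustrative assumptions, not from the original post.

```python
def range_penalty(y: float, n: int = 10) -> float:
    """Smooth, differentiable penalty term for values leaving [0, 1].

    (2y - 1)**(2n) is a polynomial, hence smooth and differentiable
    everywhere; it maps [0, 1] into [0, 1] and grows with degree 2n
    outside the range, penalizing both sides symmetrically.
    """
    return (2.0 * y - 1.0) ** (2 * n)
```

Added to a training loss with a small weight, its gradient pushes predictions back toward $[0,1]$; increasing `n` makes the penalty steeper just outside the range while keeping it nearly flat in the interior.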