I'd like to make a custom loss function for a neural network that penalizes values outside the range $[0,1]$, preferably sharply on either side.
It must be smooth and differentiable. I imagine such a function could be a polynomial of appropriate degree, but I was wondering what people here, most likely more familiar with the subject matter, would suggest.
Penalize $Y>1$ with $f(Y)=Y^{n}$ and penalize $Y<0$ with $f(Y)=2^{-nY}$, for some large even value of $n$. Both terms are smooth everywhere, stay small on $[0,1]$, and grow sharply outside it.
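A minimal sketch of this idea in NumPy (the function name, the choice $n=10$, and the idea of summing the two terms into a single penalty are my own illustration, not part of the answer above; in a deep-learning framework you would apply the same formula to tensors so autograd can differentiate it):

```python
import numpy as np

def range_penalty(y, n=10):
    """Smooth penalty: near zero on [0, 1], sharply increasing outside.

    y**n (n large and even) dominates for y > 1, while 2**(-n*y)
    dominates for y < 0. Both terms are smooth and differentiable
    everywhere, so the sum can be added to a training loss.
    """
    y = np.asarray(y, dtype=float)
    return y**n + 2.0 ** (-n * y)

# Inside [0, 1] the penalty is bounded by roughly 1; outside it explodes.
print(range_penalty(0.5))   # small
print(range_penalty(1.5))   # large (1.5**10 ≈ 57.7)
print(range_penalty(-0.5))  # large (2**5 = 32)
```

Choosing $n$ even keeps $y^{n}$ non-negative for negative inputs; larger $n$ makes the walls steeper but also makes gradients stiffer, so in practice you would scale the penalty by a small weight before adding it to the main loss.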