Looking for an alternative to a perceptron that can be deformed to the identity map


I'm trying to solve a numerical problem that requires an (at least once) differentiable function mapping $[0,1]$ to $[0,1]$, used to calibrate the scores of a machine-learning binary classifier. So far I have been using a perceptron-style sigmoid:

$$x \mapsto \frac{1}{1 + \exp (-(ax + b))}$$

with $a, b$ fitted to minimize some cost function.
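For concreteness, fitting $a, b$ by minimizing log-loss might look like the sketch below (the function name, toy data, and gradient-descent settings are all my own illustration, not part of the question):

```python
import numpy as np

def sigmoid_calibrate(scores, labels, lr=0.5, steps=2000):
    """Fit a, b in sigma(a*x + b) by gradient descent on the log-loss.
    Illustrative sketch only; any convex optimizer would do."""
    a, b = 1.0, 0.0
    for _ in range(steps):
        z = a * scores + b
        p = 1.0 / (1.0 + np.exp(-z))
        g = p - labels                    # dLoss/dz for the log-loss
        a -= lr * np.mean(g * scores)
        b -= lr * np.mean(g)
    return a, b

# Toy data: raw scores that are systematically over-confident.
rng = np.random.default_rng(0)
scores = rng.uniform(0.0, 1.0, 500)
labels = (rng.uniform(0.0, 1.0, 500) < 0.375 + 0.25 * scores).astype(float)
a, b = sigmoid_calibrate(scores, labels)
```

On this toy data the calibrated probabilities $\sigma(ax+b)$ achieve a lower log-loss than the raw scores taken as probabilities.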

It does most of the job, except that it cannot be reduced to the identity map for any values of $a, b$. This matters because I need to regularize (to prevent over-fitting) by penalizing the distance to the identity map.

Can you suggest some function templates that are:

  • shaped like an S-curve,
  • differentiable (at least once),
  • deformable to the identity map?

BEST ANSWER

A family with these properties is $$f_\alpha(x) = \dfrac{x^\alpha}{x^\alpha + (1-x)^\alpha}\quad \alpha\geq1.$$

It equals the identity at $\alpha = 1$, and for $\alpha > 1$ it has the shape of a sigmoid; the larger $\alpha$, the steeper the curve (the slope at $x = \tfrac12$ is exactly $\alpha$).
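These properties are easy to verify numerically; a minimal NumPy sketch (the helper name `f` is mine):

```python
import numpy as np

def f(x, alpha):
    """f_alpha(x) = x^alpha / (x^alpha + (1-x)^alpha) on [0, 1]."""
    xa = x ** alpha
    return xa / (xa + (1.0 - x) ** alpha)

x = np.linspace(0.01, 0.99, 99)

# alpha = 1 gives the identity map.
assert np.allclose(f(x, 1.0), x)

# alpha > 1 gives an increasing S-curve through (1/2, 1/2).
y = f(x, 3.0)
assert np.all(np.diff(y) > 0)

# The slope at the midpoint equals alpha (central difference check).
h = 1e-5
slope = (f(0.5 + h, 3.0) - f(0.5 - h, 3.0)) / (2 * h)
```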

In your original example, the parameter $b$ shifts the curve left and right along the axis. If you need that, you can use

$$f_{\alpha,b}(x) = \dfrac{(x-b)^\alpha}{(x-b)^\alpha + (1-x+b)^\alpha},\quad \alpha\geq1,$$ where $b$ plays the same role, shifting the family left and right.
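One caveat worth checking numerically (my own observation, not part of the answer): for non-integer $\alpha$, $(x-b)^\alpha$ is only real for $x \geq b$, so the shifted family is well defined on $[b, 1+b]$. At $\alpha = 1$ it reduces to the shifted identity $x - b$:

```python
import numpy as np

def f_shift(x, alpha, b):
    """Shifted family; for non-integer alpha it needs b <= x <= 1 + b
    so that (x - b) stays non-negative."""
    u = (x - b) ** alpha
    v = (1.0 - x + b) ** alpha
    return u / (u + v)

x = np.linspace(0.2, 0.9, 50)

# alpha = 1 reduces to the shifted identity x - b.
assert np.allclose(f_shift(x, 1.0, 0.1), x - 0.1)

# Outside [b, 1 + b], non-integer alpha produces NaN (fractional power
# of a negative number), so the domain restriction matters in practice.
bad = f_shift(np.array([0.05]), 1.5, 0.1)
```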