I am trying to model a phenomenon where the output depends linearly on a variable $x$, except when $x$ becomes large where its contribution is less important.
I would like to use a function looking like:
- $f(x) \sim x$ for $x < x_0$
- $f(x) \sim \alpha x $ for $x_0 < x < x_{max}$ ($ 0 < \alpha < 1$)
- Ideally, $f(x) = $ constant for $x > x_{max}$
with three parameters ($x_0$, $\alpha$ and $x_{max}$).
But I would like the function to be differentiable everywhere except maybe at 0 (for minimization purposes), and the slope changes to be smooth (OK, this last criterion is just because it is nicer :) ).
I have investigated
- $\operatorname{arsinh}$, but it gives no freedom on the second slope (i.e. $\alpha$)
- generalized logistic functions, but I couldn't find a parametrization that makes them close to the identity for small $x$.
If someone has an idea for a function that looks like this, it would be much appreciated!
Thanks.
Piecewise linear functions can be approximated using for example $\theta_n(x) = \frac1n \ln(1+e^{nx}),$ which are good approximations of $$\theta(x) := \begin{cases}0 & \text{if $x<0$} \\ x & \text{if $x>0$} \end{cases}$$
Your function can be written exactly as $$f(x) = x - (1-\alpha)\theta(x-x_0) - \alpha\theta(x-x_{\text{max}})$$ and therefore approximated as $$f(x) \approx x - (1-\alpha)\theta_n(x-x_0) - \alpha\theta_n(x-x_{\text{max}})$$
Test different values of $n$ to see what is accurate enough for you: larger $n$ tracks the piecewise-linear target more closely but makes the corners sharper.
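To illustrate, here is a small numerical sketch in Python (function names `theta_n` and `f_smooth` are my own). It evaluates the smoothed $\theta_n$ via `np.logaddexp` for numerical stability, since $e^{nx}$ overflows for large $nx$ even though $\frac1n\ln(1+e^{nx})$ itself is well-behaved:

```python
import numpy as np

def theta_n(x, n):
    # Smooth approximation of theta(x) = max(0, x):
    # (1/n) * ln(1 + e^{n x}), computed stably as logaddexp(0, n*x)/n.
    return np.logaddexp(0.0, n * x) / n

def f_smooth(x, x0, alpha, xmax, n=10.0):
    # f(x) = x - (1 - alpha)*theta(x - x0) - alpha*theta(x - xmax),
    # with theta replaced by its smooth approximation theta_n.
    return x - (1 - alpha) * theta_n(x - x0, n) - alpha * theta_n(x - xmax, n)

# Example parameters: x0 = 1, alpha = 0.3, xmax = 5, sharpness n = 50.
# Away from the two kinks the result matches the three regimes:
#   x < x0:        f(x) ~ x
#   x0 < x < xmax: f(x) ~ alpha*x + (1 - alpha)*x0
#   x > xmax:      f(x) ~ (1 - alpha)*x0 + alpha*xmax  (constant)
x = np.linspace(-1.0, 8.0, 200)
y = f_smooth(x, x0=1.0, alpha=0.3, xmax=5.0, n=50.0)
```

Note that the piecewise-linear limit is only approached as $n \to \infty$; for moderate $n$ the approximation is off by up to roughly $\frac{\ln 2}{n}$ near each slope change.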