Function that converges to zero but can be stretched


To implement a recursive least squares algorithm, I am looking for a function of the form: $$ y(x,t)= x + \omega_{1} f_{1}(x,t) + \omega_{2} f_{2}(x,t) + \dots + \omega_{n} f_{n}(x,t) $$ Each function is weighted by a factor that I want to optimize. The graph should start at the value $x$ for $t=0$ and converge towards zero over time, and I want the weights to influence how fast the function converges. An obvious example of what I mean would be: $$ y(t)= x - \omega_{0} t $$ With a higher weight this function decreases faster over time, and with a weight close to zero it takes longer. But this function obviously does not converge towards zero; it just keeps decreasing past it. That convergence is what I am looking for.
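To make the two requirements concrete, here is a small sketch of the linear example above: the function `y_linear` and the weight name `w0` are just illustrative stand-ins for $y(t)=x-\omega_0 t$. It shows that a larger weight does speed up the descent, but the function overshoots zero instead of converging to it.

```python
# Linear candidate from the question: y(t) = x - w0*t.
# A larger weight w0 drives y toward zero faster, but the
# function does not stop at zero -- it keeps decreasing.

def y_linear(x, t, w0):
    """Starts at x for t = 0 and decreases at constant rate w0."""
    return x - w0 * t

x0 = 10.0
fast = [y_linear(x0, t, w0=2.0) for t in range(8)]
slow = [y_linear(x0, t, w0=0.5) for t in range(8)]
print(fast)  # [10.0, 8.0, 6.0, 4.0, 2.0, 0.0, -2.0, -4.0]
print(slow)  # [10.0, 9.5, 9.0, 8.5, 8.0, 7.5, 7.0, 6.5]
```

The fast variant crosses zero at $t=5$ and goes negative, which is exactly the flaw described above.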

Additionally, at the end I want to integrate the terms, which is where I got stuck when I tried something like: $$ y(x,t)= x - \min(x,t) $$ The integral is supposed to look like this
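For reference, here is a numeric sketch of that $\min$-based candidate (the names `y_min` and `integral_y_min` are mine): it decreases linearly from $x$, hits zero at $t=x$, and stays there, so its integral over $[0,T]$ settles at the triangle area $x^{2}/2$ once $T \ge x$.

```python
# Candidate from the question: y(x,t) = x - min(x,t).
# Linear decay from x to 0, then clamped at exactly 0.

def y_min(x, t):
    return x - min(x, t)

def integral_y_min(x, T, steps=1000):
    """Trapezoidal approximation of the integral of y_min over [0, T]."""
    h = T / steps
    total = 0.5 * (y_min(x, 0.0) + y_min(x, T))
    for k in range(1, steps):
        total += y_min(x, k * h)
    return total * h

print(y_min(4.0, 2.0))            # 2.0: still decreasing
print(y_min(4.0, 10.0))           # 0.0: clamped at zero
print(integral_y_min(4.0, 10.0))  # ~8.0, i.e. 4**2 / 2
```

The kink at $t=x$ is what makes this candidate awkward to integrate in closed form, even though it does converge to zero.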

It feels like the solution shouldn't be that hard, but I just cannot find it, so any help is appreciated.