I have a function $f: \mathbb{R} \to \mathbb{R}$. At each time step I receive an input $x_t$ and, by feeding it through $f$, I obtain an output $y_t = f(x_t)$. There is no noise; the function is deterministic.
How can I approximate this function at time step $\tau$? What is this task called? (Online regression?) What are some example algorithms for it, and what is the current state of the art?
I would like to find a method that can be updated at each time step. This is what I mean (see the sketch after the list):
- $t=1$: I receive $x_1$ and I obtain $y_1 = f(x_1)$. I use the pair $(x_1, y_1)$ to learn a first approximation $\hat{f}_1$
- $t=2$: I receive $x_2$ and I obtain $y_2 = f(x_2)$. I use $\{(x_1, y_1), (x_2, y_2)\}$ to update $\hat{f}_1$ to $\hat{f}_2$.
- ...
- $t=\tau$: I receive $x_\tau$ and I obtain $y_\tau = f(x_\tau)$. I use $\{(x_t, y_t)\}_{t=1}^{\tau}$ to update $\hat{f}_{\tau-1}$ to $\hat{f}_\tau$.
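For concreteness, the loop I have in mind looks roughly like this (the `model.update` interface is purely illustrative, not a specific library):

```python
# A minimal sketch of the online loop above; `model` is a hypothetical
# object with an `update` method, used only to illustrate the protocol.
def run_online(f, xs, model):
    for x in xs:            # t = 1, 2, ..., tau
        y = f(x)            # noiseless observation y_t = f(x_t)
        model.update(x, y)  # refine \hat{f}_{t-1} into \hat{f}_t
    return model            # \hat{f}_tau, the approximation at time tau
```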
Check out Newton's method of divided differences, which interpolates $f$ with a polynomial. When a new point arrives, only one new diagonal of the divided-difference table has to be computed, so each update costs $O(\tau)$ work and adds a single new coefficient.
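Here is a minimal Python sketch of that incremental update (the class and its interface are my own illustration, not a library API):

```python
import math


class NewtonInterpolator:
    """Newton divided-difference interpolant, built one point at a time."""

    def __init__(self):
        self.xs = []     # inputs x_1, ..., x_tau seen so far
        self.coefs = []  # Newton coefficients: coefs[k] = f[x_1, ..., x_{k+1}]
        self._diag = []  # newest diagonal of the divided-difference table

    def update(self, x, y):
        """Add one observation (x, y) in O(tau) time."""
        new_diag = [y]
        for j in range(1, len(self._diag) + 1):
            # Recurrence: f[x_{m-j},...,x_m] =
            #   (f[x_{m-j+1},...,x_m] - f[x_{m-j},...,x_{m-1}]) / (x_m - x_{m-j})
            new_diag.append((new_diag[j - 1] - self._diag[j - 1])
                            / (x - self.xs[-j]))
        self._diag = new_diag
        self.xs.append(x)
        self.coefs.append(new_diag[-1])  # the single new Newton coefficient

    def __call__(self, x):
        """Evaluate the current interpolant in Newton (Horner) form."""
        result = self.coefs[-1]
        for k in range(len(self.coefs) - 2, -1, -1):
            result = result * (x - self.xs[k]) + self.coefs[k]
        return result


# Example: approximate f = sin from a stream of points.
interp = NewtonInterpolator()
for x in [0.0, 0.5, 1.0, 1.5, 2.0]:
    interp.update(x, math.sin(x))
print(interp(0.75), math.sin(0.75))  # interpolant vs. true value
```

Note that the interpolant passes exactly through all observed points, which is appropriate here precisely because the observations are noiseless.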