Optimizing recursive functions with time series data


I have a recursive function: $f(0;a)$ is known, and $f(t+1;a,b)=f(t;a,b)+g(t,b)$, where $a,b$ are constants and $g$ is a known function. I also have a sequence of data, $D(t)$. I am trying to optimize $f$ with respect to $a,b$, i.e. solve:

$\operatorname*{argmin}_{a,b} \frac{1}{N} \sum_{i=1}^N \left(D(i) - f(i;a,b)\right)^2$

Is there a way to optimize recursive functions such as these? I don't think I can take the derivative with respect to $a,b$, because the expanded expression for $f(t;a,b)$ grows more complicated as $t \to \infty$.


The natural alternative is optimization over the complete state sequence: introduce the states $x_i$ as decision variables and minimize $\sum_i (D_i-x_i)^2$ subject to the constraints $x_{i+1} = x_i + g(i,b)$ and $x_0 = f(0;a)$. This is how finite-horizon control and estimation problems over dynamical systems are often formulated in the control community. It introduces many variables, but the resulting problem is very sparse and structured.
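A minimal sketch of this formulation using `scipy.optimize.minimize` with equality constraints. The dynamics here are assumptions for illustration only: $x_0 = a$ and $g(t,b) = b\sin t$ stand in for the OP's $f(0;a)$ and $g$, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder dynamics (assumptions, not from the question):
#   x_0     = a
#   x_{t+1} = x_t + g(t, b),  with  g(t, b) = b * sin(t)
def g(t, b):
    return b * np.sin(t)

# Synthetic data generated with a = 1.5, b = 0.8 plus small noise.
N = 30
rng = np.random.default_rng(0)
x_true = 1.5 + np.concatenate(([0.0], np.cumsum(0.8 * np.sin(np.arange(N - 1)))))
D = x_true + 0.01 * rng.standard_normal(N)

def objective(z):
    # z = [a, b, x_0, ..., x_{N-1}]; the objective touches only the states.
    x = z[2:]
    return np.mean((D - x) ** 2)

def dynamics_residual(z):
    # One equality constraint per step: x_0 - a, then x_{t+1} - x_t - g(t, b).
    a, b, x = z[0], z[1], z[2:]
    return np.concatenate(([x[0] - a],
                           x[1:] - x[:-1] - g(np.arange(N - 1), b)))

z0 = np.concatenate(([0.0, 0.0], D))  # warm-start the states at the data
res = minimize(objective, z0, method="SLSQP",
               constraints={"type": "eq", "fun": dynamics_residual})
a_hat, b_hat = res.x[0], res.x[1]
```

Note the structure: the objective only involves the states, and each constraint couples only neighbouring states plus the two parameters, which is exactly the sparsity a structured solver can exploit.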

Another alternative is to treat the objective as a black box, i.e. use a solver to which you simply supply the objective value at the current iterate of $(a,b)$. This keeps the number of variables small, but information is lost: since you do not supply derivatives, the solver has to resort to numerical differentiation (although a least-squares problem in $R^2$ should be trivial unless there is something nasty in the model).
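This simulate-and-fit approach can be sketched with `scipy.optimize.least_squares`, which only needs residual evaluations and falls back on finite differences for the Jacobian. The same placeholder dynamics as above ($x_0 = a$, $g(t,b)=b\sin t$) are assumed.

```python
import numpy as np
from scipy.optimize import least_squares

# Run the recursion forward for given (a, b); g(t, b) = b*sin(t) is an
# assumed example, not the OP's actual g.
def simulate(a, b, N):
    x = np.empty(N)
    x[0] = a
    for t in range(N - 1):
        x[t + 1] = x[t] + b * np.sin(t)
    return x

N = 30
rng = np.random.default_rng(1)
D = simulate(1.5, 0.8, N) + 0.01 * rng.standard_normal(N)

# Only two decision variables: the solver differentiates the residual
# numerically, which costs a handful of extra simulations per iteration.
res = least_squares(lambda p: D - simulate(p[0], p[1], N), x0=[0.0, 0.0])
a_hat, b_hat = res.x
```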

But most importantly, when you say you cannot take the derivative, do you mean you cannot derive it by hand, or do you mean automatic differentiation packages would fail if used inside a solver equipped with them?

Also, there are methods to compute the gradient specifically for nonlinear least-squares estimation problems of this type without explicitly working with the expanded expression for $f(t;a,b)$.
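One such method is forward sensitivity analysis: propagate $\partial x_t/\partial a$ and $\partial x_t/\partial b$ alongside the recursion itself, so the gradient of the mean-squared error comes out of a single forward pass and no symbolic expansion is ever formed. A sketch under the same assumed dynamics ($x_0 = a$, $g(t,b) = b\sin t$, so $\partial g/\partial b = \sin t$); swap in your own $g$ and its derivative:

```python
import numpy as np

def loss_and_grad(a, b, D):
    """Mean-squared error and its gradient w.r.t. (a, b) via forward
    sensitivities, for x_0 = a, x_{t+1} = x_t + b*sin(t)."""
    N = len(D)
    x, sa, sb = np.empty(N), np.empty(N), np.empty(N)
    x[0], sa[0], sb[0] = a, 1.0, 0.0        # dx_0/da = 1, dx_0/db = 0
    for t in range(N - 1):
        x[t + 1] = x[t] + b * np.sin(t)
        sa[t + 1] = sa[t]                   # the update does not involve a
        sb[t + 1] = sb[t] + np.sin(t)       # dg/db = sin(t)
    r = D - x
    J = np.mean(r ** 2)
    grad = np.array([-2 * np.mean(r * sa), -2 * np.mean(r * sb)])
    return J, grad
```

The cost is one extra recursion per parameter, so for two parameters this is essentially free, and the gradient can be handed to any smooth solver.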