I am analyzing data collected from electronics embedded in industrial machinery. Two years' worth of samples describing each part's accuracy are to be fit to a cubic function:
f(t) = a + b·t + c·t^2 + d·t^3
This is used as an approximation of how the accuracy drifts over time, and a curve-fitting algorithm is applied to the data to choose the best-fitting values for the parameters a through d.
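To make the current batch approach concrete, here is a minimal sketch of what I do today for a single part, using numpy's polynomial fitting; the sample data and noise level are made up for illustration:

```python
import numpy as np

# Hypothetical example: two years of daily accuracy readings for one part.
rng = np.random.default_rng(0)
t = np.arange(730.0)  # days since installation
true_drift = 1.0 + 2e-3 * t - 3e-6 * t**2 + 1e-9 * t**3
accuracy = true_drift + rng.normal(0.0, 0.01, t.size)

# Ordinary least-squares cubic fit over the full history.
# np.polyfit returns coefficients highest power first: [d, c, b, a].
d, c, b, a = np.polyfit(t, accuracy, deg=3)
```

The problem is that this rereads all 730 samples every time it runs, and that cost is multiplied across millions of parts.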
All good, but this has to be done for millions of such parts, which leads me to ask: is there a curve-fitting algorithm that can do this incrementally, or is such a thing not possible?
By incrementally, I mean: suppose I had two years of data today and ran the algorithm over all of it to choose parameters a through d. Tomorrow I get another data point. Can I compute accurate updated values for a through d using only the new point and yesterday's values of a through d, without revisiting the full data set? If so, it would greatly reduce the computing power, and the shifting around of data, needed to refit millions of parts.
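To show the kind of interface I am hoping exists, here is my own sketch of one idea I had: instead of keeping yesterday's a through d directly, keep running sums for the least-squares normal equations, so each new sample is an O(1) update. The class name and method names are mine, and whether this stays numerically stable over years of data is exactly the part I am unsure about:

```python
import numpy as np

class IncrementalCubicFit:
    """Sketch of an incremental cubic least-squares fit.

    Maintains the running sums X'X (4x4) and X'y (4,) of the normal
    equations, so adding a sample never touches old data. Solving the
    4x4 system at any time gives the same coefficients as a batch fit
    over all samples seen so far.
    """

    def __init__(self):
        self.XtX = np.zeros((4, 4))
        self.Xty = np.zeros(4)

    def add_sample(self, t, y):
        x = np.array([1.0, t, t**2, t**3])  # cubic design row for time t
        self.XtX += np.outer(x, x)
        self.Xty += y * x

    def coefficients(self):
        # Solve (X'X) [a, b, c, d]' = X'y for the current best fit.
        return np.linalg.solve(self.XtX, self.Xty)
```

The state per part is just 20 numbers (16 for X'X plus 4 for X'y) rather than the full two-year history, which is the storage and compute saving I am after, if this approach is actually sound.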
If it is not possible for this cubic, are there other families of curves whose fitting algorithms can work incrementally?