I am writing a program to measure the error of a neural network during training, and I want to know when a particular network has reached its optimum by watching its error rate over a period of time. If I continue to train that network for X more iterations and the error rate doesn't change much beyond a certain range, I say the training has gone 'stale', and therefore it should not continue.
Suppose the error rate is $x$, $i$ is the iteration, and $x_i$ is the error at iteration $i$, measured by the backpropagation method. A function $f(i)$ summarizes the error trend so far, and $|x_i - f(i)|$ measures the staleness at iteration $i$ (it should approach 0 as $i$ increases). I am using a simple average for $f$ right now, but I am looking for something a bit more robust: if $x_0$ is a very large number like 100000, I don't want to need 100000 more iterations with $x$ close to 0 before the average drops far enough to conclude staleness.
This is the function if I just averaged the errors.
$$f(1) = x_1$$ $$f(i) = \frac{f(i-1)*(i-1) + x_i}{i}$$
The training has gone stale if $|x_i - f(i)| < C$ and $i > I$, where $C$ is the staleness constant, e.g. 0.01, and $I$ is the minimum number of iterations before the staleness check kicks in.
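A minimal sketch of this simple-average check in Python; the function name `make_average_checker` and its parameters are illustrative, not from any library. It also shows the problem described above: one huge initial error keeps the running average high for a very long time.

```python
def make_average_checker(C=0.01, I=10):
    """Return a closure that consumes errors x_i and reports staleness."""
    state = {"i": 0, "f": 0.0}

    def check(x):
        state["i"] += 1
        i = state["i"]
        # f(i) = (f(i-1) * (i-1) + x_i) / i  -- incremental simple average
        state["f"] = (state["f"] * (i - 1) + x) / i
        # stale once i > I and the error sits within C of the average
        return i > I and abs(x - state["f"]) < C

    return check

# With a constant error, staleness fires as soon as i > I.
check = make_average_checker(C=0.01, I=10)
# With x_0 = 100000 followed by zeros, f(i) = 100000 / i, so the
# average needs ~10^7 iterations to fall below C = 0.01.
```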
Apologies in advance if the title is confusing; I am not sure how to word the problem better.
Going to answer my own question here. The answer I was looking for is the Exponential Moving Average.
$$f(1)=x_1$$ $$f(i)=\alpha*x_i+(1-\alpha)f(i-1)$$
Then to check whether training has gone stale, I just need to check if $|x_i - f(i)| < \epsilon$, where $\epsilon$ plays the role of the staleness constant $C$ above.
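The same sketch with the EMA in place of the simple average; again the names (`make_ema_checker`, `alpha`, `eps`) are my own for illustration. Because old errors decay geometrically with weight $(1-\alpha)^k$, a single huge $x_0$ stops mattering after a few hundred iterations instead of tens of thousands.

```python
def make_ema_checker(alpha=0.1, eps=0.01, I=10):
    """Return a closure that consumes errors x_i and reports staleness
    using an exponential moving average."""
    state = {"i": 0, "f": None}

    def check(x):
        state["i"] += 1
        if state["f"] is None:
            state["f"] = x  # f(1) = x_1
        else:
            # f(i) = alpha * x_i + (1 - alpha) * f(i-1)
            state["f"] = alpha * x + (1 - alpha) * state["f"]
        return state["i"] > I and abs(x - state["f"]) < eps

    return check

# With x_0 = 100000 followed by zeros, f decays as 100000 * (1 - alpha)^k,
# so with alpha = 0.1 it drops below eps = 0.01 after ~155 iterations.
```

The choice of `alpha` trades off responsiveness against noise: larger values forget old errors faster but make the check more sensitive to a single flat stretch.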