I have a list of the average price of an item in a game over time. Things don't tend to move much. I am wondering how I can detect whether a new value inserted is a surprising movement in price.
I assume the finance area has a lot of models for this kind of thing. I'm looking for something simple, and ideally something already available in a mathematics library for any given language.
There isn't a best answer to this question until you define exactly what you mean by "surprising". But one simple method is to keep a running estimate of the mean and variance of the past values (all of them, only the more recent ones, or perhaps a Recursive Least Squares estimate that discounts older values). This can be done incrementally with a simple update, e.g. for the mean:
$\langle x \rangle_n = \frac{n-1}{n}\langle x \rangle_{n-1} + \frac{1}{n} x_n$
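The update above can be sketched in Python; the variance can be tracked incrementally the same way using Welford's algorithm (the class and attribute names here are illustrative, not from any particular library):

```python
class RunningStats:
    """Incrementally tracks the mean and variance of a stream of values."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        # <x>_n = (n-1)/n * <x>_{n-1} + x_n/n, rearranged as mean += delta/n
        delta = x - self.mean
        self.mean += delta / self.n
        # Welford's update for the second moment
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # Sample variance; needs at least two observations to be meaningful.
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0
```

Feeding each new price through `update` keeps both estimates current in O(1) time per value, without storing the history.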
Say your code has the estimate of the mean as `mean` and the estimate of the variance as `variance`. Then one way to measure the 'surprise' of a new value `x` could be `abs(x - mean) / sqrt(variance)`, i.e. how many standard deviations the new value sits from the mean.

Unfortunately, this is not an unbiased estimator. If you want it to be unbiased, you should replace `sqrt(variance)` with the mean absolute deviation of your values; for the estimator to be unbiased, we need $E[\text{surprise}]=1$.

A more advanced method would involve some machine learning techniques. You could form a statistical model of the values you are obtaining, for example by modelling them as a Gaussian process. This would allow you to calculate the probability of the next data point taking a certain value: the lower the probability of the value you actually observe, the more surprising it is.
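The surprise measure described above is just a z-score; a minimal sketch (function name is illustrative):

```python
import math

def surprise(x, mean, variance):
    """Distance of x from the running mean, in standard deviations.

    `mean` and `variance` are the current running estimates of the
    price series; larger return values mean more surprising moves.
    """
    return abs(x - mean) / math.sqrt(variance)
```

For example, with a running mean of 10 and variance of 4, a new price of 12 is exactly one standard deviation out, while a price of 20 is five standard deviations out and almost certainly a surprising move.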
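The probabilistic idea can be sketched with a much simpler model than a full Gaussian process: assume each value is independently Gaussian around the running mean (a real Gaussian process would additionally model correlations across time). Under that assumption, the two-sided tail probability of the new value is computable from the standard library alone:

```python
import math

def tail_probability(x, mean, variance):
    """Probability of a value at least as far from the mean as x,
    under an independent-Gaussian model of the price series.

    Lower values mean the observed price is more surprising.
    """
    z = abs(x - mean) / math.sqrt(variance)
    # P(|Z| >= z) for a standard normal Z, via the complementary error function
    return math.erfc(z / math.sqrt(2.0))
```

A value exactly at the mean gives probability 1 (not surprising at all), while a value many standard deviations away gives a probability near 0; you could flag a price move whenever this drops below some threshold like 0.01.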