I am trying to devise a method for determining whether plotted data has reached equilibrium. In most cases calculating a linear least-squares fit and checking the slope is enough. However, I am now plotting data that spans 15 orders of magnitude, from 1 down to 10^-15. Down there the slope is going to be very small in any case, so simply testing whether the slope is smaller than some threshold will not work.
How do I check for relative change in a way that doesn't depend on the magnitude of the value? Do I just divide the change (over duration z) by the average of the Y values over that duration? Intuitively that seems to give some sort of unitless relative change over the duration z.
1 to 0.9 in one second = 10.5% change/s
0.1 to 0.09 in one second = 10.5% change/s
0.000 000 000 000 001 to 0.000 000 000 000 000 9 in one second = 10.5% change/s
Is it this simple or am I missing something?
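The three examples above can be checked directly. This is a minimal sketch of the "change divided by the average" idea (the function name and the midpoint average are my own choices; note the average in the denominator is why each case gives 10.5 % rather than 10 %):

```python
def relative_change(y_start, y_end, duration=1.0):
    """Unitless relative change per unit time: change over the interval
    divided by the mean value over that interval."""
    delta = y_start - y_end
    mean = (y_start + y_end) / 2.0
    return delta / mean / duration

# The three example pairs, spanning 15 orders of magnitude:
pairs = [(1.0, 0.9), (0.1, 0.09), (1e-15, 0.9e-15)]
rates = [relative_change(a, b) for a, b in pairs]
# Each pair yields the same ~10.5 % per second regardless of scale.
```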
I take z points covering the desired span of time from the end of the data.
Adjust x/time values so the data starts from 0
Calculate linear least squares
Relative change in the given sample duration = ((Slope * Duration + Offset) / Offset) - 1, which simplifies to Slope * Duration / Offset.
(Thanks Henrik)
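The steps above can be sketched in Python with NumPy (the function name and window argument are my own; `np.polyfit` with degree 1 stands in for "calculate linear least squares"):

```python
import numpy as np

def relative_change_over_window(x, y, z):
    """Relative change over the last z samples.

    Assumes x is sorted in ascending order and y is the measured value.
    """
    xs = np.asarray(x[-z:], dtype=float)   # last z points
    ys = np.asarray(y[-z:], dtype=float)
    xs = xs - xs[0]                        # shift so the window starts at t = 0
    slope, offset = np.polyfit(xs, ys, 1)  # linear least squares
    duration = xs[-1]
    # ((slope * duration + offset) / offset) - 1  ==  slope * duration / offset
    return (slope * duration + offset) / offset - 1.0
```

For equilibrium detection you would then test whether the absolute value of this result is below some tolerance, independent of the magnitude of y.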