The R/S algorithm is well known for estimating the Hurst exponent of a given data set. I have a query about whether it is always well defined.
Suppose that in a given partition all of the values are equal to some constant $C$. Then the cumulative deviation series is finite, but the standard deviation is zero.
Therefore the rescaled range involves a division by zero, which is not well defined, and I can find nothing online about what to do in this case. It seems reasonable to ignore such partitions, as the range is essentially infinite relative to the standard deviation?
Thanks
See the algorithm here: http://www.bearcave.com/misl/misl_tech/wavelets/hurst/
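For what it's worth, if the window mean is subtracted before cumulating (as in the linked algorithm), then for a constant window the cumulative deviations are all zero too, so the range $R$ collapses to zero along with $S$ and the expression is $0/0$ rather than $R/0$. A minimal sketch of a single R/S window illustrating this (the helper name is my own):

```python
import numpy as np

def rs_window(x):
    """R and S for one window, following the usual R/S recipe:
    subtract the window mean, take the running sum of the deviations,
    and report the range of that sum alongside the standard deviation."""
    x = np.asarray(x, dtype=float)
    dev = x - x.mean()            # deviations from the window mean
    cum = np.cumsum(dev)          # cumulative deviation series
    r = cum.max() - cum.min()     # the range R
    s = x.std()                   # the standard deviation S
    return r, s

# Degenerate window: every value equals the same constant C.
r, s = rs_window([5.0] * 10)
print(r, s)  # both are 0.0: with the mean removed, R vanishes as well as S
```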
Good question!
Remember that the Hurst exponent measures the persistence of a time series: the closer $H$ is to zero, the more anti-persistent, or "mean-reverting", the series is. If the standard deviation of a window is zero, the series in that window never moves away from its mean at all, so it behaves like the limiting case of an extremely anti-persistent series. In this sense, a standard deviation of zero is just a degenerate special case rather than a flaw in the algorithm.
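In practice one common convention is simply to skip the degenerate windows when averaging $R/S$ at each scale. A minimal sketch of the full estimator under that convention (the function name, window sizes, and the skip rule are my own choices, not part of any standard):

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by the R/S method.

    Windows with zero standard deviation are skipped, which is one
    possible way to handle the degenerate constant-window case."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            s = w.std()
            if s == 0.0:
                continue          # skip degenerate (constant) windows
            cum = np.cumsum(w - w.mean())
            rs_vals.append((cum.max() - cum.min()) / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    # Slope of log(R/S) against log(n) estimates H.
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

# Sanity check: white noise should give H in the vicinity of 0.5.
rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096), [16, 32, 64, 128, 256])
```

Note that small-sample R/S estimates for white noise are known to be biased somewhat above 0.5, so the check is only a rough one.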