I was doing some calculations in MATLAB and noticed a pattern that may be obvious to stats experts, but that I hadn't noticed before.
If I take a time series and remove a linear trend (detrend it), the mean of the detrended data is zero, as expected.
However, I was surprised to find that the standard deviation of the detrended data is roughly the same as that of the original data. Is this what one should expect?
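Here is a minimal reproduction of what I'm seeing, translated to Python/NumPy for illustration (the series, slope, and noise level are made up; the least-squares fit-and-subtract mimics what MATLAB's `detrend` does):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.arange(n)
# hypothetical series: a weak linear trend buried in unit-variance noise
x = 0.001 * t + rng.normal(0.0, 1.0, n)

# least-squares linear fit, then subtract it (the linear-detrend step)
slope, intercept = np.polyfit(t, x, 1)
y = x - (slope * t + intercept)

print(np.mean(y))            # essentially zero
print(np.std(x), np.std(y))  # the two standard deviations come out close
```

In this sketch the detrended standard deviation is slightly smaller than the original, but the two are close because the trend is weak relative to the noise.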
Thanks!