What does "the data is within $0.5\sigma$" mean?
I want to ask about the phrase "this data is within $0.5\sigma$". If I understand correctly, the person first calculates the standard deviation ($\sigma$) of their data, multiplies it by $0.5$, and excludes the data points that fall outside this $0.5\sigma$ range. Is this correct?
Or is this $0.5\sigma$ range perhaps related to a quantile range (say, the range between the $0.05$ and $0.95$ quantiles of the data)?
Which one is correct? Maybe I'm all wrong :)
Essentially, it is the first. To state that data is within $0.5\sigma$ means that it lies within half a standard deviation of the mean, i.e., in the interval $[\mu - 0.5\sigma,\ \mu + 0.5\sigma]$. It is not the same as a quantile range: for normally distributed data, $\pm 0.5\sigma$ covers only about the central $38\%$ of values (roughly between the $0.31$ and $0.69$ quantiles), far less than the $90\%$ lying between the $0.05$ and $0.95$ quantiles.
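
For concreteness, here is a minimal Python sketch (the synthetic normal sample is just an assumed example; any 1-D numeric array works the same way) of how one might select the points within $0.5\sigma$ of the mean:

```python
import numpy as np

# Assumed example data: 1000 draws from a normal distribution
rng = np.random.default_rng(42)
data = rng.normal(loc=10.0, scale=2.0, size=1000)

mu = data.mean()           # sample mean
sigma = data.std(ddof=1)   # sample standard deviation

# A point is "within 0.5 sigma" if its distance from the mean
# is at most half a standard deviation.
within = np.abs(data - mu) <= 0.5 * sigma
subset = data[within]

print(f"mean = {mu:.3f}, sigma = {sigma:.3f}")
print(f"{within.mean():.1%} of points lie within 0.5 sigma of the mean")
# For normal data this fraction is about 38%,
# since Phi(0.5) - Phi(-0.5) ~= 0.383.
```

Running this on normal data prints a fraction near $38\%$, which illustrates why "within $0.5\sigma$" is not the same as keeping the central $90\%$ of the data.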