Given a simple data set, the mean minus two standard deviations comes out as a negative number. Shouldn't the two-standard-deviation bounds (min and max) fall within the range of the data set?
My data set: 2 31 32 22 10 43 12 46 48 51 51 52 82 88 10 01 07 12 31 40 15 41 30
Average: 66.2
Standard deviation: 45.02
Lower two-standard-deviation bound: -23.84 (average - 2 × stddev)
Upper two-standard-deviation bound: 156.24 (average + 2 × stddev)
https://simple.wikipedia.org/wiki/Standard_deviation#/media/File:Standard_deviation_diagram.svg
Assuming your calculations are correct, you have a mean $\mu=66.2$ and standard deviation $\sigma=45.02$, so $\mu-2\sigma=-23.84$ and $\mu+2\sigma=156.24$. I've never heard anyone call $\mu-2\sigma$ the "second standard deviation", and there is no theoretical reason for it to be $\ge 0$. The standard deviation measures spread around the mean, not bounds on the data: for a skewed data set, the interval $\mu\pm2\sigma$ can easily extend beyond the observed minimum and maximum.
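A quick sketch of the point, using your reported summary statistics and a small made-up skewed sample (the sample values are my own illustration, not your data):

```python
import statistics

# The poster's reported summary statistics, taken at face value.
mu = 66.2
sigma = 45.02
print(mu - 2 * sigma)  # ≈ -23.84: negative, yet nothing is wrong
print(mu + 2 * sigma)  # ≈ 156.24

# A skewed, entirely nonnegative sample (hypothetical illustration):
data = [1, 1, 1, 1, 1, 1, 1, 1, 1, 100]
m = statistics.mean(data)    # 10.9
s = statistics.pstdev(data)  # population standard deviation: 29.7
print(m - 2 * s)             # -48.5: below the minimum of the data
```

Even though every value in `data` is positive, one large outlier inflates the standard deviation enough that $\mu-2\sigma$ falls well below the data's minimum. Note `pstdev` divides by $n$; `stdev` (dividing by $n-1$) would give a slightly larger value and an even more negative bound.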