How to determine if Standard Deviation is high/low


I have derived the following response time data for a performance test I am running:

Min: 8 sec, Max: 284 sec, Average: 28 sec, Standard Deviation: 27 sec

What does the standard deviation say about the response time data distribution? When you say low/high standard deviation, what does this actually mean? Is this in comparison to the Average/Min/Max?

I know what standard deviation is and how it's computed. I'm just not sure how to tell if it is high or low.


BEST ANSWER

If you take your cues from the financial industry, you can use the coefficient of variation (CV), which is the standard deviation divided by the mean. Dividing by the mean normalizes the standard deviation so that it can be compared across data with different scales.

As a rule of thumb, a CV >= 1 indicates a relatively high variation, while a CV < 1 can be considered low.
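Applied to the figures in the question, this rule of thumb can be sketched in a few lines of Python (the 1.0 cutoff is just the rule of thumb above, not a universal constant):

```python
# Coefficient of variation (CV) for the response-time stats in the question.
mean = 28.0  # average response time (sec)
sd = 27.0    # standard deviation (sec)

cv = sd / mean  # 27 / 28 ~= 0.96

# By the CV >= 1 rule of thumb, this data set is (just barely) "low" variation.
label = "high" if cv >= 1.0 else "low"
print(f"CV = {cv:.2f} -> {label} variation")
```

Note that 0.96 sits right on the boundary, which matches the intuition that a standard deviation almost equal to the mean indicates substantial spread.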

Some references to its usage as a "rule of thumb":

http://www.readyratios.com/reference/analysis/coefficient_of_variation.html

http://www.mhnocc.org/forum/index.php?t=msg&goto=234&


When trying to figure this out myself, I opted for using the standard deviation as a percentage of the range. Looking at this graph:

[Graph: distributions over the same range with standard deviations of 5, 10, and 20]

From that image I would say that the SD of 5 is clustered, the SD of 20 definitely is not, and the SD of 10 is borderline.

More mathematically,

  • The SD of 5 has 68% of the values within 10% of the range
  • The SD of 10 has 68% of the values within 20% of the range
  • The SD of 20 has 68% of the values within 40% of the range

So, as a purely internal measure of high/low standard deviation, I chose to say that if the SD is less than 10% of the range it's low, and if it's greater than 10% of the range it's high.

But you could of course choose different percentages based on your own data sets.
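Applying this answer's 10% cutoff to the figures from the question gives the following sketch (the cutoff is this answer's own choice, not a standard threshold):

```python
# Standard deviation as a percentage of the range, using the question's figures.
minimum, maximum, sd = 8.0, 284.0, 27.0

rng = maximum - minimum              # 284 - 8 = 276 sec
sd_pct_of_range = sd / rng * 100.0   # 27 / 276 ~= 9.8%

threshold = 10.0  # the 10%-of-range cutoff chosen in this answer
label = "low" if sd_pct_of_range < threshold else "high"
print(f"SD is {sd_pct_of_range:.1f}% of the range -> {label}")
```

By this measure the question's data lands just under the cutoff (about 9.8%), so it would be classified "low", although the single 284 sec maximum stretches the range enough that the result is sensitive to outliers.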

https://www.scribbr.com/statistics/standard-deviation/