How do I state that a data set has a 'denser' standard deviation?


Suppose I have two algorithms that produce numerical data.

The first algorithm produces { 2452, 695, 318, ... } with a mean of 1155 and a standard deviation of 1138.

The second algorithm produces { 35036, 29720, 31744, ...} with a mean of 32170 and a standard deviation of 2683.

I would like to formally describe the two results in terms of how 'dense' the output is about the mean. The first algorithm produces a lower standard deviation (1138), but this value is nearly as large as the mean of 1155, so my gut says that distribution is very wide relative to its center. The second algorithm produces a larger standard deviation (2683), but this value is small compared to its mean of 32170, so its output seems comparatively tightly clustered.

What is the correct formal description of this 'denseness' of the standard deviation around the mean? What can I formally say when comparing algorithms 1 and 2?
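To make the intuition concrete, here is a small sketch computing the ratio of standard deviation to mean for the two samples (this scale-free quantity is commonly called the coefficient of variation). Note the full data sets are elided above, so the code uses only the three listed values from each; the resulting numbers are illustrative, not exact:

```python
import statistics

def coefficient_of_variation(data):
    # Sample standard deviation divided by the mean:
    # a unitless measure of spread relative to the center.
    return statistics.stdev(data) / statistics.mean(data)

algo1 = [2452, 695, 318]        # first three values from algorithm 1
algo2 = [35036, 29720, 31744]   # first three values from algorithm 2

cv1 = coefficient_of_variation(algo1)
cv2 = coefficient_of_variation(algo2)
print(f"algorithm 1: CV = {cv1:.3f}")  # close to 1: spread comparable to the mean
print(f"algorithm 2: CV = {cv2:.3f}")  # well below 1: tightly clustered
```

On these samples the first ratio comes out near 1 while the second is under 0.1, matching the gut feeling that algorithm 2's output is far 'denser' about its mean despite its larger absolute standard deviation.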