There are many measures of variation/dispersion from the mean, but standard deviation can be confusing.
If we are measuring error/variability, then 1σ is better than 2σ. That is to say, having most of the data lie within only 1σ of the mean is better than having the data scattered across 2σ.
BUT, if we are measuring a quality standard - i.e. the acceptable range of deviation from the mean - then the higher the number of acceptable deviations, the better.
The larger the range of acceptable deviation, the lower the error rate:
μ ± 1σ → error 1 in 3
μ ± 3σ → error 1 in 370
μ ± 6σ → error 1 in 506,797,346
http://en.wikipedia.org/wiki/68%E2%80%9395%E2%80%9399.7_rule#Higher_deviations
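These rates can be reproduced from the normal tail probability; below is a minimal Python sketch (assuming scipy is available) that recomputes the table above:

```python
# Reproduce the "error 1 in N" rates: the probability that a normal
# observation falls outside mu +/- k*sigma, for k = 1, 3, 6.
from scipy.stats import norm

for k in (1, 3, 6):
    p_outside = 2 * norm.sf(k)   # sf(k) = 1 - cdf(k), the upper tail
    print(f"mu +/- {k} sigma -> error 1 in {1 / p_outside:,.0f}")
```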
Is this a correct description of what standard deviation measures in quality measurement?
All comments and answers are welcome.
Thank you in advance to everyone who answers.
There are at least two senses in which you can use the standard deviation in quality measurement:

(1) as a measure of the inherent variability of your manufacturing process, and

(2) as the unit in which you express a pre-specified tolerance interval, i.e. how many process standard deviations fit between the process mean and the tolerance limits.
The more fundamental use of the standard deviation is (1), where you are characterizing how well controlled your manufacturing process is. In this case, the larger the standard deviation, the lower the quality of your manufacturing process. This is regardless of the actual standards you need to meet: if process A has a higher standard deviation than process B, then for any tolerance interval $\pm a$ mm, process A will generate more bad products than process B.
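To make this concrete, here is a small illustrative sketch (the mean, standard deviations, and tolerance are made-up numbers, not from the question) comparing two hypothetical processes against the same tolerance interval:

```python
# Two hypothetical processes with the same mean but different standard
# deviations, checked against one fixed tolerance interval mu +/- a.
from scipy.stats import norm

mu = 0.2   # target diameter in mm (illustrative)
a = 0.03   # tolerance half-width in mm: parts must fall in [mu - a, mu + a]
for name, sigma in [("A", 0.02), ("B", 0.01)]:
    # fraction of output falling outside the tolerance interval
    p_bad = norm.cdf(mu - a, mu, sigma) + norm.sf(mu + a, mu, sigma)
    print(f"process {name}: sigma = {sigma} mm -> {p_bad:.2%} bad products")
```

As expected, the higher-variance process A produces more out-of-tolerance parts for the same interval.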
Now, the second use of standard deviation, (2) from above, is probably what is causing your confusion. In this case, you are correct that more standard deviations indicate a higher-quality process, but these standard deviations are NOT the same thing as in (1). To illustrate, imagine that you have a process that produces bearings with mean diameter 0.2 mm and standard deviation 0.01 mm (normally distributed). Now, suppose your tolerance interval is $\pm 0.05$ mm. How many standard deviations does $\pm 0.05$ mm represent, given the observed process standard deviation of 0.01 mm? In this case, you would say that your process is $5\sigma$, in that the predetermined tolerance limits sit 5 standard deviations away from your mean process output (which I assumed to be unbiased), as calculated from the underlying process data.
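A quick sketch of that calculation, using the numbers from the example (again assuming scipy for the tail probability):

```python
# Sigma level of the bearing example: how many process standard deviations
# fit between the mean and the pre-specified tolerance limits.
from scipy.stats import norm

tolerance = 0.05    # tolerance half-width in mm (mu +/- 0.05)
process_sd = 0.01   # observed process standard deviation in mm

sigma_level = tolerance / process_sd   # -> 5.0, i.e. a "5 sigma" process
p_bad = 2 * norm.sf(sigma_level)       # fraction falling outside tolerance
print(f"{sigma_level:.0f} sigma process, roughly 1 in {1 / p_bad:,.0f} defective")
```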
So, you see, you want the standard deviation of your manufacturing process to be low, which increases the number of standard deviations that fit inside your pre-specified tolerance interval. The two uses of $\sigma$ do not mean the same thing.