SD / σ

Standard Deviation 

Standard Deviation (SD, also represented by the Greek letter sigma, σ) is a statistical term that measures how much a set of data varies from the mean value of the data. In many real-world applications, groups of observations of quantities, such as people's height, incomes, amount of television watched, etc., follow distributions. If you count the number of occurrences of particular values, some values are more likely than others. While there are different distributions for different quantities, the most common is the so-called normal or Gaussian distribution. This distribution reaches its maximum in the middle, at the most likely outcome (known as the mean), and has the same shape above and below the mean value. The width of the curve is important and is measured through a parameter called the standard deviation.

If the data lies within a narrow band of values then the SD / σ will be small. Conversely, if the data has a wide range of values then the SD / σ will be large. For a random variable that has a normal distribution, 68% of the values lie within one SD / σ either side of the mean, 95% lie within two SD / σ either side of the mean, and 99.7% lie within three SD / σ either side of the mean. Applications of relevance to inspection include error of measurement, that is, the error in measuring the length or depth of defects with a particular technique. A highly accurate technique would exhibit small measurement errors, and the SD / σ of a group of measurements would also be small.
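The 68/95/99.7 rule above can be checked numerically. The sketch below, using illustrative values only (a hypothetical 5.0 mm defect measured with a technique whose random error has SD 0.2 mm), simulates repeated measurements and counts how many fall within one, two, and three SD of the mean:

```python
import random
import statistics

# Hypothetical example: simulate 10,000 repeated depth measurements of a
# 5.0 mm defect, with normally distributed measurement error of SD 0.2 mm
# (both values are illustrative assumptions, not from the article).
random.seed(42)
measurements = [random.gauss(5.0, 0.2) for _ in range(10_000)]

mean = statistics.fmean(measurements)
sd = statistics.stdev(measurements)  # sample standard deviation

# For a normal distribution, the fractions within 1, 2, and 3 SD of the
# mean approach 68%, 95%, and 99.7% respectively.
for k in (1, 2, 3):
    within = sum(abs(x - mean) <= k * sd for x in measurements)
    print(f"within {k} SD: {within / len(measurements):.1%}")
```

A less accurate technique would correspond to a larger SD in the simulation, widening the band of values around the mean.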

For more information on SD/σ see:

What the hec?! articles are not intended to be the definitive account on the topic or acronym in question. Readers’ comments and contributions are welcomed. Email: