The Variance and Standard Deviation

A statistic that provides key information about a distribution is the variance, which indicates the variability of a group of measurements (how spread out the distribution is). Distributions may have the same mean but different variances (Figure 22.9). The larger the variance, the greater the spread of measurements in a distribution about its mean.

The variance (s²) is defined as the average squared deviation from the mean:

$$s^2 = \frac{\sum (x_i - \bar{x})^2}{n - 1}$$

To calculate the variance, we (1) subtract the mean from each measurement and square the value obtained, (2) add all the squared deviations, and (3) divide this sum by the number of original measurements minus one.
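As an illustration of these three steps, here is a minimal sketch in Python; the function name sample_variance and the example measurements are illustrative choices, not taken from the text.

```python
def sample_variance(measurements):
    """Sample variance: the average squared deviation from the mean,
    dividing by n - 1 as described above."""
    n = len(measurements)
    mean = sum(measurements) / n
    # (1) Subtract the mean from each measurement and square the result.
    squared_deviations = [(x - mean) ** 2 for x in measurements]
    # (2) Add all the squared deviations; (3) divide the sum by n - 1.
    return sum(squared_deviations) / (n - 1)

# Five hypothetical measurements with mean 11:
print(sample_variance([10, 12, 9, 11, 13]))  # 2.5
```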

Another statistic that is closely related to the variance is the standard deviation (s), which is defined as the square root of the variance:

$$s = \sqrt{s^2}$$

Whereas the variance is expressed in units squared, the standard deviation is in the same units as the original measurements; so the standard deviation is often preferred for describing the variability of a measurement.
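Continuing the sketch above, the standard deviation is simply the square root of that variance; the use of math.sqrt and the reuse of the hypothetical sample_variance function are assumptions of this example.

```python
import math

def sample_standard_deviation(measurements):
    """Standard deviation: the square root of the sample variance,
    so it is expressed in the same units as the measurements."""
    return math.sqrt(sample_variance(measurements))

# Same five hypothetical measurements as before:
print(sample_standard_deviation([10, 12, 9, 11, 13]))  # about 1.58
```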

A normal distribution is symmetrical; so the mean and standard deviation are sufficient to describe its shape. The mean plus or minus one standard deviation (x̄ ± s) includes approximately 68% of the measurements in a normal distribution; the mean plus or minus two standard deviations (x̄ ± 2s) includes approximately 95% of the measurements.
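These percentages can be checked numerically by simulating a large sample from a normal distribution and counting how many values fall within one and two standard deviations of the mean. The sketch below reuses the sample_standard_deviation function defined earlier; the chosen mean (100), standard deviation (15), and sample size are arbitrary.

```python
import random

random.seed(1)
# Simulate 100,000 values from a normal distribution with mean 100 and sd 15.
data = [random.gauss(100, 15) for _ in range(100_000)]

mean = sum(data) / len(data)
sd = sample_standard_deviation(data)

within_1sd = sum(mean - sd <= x <= mean + sd for x in data) / len(data)
within_2sd = sum(mean - 2 * sd <= x <= mean + 2 * sd for x in data) / len(data)

print(f"within one sd of the mean:  {within_1sd:.1%}")   # roughly 68%
print(f"within two sds of the mean: {within_2sd:.1%}")   # roughly 95%
```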

[Figure 22.9: Distributions with the same mean (x̄) but different variances.]
