Standard deviation and variance

Video masterclass

Topic summary

Standard deviation and variance are fundamental measures of spread in a data set, indicating how much individual data points differ from the mean.

Variance

Variance quantifies the average of the squared differences from the mean. It provides a measure of how far the data points are spread out from the average value. The formula for the population variance (\(\sigma^2\)) is: \[ \sigma^2 = \frac{\sum (x_i - \mu)^2}{n} \] (For a sample rather than a full population, the sum is divided by \(n - 1\) instead of \(n\) to correct for bias.) Where:
  • \(x_i\) is each data point.
  • \(\mu\) is the mean of the data set.
  • \(n\) is the total number of data points.
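The formula above can be sketched directly in code. This is a minimal illustration, not part of the original lesson; the function name `variance` and the sample data are chosen for demonstration:

```python
def variance(data):
    # Population variance: the mean of squared deviations from the mean
    mu = sum(data) / len(data)          # the mean, \mu
    return sum((x - mu) ** 2 for x in data) / len(data)

# Example data set with mean 5
print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0
```

Each term \((x_i - \mu)^2\) is non-negative, so variance is always \(\geq 0\), and it is 0 only when every data point equals the mean.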

Standard deviation

The standard deviation is the square root of the variance and is expressed in the same units as the original data, making it more interpretable: \[ \sigma = \sqrt{\sigma^2} \]
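Continuing the same hypothetical example: taking the square root of the variance computed above gives the standard deviation in the original units of the data:

```python
import math

data = [2, 4, 4, 4, 5, 5, 7, 9]
mu = sum(data) / len(data)                       # mean \mu = 5.0
var = sum((x - mu) ** 2 for x in data) / len(data)  # variance \sigma^2 = 4.0
sigma = math.sqrt(var)                           # standard deviation \sigma
print(sigma)  # 2.0
```

Because the data are in the original units (not squared units), a statement like "most values lie within one standard deviation of the mean" is directly meaningful.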

Interpretation

A low standard deviation indicates that the data points are close to the mean, while a high standard deviation signifies that the data points are spread out over a wider range. This information is crucial in statistics, as it helps assess the reliability and variability of data.

Extra questions (ultimate exclusive)

Ultimate members get access to four additional questions with full video explanations.