What is the difference between range and standard deviation?

The range and the standard deviation both measure how spread out the values in a dataset are, but they tell us different things. For example, suppose a professor administers an exam to her students. She can use the range to understand the difference between the highest score and the lowest score received by the students in the class, and she can use the standard deviation to quantify how far the typical exam score deviates from the mean exam score.

It often makes sense to report both metrics, since they provide us with different information. However, both the range and the standard deviation suffer from one drawback: they are influenced by outliers.
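To see how a single outlier pulls both measures upward, here is a minimal Python sketch; the exam scores are made-up values purely for illustration, not data from the example that follows.

```python
import statistics

# Hypothetical exam scores (made-up values purely for illustration)
scores = [70, 72, 75, 78, 80, 82, 85]
scores_with_outlier = scores + [100]  # add one unusually high score

for label, data in [("without outlier", scores),
                    ("with outlier", scores_with_outlier)]:
    data_range = max(data) - min(data)   # range = largest value - smallest value
    sd = statistics.stdev(data)          # sample standard deviation
    print(f"{label}: range = {data_range}, SD = {sd:.2f}")
```

Adding the single high score increases both the range and the standard deviation, even though most of the scores are unchanged.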

Consider the following dataset: 1, 4, 8, 11, 13, 17, 19, 19, 20, 23, 24, 24, 25, 28, 29, 31, 32. We could use a calculator (or statistical software) to find the range and the standard deviation of this dataset. The most awesome thing about the standard deviation is that we can use it not only to describe data but also to conduct further analyses, such as ANOVA or multiple linear regression. The standard deviation is a reliable way to measure how variable the data is, for both a sample and a population.
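As a sketch of that calculation (using Python's standard library rather than a calculator), the snippet below computes the range and the sample standard deviation of the dataset above.

```python
import statistics

data = [1, 4, 8, 11, 13, 17, 19, 19, 20, 23, 24, 24, 25, 28, 29, 31, 32]

data_range = max(data) - min(data)   # range: largest value minus smallest value (32 - 1 = 31)
sample_sd = statistics.stdev(data)   # sample standard deviation (n - 1 in the denominator)

print("Range:", data_range)
print("Sample standard deviation:", round(sample_sd, 2))
```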

Of course, we cannot truly know the standard deviation of a population, but we can estimate it from the standard deviation of a sample. A deviation is how much a score differs from the overall mean of the data; in the case of our example data, it is how much each value differs from the mean of the dataset. We generally use s to denote the sample standard deviation. Just as with the range, the larger the differences between the values, the greater the deviations and the higher the variability.

On a side note, the deviations should always add up to zero. It may seem odd that the deviation scores sum to zero while the standard deviation can be a non-zero value, but this is because of the way the standard deviation is calculated: it is built from the sum of squared deviations rather than the raw deviation scores. The formula for the sample standard deviation looks like this: s = √[ Σ(x − x̄)² / (n − 1) ], where x̄ is the mean of the data and n is the number of values.
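Here is a rough sketch of that calculation in Python, showing that the deviations sum to zero while the standard deviation, built from the sum of squared deviations, does not.

```python
import math
import statistics

data = [1, 4, 8, 11, 13, 17, 19, 19, 20, 23, 24, 24, 25, 28, 29, 31, 32]

mean = sum(data) / len(data)
deviations = [x - mean for x in data]   # how far each value sits from the mean

# The raw deviations cancel each other out (zero, up to floating-point error)
print("Sum of deviations:", round(sum(deviations), 10))

# Squaring the deviations keeps them from cancelling
sum_of_squares = sum(d ** 2 for d in deviations)
sample_sd = math.sqrt(sum_of_squares / (len(data) - 1))   # s = sqrt( sum of squares / (n - 1) )

print("SD computed by hand:", round(sample_sd, 2))
print("SD from the library:", round(statistics.stdev(data), 2))
```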

So, for our X1 dataset the standard deviation is 7, while for the X2 dataset it is only 1. This represents a HUGE difference in variability. So how small should the standard deviation be? That is another great question, and one that I wish I had a hard and fast answer for. In general, the closer your standard deviation is to zero, the less variability there is in your data. An example is: 45, 45, 45, 45, 45. In this case, the range is 0, and since the values are all the same, there is no deviation from the mean, so the SD is also 0.

If the data set were: 43, 44, 45, 46, 47, the range here is 4. You can see the data is more spread out than in the first example, so the SD will be larger than 0; in this case it is about 1.6. If the data set were: 41, 43, 45, 47, 49, the data is even more spread out, with a range of 8 and an SD of about 3. However, the range is not tied to the SD. If the data set were: 41, 45, 45, 45, 49, the range is still 8, but the values are clustered closer to the mean, so the SD would be less than 3 (about 2.8).
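A short sketch that runs those comparisons, computing the range and the sample standard deviation for each of the small datasets above:

```python
import statistics

datasets = {
    "all identical (45 x 5)": [45, 45, 45, 45, 45],
    "evenly spread, range 4": [43, 44, 45, 46, 47],
    "evenly spread, range 8": [41, 43, 45, 47, 49],
    "clustered, range 8":     [41, 45, 45, 45, 49],
}

for label, data in datasets.items():
    data_range = max(data) - min(data)
    sd = statistics.stdev(data)   # sample standard deviation; 0 when every value is identical
    print(f"{label}: range = {data_range}, SD = {sd:.2f}")
```

The last two datasets share the same range but have different standard deviations, which is exactly why the two measures are not interchangeable.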


