# What are standard deviation percentiles?


Standard deviation percentiles describe the percentage of observations that fall above or below the mean. In statistical analysis, the average of all numerical scores or occurrences is known as the mean. Since not all of the data collected will be equal to the mean, the standard deviation measures how far the data typically spread from the mean. In a normal distribution, 50 percent of observations fall below the mean and 50 percent fall above it.
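As a minimal sketch of this idea, Python's standard library can compute the mean and standard deviation of a dataset and confirm that the mean sits at the 50th percentile of a normal distribution. The exam scores below are made up for illustration:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical exam scores, for illustration only
scores = [55, 62, 68, 70, 70, 72, 75, 78, 81, 89]

mu = mean(scores)      # average of the dataset
sigma = stdev(scores)  # sample standard deviation

# In a normal distribution, exactly half the probability
# mass lies below the mean, so the mean is the 50th percentile.
dist = NormalDist(mu, sigma)
print(dist.cdf(mu))  # 0.5
```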

One useful way to think of standard deviation percentiles is as the share of observations that fall within a range of numerical scores. For example, consider the final exam scores of a group of college students in an economics course. The mean represents the average score and corresponds to the 50th percentile. Scores that fall within one or two standard deviations of the mean are assigned correspondingly higher or lower percentiles.
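Assuming the scores are roughly normally distributed, any individual score can be converted to a percentile through the cumulative distribution function. A short sketch, with a hypothetical mean of 70 and standard deviation of 10:

```python
from statistics import NormalDist

# Hypothetical exam statistics: mean 70, standard deviation 10
exam = NormalDist(mu=70, sigma=10)

def percentile_of(score: float) -> float:
    """Percentage of scores expected to fall below `score`."""
    return 100 * exam.cdf(score)

print(round(percentile_of(70)))  # 50: the mean is the 50th percentile
print(round(percentile_of(80)))  # 84: one standard deviation above
print(round(percentile_of(60)))  # 16: one standard deviation below
```

Note how one standard deviation above the mean lands near the 84th percentile, not the 75th; rough groupings like "75th percentile" are a simplification of these exact values.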

Standard deviation percentiles that fall below the mean in a normal distribution are less than the 50th; those above, or to the right of, the mean are greater than the 50th. For example, if the average exam score is 70, a score in the range of 71 to 81 might be grouped near the 75th percentile, while a score between 59 and 69 would likely be grouped near the 25th percentile.
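Such range-to-percentile groupings can be checked directly. The sketch below, assuming a hypothetical mean of 70 and standard deviation of 10, computes the share of scores expected inside the 71-to-81 band and the percentile of its midpoint:

```python
from statistics import NormalDist

# Hypothetical distribution of exam scores: mean 70, sd 10
exam = NormalDist(mu=70, sigma=10)

# Share of scores expected to land inside the band 71-81
share = exam.cdf(81) - exam.cdf(71)
print(f"{share:.1%} of scores fall between 71 and 81")

# Percentile rank of the band's midpoint, 76
mid_pct = 100 * exam.cdf((71 + 81) / 2)
print(f"the midpoint of the band is near percentile {mid_pct:.0f}")
```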

Graphical displays of standard deviation percentiles are often used to judge how significant a specific score is. Individuals can use average salary statistics to see whether a given income is significantly higher or lower than the average. For example, a salary at the 90th percentile of a normal distribution means the individual earns more than 90 percent of their peers. Standard deviation percentiles can also be grouped into spreads or ranges around the mean of the dataset.
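The salary example can also be run in reverse: given a percentile, the inverse cumulative distribution function returns the corresponding cutoff value. A sketch, with entirely made-up salary figures:

```python
from statistics import NormalDist

# Hypothetical salary distribution: mean $60,000, sd $15,000
salaries = NormalDist(mu=60_000, sigma=15_000)

# The 90th-percentile salary: the income exceeded by only 10% of peers
cutoff = salaries.inv_cdf(0.90)
print(f"90th percentile salary: ${cutoff:,.0f}")

# Check: a salary at the cutoff beats 90% of the distribution
print(f"{salaries.cdf(cutoff):.0%} of salaries fall below it")
```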