What is the Average IQ for children, teens and adults?

Many people have written in to ask whether their IQ falls within the average. Let me start by laying the groundwork before explaining what constitutes an average IQ.

When Frenchman Alfred Binet developed the first ‘modern-day’ IQ tests in the early 1900s, a simple equation was used to establish someone’s IQ. The intelligence quotient (IQ) was presented as follows:

IQ = (MA / CA) x 100

Where MA was the Mental Age of the test taker. That is, MA corresponded to how well the test taker did relative to children of different ages, while CA was simply the test taker’s Chronological Age, i.e. their calendar age.

So if a seven-year-old (CA = 7) was performing, across a range of cognitive ability tests, at the level of the average five-year-old (MA = 5), then this seven-year-old would have an IQ as follows:

IQ = (5 / 7) x 100 ≈ 71 points. In other words, this little boy was slow for his age.

If another seven-year-old boy performed on a cognitive assessment at a level consistent with his seven-year-old peer group, then CA = MA = 7 and his IQ would be computed as follows under Binet:

IQ = (7 / 7) x 100 = 100. So this little boy would be average for his age, with an IQ of 100.

If, however, a little girl was also seven (CA = 7) but was able to perform at the level of the average eight-year-old (MA = 8), then this little girl’s IQ would be:

IQ = (8 / 7) x 100 ≈ 114. So this little girl would be advanced for her age.

Based on this early quotient for calculating IQ, the average level of IQ corresponded to someone whose cognitive ability was consistent with his or her age group, so that person’s IQ would have been 100. In short: if MA = CA, then IQ = 100.
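For readers who like to see the arithmetic spelled out, here is a minimal Python sketch that reproduces the three examples above. The function name and the rounding to the nearest whole point are my own choices for illustration; the formula itself is just MA / CA x 100.

```python
def ratio_iq(mental_age: float, chronological_age: float) -> int:
    """Binet-style ratio IQ: (MA / CA) x 100, rounded to the nearest point."""
    return round(mental_age / chronological_age * 100)

# The three seven-year-olds from the examples above
print(ratio_iq(5, 7))  # 71  -> performing behind his age group
print(ratio_iq(7, 7))  # 100 -> exactly average for his age
print(ratio_iq(8, 7))  # 114 -> performing ahead of her age group
```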

In the 1930s, Wechsler – another IQ test inventor – would go on to replace the old quotient scale with standard scores. This was made possible because it had been established that IQ scores in general were normally distributed. And the beauty of the normal distribution is that it can be described using only two parameters: the mean and the standard deviation.

With standard scores, the mean or average IQ score was kept at 100, while the standard deviation could vary depending on the test. No matter what the standard deviation of a test is, however, we know that roughly 68% of scores will lie within one standard deviation of the mean, while roughly 95% of test results will fall within two standard deviations of the mean. These are simply statistical properties of the normal distribution.
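If you want to check those proportions yourself, the short sketch below uses nothing more than Python’s standard library (math.erf) to compute the share of a normal distribution that falls within one and two standard deviations of the mean. It is only a numerical check of the rule of thumb quoted above.

```python
import math

def share_within(k_sd: float) -> float:
    """Proportion of a normal distribution lying within k standard deviations of the mean."""
    return math.erf(k_sd / math.sqrt(2))

print(f"within 1 SD: {share_within(1):.1%}")  # ~68.3%
print(f"within 2 SD: {share_within(2):.1%}")  # ~95.4%
```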

Today, the most popular IQ tests have a standard deviation of 15, 16 or 24.

However, assuming a standard deviation of 15, psychologists have broadly agreed on the following bands with respect to IQ scores (a short sketch after the list shows how they map onto individual scores):

an IQ score below 70 is usually associated with cognitive impairment

an IQ score between 71 and 80 is considered ‘borderline’ or well below average

an IQ score between 81 and 90 is considered ‘low average’

an IQ score between 91 and 110 is considered ‘average’

an IQ score between 111 and 119 is considered ‘high average’

an IQ score between 120 and 129 is considered ‘superior’

an IQ score above 130 is considered ‘very superior’
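To make those bands easier to apply, here is a small illustrative sketch that maps a score on a test with a standard deviation of 15 onto the labels above. The cut-offs simply mirror the list; where a score falls exactly on a boundary (for example 70 or 130), assigning it to the adjacent band is my own assumption, since different test manuals draw the lines slightly differently.

```python
def classify_iq(score: int) -> str:
    """Map an IQ score (on an SD = 15 scale) to the labels listed above."""
    if score < 70:
        return "cognitive impairment"
    elif score <= 80:
        return "borderline / well below average"
    elif score <= 90:
        return "low average"
    elif score <= 110:
        return "average"
    elif score <= 119:
        return "high average"
    elif score <= 129:
        return "superior"
    else:
        return "very superior"

print(classify_iq(100))  # average
print(classify_iq(125))  # superior
```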

So the question about average IQ depends on the context.

For instance, in terms of the population as a whole, an IQ score between 91 and 110 is considered by psychologists to be ‘average IQ’.

If I look at the same question in statistical terms, roughly 68% of the population is expected to have an IQ between 85 and 115 (one standard deviation either side of 100), so this wider band could also be argued to be ‘average IQ’.
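The same statistics can be run in the other direction: given an individual score, we can estimate what percentile of the general population it corresponds to. The sketch below assumes the usual mean of 100 and a standard deviation of 15, and again relies only on Python’s standard library.

```python
import math

def iq_percentile(score: float, mean: float = 100.0, sd: float = 15.0) -> float:
    """Percentile of an IQ score, assuming a normal distribution with mean 100 and SD 15."""
    return 0.5 * (1 + math.erf((score - mean) / (sd * math.sqrt(2))))

print(f"IQ 85:  {iq_percentile(85):.0%}")   # ~16th percentile
print(f"IQ 100: {iq_percentile(100):.0%}")  # 50th percentile
print(f"IQ 115: {iq_percentile(115):.0%}")  # ~84th percentile
```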

But then again, the average IQ depends on one’s perspective. For instance, medical doctors are commonly reported to have an average IQ of around 125, which means that having an IQ of 100 would not be average among this group of professionals.

So average is a relative term. And the question becomes: “relative to what? or to whom?”

At iq-brain.devv.website, we have created a number of IQ tests which will position your IQ relative to the general population as a whole.

Click here to take our fluid intelligence test now.