
American IQs rose 30 points in the last century. Now, they may be falling.

A new study of human intelligence tells a story that may surprise the general public: American IQs rose dramatically over the past century, and now they appear to be falling.

Cognitive abilities declined between 2006 and 2018 across three of four broad domains of intelligence, the study found. Researchers tracked falling scores in logic, vocabulary, visual and mathematical problem-solving, and analogies, the last category familiar to anyone who took the old SAT.

In the 12-year span, IQ scores dipped up to 2 points in the three areas of declining performance. Scores declined across age groups, education levels and genders, with the steepest drops among younger and less-educated test-takers.  

IQ scores rose in just one area, spatial reasoning, measured with problems that test the mind’s ability to analyze three-dimensional objects.

The study, authored by researchers at Northwestern University and the University of Oregon, appears in the May-June issue of the journal Intelligence. 

Researchers across the globe have been tracking an apparent decline in human IQs that began around the turn of the millennium. Theories abound as to why scores are dropping, but the smart money says our cognitive skills may have plateaued, tipping into an era of intellectual lethargy.

If you want to ascribe blame, look no further than this screen.  

Cognitive researchers hypothesize that smartphones and smart speakers, autocomplete and artificial intelligence, Wi-Fi and runaway social media have conspired to supplant the higher functions of the human brain. In its quest for labor-saving tech, the world may be dumbing itself down.  

“We’re all getting super lazy in our cognition because it’s getting super easy to do everything,” said Ruth Karpinski, a California psychologist who studies IQ. “We’re using Waze and Google Maps to get where we need to go. We’re losing our whole sense of compass.” 

The new study joins a growing body of research on something called the Flynn effect. James Flynn, a New Zealand intelligence researcher, tracked a dramatic rise in IQ scores across the 20th century.  

How dramatic? If you gave an early-1900s IQ test to a person of average intelligence in 2000, the test-taker would rate in the top 5 percent of the Teddy Roosevelt-era population in cognitive ability. 

“IQs rose all over the world, over the course of the century, about 30 points,” said Robert Sternberg, a psychology professor at Cornell who studies intelligence. “To give you a sense of what 30 points means, the average is 100. Usually, gifted would start around 130. So, we’re talking about the difference between an average IQ and a gifted one.” 
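
For a rough, back-of-the-envelope check of those figures, assume the conventional IQ scaling, a mean of 100 with a standard deviation of 15, and an approximately normal distribution of scores. A 30-point gain then works out to two standard deviations:

\[ z = \frac{130 - 100}{15} = 2, \qquad P(Z \ge 2) = 1 - \Phi(2) \approx 0.023 \]

In other words, a person scoring at today’s average of 100 would, judged against the early-1900s norms, outscore roughly 98 percent of that population, comfortably inside the top 5 percent cited above.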

Until recently, IQ scores had been rising for nearly as long as IQ tests have existed. The first tests emerged around 1905, tailored for struggling schoolchildren in France.

Intelligence testing in the United States “took off after World War I” as a selection tool in the military, said Stefan Dombrowski, a psychology professor at Rider University who studies IQ. “We needed some means to distinguish the cannon fodder from officer material.” Army IQ tests evolved into the SAT, which became the dominant college entrance exam, a rite of passage for the upwardly mobile.  

Flynn’s research found that IQ scores rose by 3 to 5 points per decade, a finding that implied humans were getting gradually smarter. Cognitive researchers spent decades debating the reasons for the Great Smartening.   

“Some people thought it was nutrition, some thought it was schooling, some thought it was better parenting,” Sternberg said. Over the course of the 20th century, Americans generally ate better, stayed in school longer and softened their once-brutal parenting techniques.

Flynn himself believed rising IQs reflected the growing complexity of human affairs. As farms gave way to factories, horses to cars, typewriters to computers, “it became more complicated to live in the world,” Sternberg said. “And so, IQs went up.” 

To some researchers, the Flynn effect undermines the credibility of the IQ. Human intelligence is supposed to be largely innate, inherited, fixed. How, then, to explain the 30-point rise, an arc that seems to have transformed the average American into Wile E. Coyote? 

“The level of gain makes it absurd to continue to believe that the tests are measuring some innate, supremely important, fixed quantity that is central to human existence,” Sternberg said. 

The Flynn effect suggests that humans can, indeed, learn the skills measured on IQ tests. 

“Children, I believe, have gotten better at taking tests,” said Donna Ford, distinguished professor of education and human ecology at the Ohio State University. “They’ve gotten wiser at taking tests because they’re so used to taking tests.” 

Test-taking skills suffused American culture well before the No Child Left Behind Act of 2001, perhaps the high-water mark of high-stakes testing in American schools.  

Another perennial concern with intelligence testing is racial and socioeconomic bias. Some historians argue that the testing movement is rooted in sustaining white privilege.  

“When these tests are made,” Ford said, “the items favor white, middle-class populations.”  

She cites an old IQ test item that asked children to explain the saying, “One swallow does not make a summer.” Children from low-income urban homes often thought the question alluded to human digestion. 

Whatever its flaws, the IQ test remains a fixture in American culture, used in various forms to document cognitive decline in Alzheimer’s and Parkinson’s patients, to flag children for special-education instruction and to spare intellectually disabled inmates from execution.

“IQ is the most researched construct in all of psychology,” Dombrowski said. Innumerable studies have linked higher or lower IQ scores to better or worse life outcomes, from academic attainment to career success. Colleges and employers often select candidates with high scores on ersatz IQ tests, whereupon the benefits of a high IQ become a self-fulfilling prophecy. 

And now, with IQ scores appearing to fall, researchers are embracing the notion that the Flynn effect may have run its course. 

“The line can’t go up forever,” said Elizabeth Dworak, lead author of the new study. “It’s called the ceiling effect. You eventually hit that threshold.” Dworak is a research assistant professor of medical social sciences at Northwestern’s Feinberg School of Medicine. 

Previous research hinted that IQ scores may have plateaued around the turn of the millennium. One Finnish study found IQ scores had dipped by 2 points between 1997 and 2009. A French study found a 4-point drop from 1999 to 2009. 

Perhaps human society has reached a cognitive peak. Computers and smartphones are more complex than ever, but human routines are oddly simpler. Generations ago, dishwashers and clothes dryers eased the physical labor of daily life. Today, iPhones and the Amazon Echo eliminate mental labor.

“When you can just say, ‘Alexa, do this,’ or, ‘Siri, do that,’ as software becomes more and more sophisticated, there’s less that the human mind has to do,” Sternberg said. 

The new study charted the decline using scores from the Synthetic Aperture Personality Assessment Project, a free, survey-based online test taken by nearly 400,000 Americans.

The research design was far from perfect. The test-takers were somewhat better-educated than the overall U.S. population. The test wasn’t delivered in a controlled setting. Test subjects “might have been sitting on the bus,” Dworak said.  

A 2018 write-up in The Washington Post sent a flood of readers to the test-taking website. “The sample started skewing older,” Dworak said.

If cognitive ability has indeed plateaued, then the next logical question is: When? Dworak would like to analyze scores by birth year.  

One day, perhaps, researchers will identify a specific moment — the launch of Wi-Fi, or the rollout of the first smart speaker — when American intelligence turned south.