By every metric (SAT, ACT, and even IQ), the intelligence and aptitude of American students have declined in recent years. As Joy Pullmann documents in an article about the results of using Common Core, student performance on state standardized tests has also gone down — that is, when the scoring isn’t changed to show better results.
Before casting blame on who’s responsible, it’s important to understand what these numbers actually mean. All of these tests attempt to measure higher-order thinking skills as they relate to abstract scenarios, nothing more or less. They do not measure the level of practical skills people might have, their emotional intelligence, or even their actual knowledge of things.
Rather, they assess students’ critical thinking (the ability to analyze), problem solving, and knowledge application. People who have these higher-order thinking skills are generally more adaptable to any kind of learning, particularly the kind of learning that happens in higher education.
Higher-order thinking is also important in a time when most activities are computerized and information comes in abstract forms such as images and text. American students who lack these skills will struggle much more when most jobs and other domains in life (communication, management, choosing a lifestyle, etc.) demand this of them.
Nature Versus Nurture
Debate abounds about whether higher-order thinking skills are a product of nature or nurture, which explains the divided responses to the decline revealed on aptitude tests. Those who believe high-level thinking comes from genes and parenting attribute the recent decline to dumber parents having more children than smarter ones — the “idiocracy hypothesis,” as Michael Knowles discusses and debunks on his podcast. This usually follows from the feeling that raw intelligence (to which people equate higher-order thinking when it is left undefined) cannot be taught.
Others believe, rightly, that changes in nurturing have led to the decline. Higher-order thinking is not raw intelligence, but more a habit of mind, and the children who practice it more can generally do it better. Scores on all these tests, including the IQ test, will change based on the person’s habits. When people read and listen more and have more leisure time for using their imaginations, they will improve in higher-order thinking, as they did for much of the 20th century.
If scores fall, this then indicates the problem lies in children’s changing habits (not changing demographics), which directly relates to their schooling and their home environment. Indeed, in the past decade, massive changes have occurred in these two areas: Public schools have adopted the Common Core approach to instruction, and homes have succumbed to screen addiction. To counteract these two major drags on young people’s intelligence, it is necessary to understand their nature and their causes — in other words, it requires using higher-order thinking.
Higher-Order Thinking Versus Lower-Order Thinking
Although Common Core is technically a set of learning mandates the federal government developed during the Obama administration, it is better to think of it as an all-encompassing mindset. Most states adopted the actual documents, but states across the country adopted Common Core’s approach in some way.
This was because its writers promised their product would finally solve the problem of chronic mediocre academic performance, particularly among students from lower socioeconomic backgrounds. Not only this, but it would do so without any costly intervention or reordering of education systems. Common Core, like the administration that endorsed it, worked smarter, not harder.
How would it do this? It would take a shortcut to higher-order thinking by eliminating lower-order thinking. Lower-order thinking mainly includes basic comprehension (literal-level understanding of texts or concepts) and rote memorization (internalizing concepts through repetition and drill). Common Core proponents dismissed these skills as mindless busywork that needed to go.
Thus, even for students in lower grades, they revamped materials and instruction so that all students could spend more time on higher-order thinking. Teachers would no longer ask them to recount the details of a story, practice multiplication tables, identify all the state capitals, or memorize the parts of a cell. Instead, they would evaluate the theme of a story, draw a table with stems and leaves illustrating what happens in a multiplication problem, create their own capital for an imaginary state, and consider alternative ways of classifying life besides cells and organelles. In all the core disciplines, skills (the realm of higher-order thinking) would take precedence over content (the realm of lower-order thinking).
What the Common Core writers did not seem to understand, however, was that higher-order thinking requires lower-order thinking. Students need to practice reading on a literal level before they look for deeper themes and arguments. They need to memorize and apply formulas in arithmetic before they can do more complex processes such as graphing and working out multiple-step solutions. And they need to know actual facts in science and social studies before they understand the theories and methods in these subjects. In short, they really do need the content before they learn the skills.
Turning to Technology
Some Common Core partisans (yes, they do exist, and they are still very influential in the education world) answered this objection by blithely asserting that technology could fill this content gap. Students could simply look up knowledge they needed from their taxpayer-bought iPads. After all, this is what adults do. No one memorizes things anymore, and everyone skims for the main ideas; this leaves their minds free to do the higher-order thinking the world requires of them.
Few people seemed to mind that serious studies never backed up these arguments or that most pedagogical theory and a century of testing data thoroughly refuted them. Educational leaders and their cronies in large textbook and technology companies could invoke labels such as “21st-century learning,” cite dubious education studies, and use up millions of taxpayer dollars to overhaul every school district’s instructional materials. In countless districts, parents and taxpayers watched helplessly as their schools squandered rainy-day funds and issued bonds for gimmicky teaching materials and useless technology.
Some may cynically respond that public education was already failing with the old curriculum, so Common Core can hardly be blamed for making it worse. In this one regard, the experts who wrote Common Core deserve credit: They actually did make things worse.
Kids went from reading at least a few easy books to reading no books, from doing basic math to doing no math, and from having a spotty knowledge of historical and scientific concepts to, again, having none. In trying to boost higher-order thinking and make learning more efficient, the Common Core approach effectively discouraged all orders of thinking and inverted learning to potentially harm students’ intellectual development.
Unfortunately, as schools continue to deal with the damage Common Core’s false promises created, the situation has become even worse at home, where the intelligence of American students is under attack from screen addiction.
Screen-Addicted Children and Complicit Parents
Article after article teems with horrific statistics about the many hours children spend on their phones or computers, but parents continue to shrug it off. After all, the kids seem fine for the most part. True, they look like zombies and seem to lack a general awareness of anything, but they are pleasantly pacified. The alternative would be taking away their phones, and this would make them unhappy and make the parents who now have to deal with them unhappy as well.
To be fair to parents who make this argument, they are entirely right — in the short term. “Peppa Pig” and “Paw Patrol” cartoons do keep their children quiet and occupied, but they also keep them from learning and growing. A screen addict, young or old, loses the ability to focus on anything for too long or think longer than a few moments. This then affects memory and comprehension, which finally affects all the higher-order thinking skills. The kids may be calmer, but they are also intellectually handicapped.
When families choose to moderate or outright eliminate screen time, students can do remarkable things. They can start reading, conversing, and experimenting. They can practice remembering things and managing their day. In many cases, it doesn’t require some expensive tutorial program or intense diet of classic texts to become intellectually competent. It simply requires discipline and common sense.
No Shortcuts in Learning
For most educators, the current dip in intelligence is nothing mysterious. The data merely confirms what everyone knows: There are no shortcuts in learning. This gives reason to hope because much of the damage can be reversed through simply recovering this truth, and some people have already started.
Some schools have successfully implemented an alternative to the skills-focused Common Core by using the content-focused and much more intuitive approach elaborated by E.D. Hirsch. This is the main factor that accounts for the success at KIPP academies, classical schools, and other institutions that hold fast to old-fashioned pedagogy.
Many of these schools have also worked to fight screen addiction. Waldorf and Montessori schools famously stress a low-tech environment. Parents, for their part, must resist the temptation to buy their child a phone and accept the ensuing chaos at home for a little while.
Any school district and household can make these changes at relatively little cost, at least materially speaking, but improvements will require significant effort and patience. If parents and educators can muster the will, they can finally stop, if not effectively reverse, the brain drain happening to today’s students and stave off the misery that would come from a nation of dunces.