
The Death Of Expertise

Image credit: Pete Prodoehl

I am (or at least think I am) an expert. Not on everything, but in a particular area of human knowledge, specifically social science and public policy. When I say something on those subjects, I expect that my opinion holds more weight than that of most other people.

I never thought those were particularly controversial statements. As it turns out, they’re plenty controversial. Today, any assertion of expertise produces an explosion of anger from certain quarters of the American public, who immediately complain that such claims are nothing more than fallacious “appeals to authority,” sure signs of dreadful “elitism,” and an obvious effort to use credentials to stifle the dialogue required by a “real” democracy.

But democracy, as I wrote in an essay about C.S. Lewis and the Snowden affair, denotes a system of government, not an actual state of equality. It means that we enjoy equal rights in relation to the government, and to each other. Having equal rights does not mean having equal talents, equal abilities, or equal knowledge. It assuredly does not mean that “everyone’s opinion about anything is as good as anyone else’s.” And yet, this is now enshrined as the credo of a fair number of people despite being obvious nonsense.

What’s going on here?

I fear we are witnessing the “death of expertise”: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all. By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields. Rather, what I fear has died is any acknowledgement of expertise as anything that should alter our thoughts or change the way we live.

This is a very bad thing. Yes, it’s true that experts can make mistakes, as disasters from thalidomide to the Challenger explosion tragically remind us. But mostly, experts have a pretty good batting average compared to laymen: doctors, whatever their errors, seem to do better with most illnesses than faith healers or your Aunt Ginny and her special chicken gut poultice. To reject the notion of expertise, and to replace it with a sanctimonious insistence that every person has a right to his or her own opinion, is silly.

Worse, it’s dangerous. The death of expertise is a rejection not only of knowledge, but of the ways in which we gain knowledge and learn about things. Fundamentally, it’s a rejection of science and rationality, which are the foundations of Western civilization itself. Yes, I said “Western civilization”: that paternalistic, racist, ethnocentric approach to knowledge that created the nuclear bomb, the Edsel, and New Coke, but which also keeps diabetics alive, lands mammoth airliners in the dark, and writes documents like the Charter of the United Nations.

This isn’t just about politics, which would be bad enough. No, it’s worse than that: the perverse effect of the death of expertise is that without real experts, everyone is an expert on everything. To take but one horrifying example, we live today in an advanced post-industrial country that is now fighting a resurgence of whooping cough — a scourge nearly eliminated a century ago — merely because otherwise intelligent people have been second-guessing their doctors and refusing to vaccinate their kids after reading stuff written by people who know exactly zip about medicine. (Yes, I mean people like Jenny McCarthy.)

In politics, too, the problem has reached ridiculous proportions. People in political debates no longer distinguish the phrase “you’re wrong” from the phrase “you’re stupid.” To disagree is to insult. To correct another is to be a hater. And to refuse to acknowledge alternative views, no matter how fantastic or inane, is to be closed-minded.

How conversation became exhausting

Critics might dismiss all this by saying that everyone has a right to participate in the public sphere. That’s true. But every discussion must take place within limits and above a certain baseline of competence. And competence is sorely lacking in the public arena. People with strong views on going to war in other countries can barely find their own nation on a map; people who want to punish Congress for this or that law can’t name their own member of the House.

None of this ignorance stops people from arguing as though they are research scientists. Tackle a complex policy issue with a layman today, and you will get snippy and sophistic demands to show ever increasing amounts of “proof” or “evidence” for your case, even though the ordinary interlocutor in such debates isn’t really equipped to decide what constitutes “evidence” or to know it when it’s presented. The use of evidence is a specialized form of knowledge that takes a long time to learn, which is why articles and books are subjected to “peer review” and not to “everyone review,” but don’t tell that to someone hectoring you about how things really work in Moscow or Beijing or Washington.

This subverts any real hope of a conversation, because it is simply exhausting — at least speaking from my perspective as the policy expert in most of these discussions — to have to start from the very beginning of every argument and establish the merest baseline of knowledge, and then constantly to have to negotiate the rules of logical argument. (Most people I encounter, for example, have no idea what a non-sequitur is, or when they’re using one; nor do they understand the difference between generalizations and stereotypes.) Most people are already huffy and offended before ever encountering the substance of the issue at hand.

Once upon a time — way back in the Dark Ages before the 2000s — people seemed to understand, in a general way, the difference between experts and laymen. There was a clear demarcation in political food fights, as objections and dissent among experts came from their peers — that is, from people equipped with similar knowledge. The public, largely, were spectators.

This was both good and bad. While it strained out the kook factor in discussions (editors controlled their letters pages, which today would be called “moderating”), it also meant that sometimes public policy debate was too esoteric, conducted less for public enlightenment and more as just so much dueling jargon between experts.

No one — not me, anyway — wants to return to those days. I like the 21st century, and I like the democratization of knowledge and the wider circle of public participation. That greater participation, however, is endangered by the utterly illogical insistence that every opinion should have equal weight, because people like me, sooner or later, are forced to tune out people who insist that we’re all starting from intellectual scratch. (Spoiler: We’re not.) And if that happens, experts will go back to only talking to each other. And that’s bad for democracy.

The downside of no gatekeepers

How did this peevishness about expertise come about, and how can it have gotten so immensely foolish?

Some of it is purely due to the globalization of communication. There are no longer any gatekeepers: the journals and op-ed pages that were once strictly edited have been drowned under the weight of self-publishable blogs. There was once a time when participation in public debate, even in the pages of the local newspaper, required submission of a letter or an article, and that submission had to be written intelligently, pass editorial review, and stand with the author’s name attached. Even then, it was a big deal to get a letter in a major newspaper.

Now, anyone can bum rush the comments section of any major publication. Sometimes, that results in a free-for-all that spurs better thinking. Most of the time, however, it means that anyone can post anything they want, under any anonymous cover, and never have to defend their views or get called out for being wrong.

Another reason for the collapse of expertise lies not with the global commons but with the increasingly partisan nature of U.S. political campaigns. There was once a time when presidents would win elections and then scour universities and think tanks for a brain trust; that’s how Henry Kissinger, Samuel Huntington, Zbigniew Brzezinski, and others ended up in government service while moving between places like Harvard and Columbia.

Those days are gone. To be sure, some of the blame rests with the increasing irrelevance of overly narrow research in the social sciences. But it is also because the primary requisite of seniority in the policy world is too often an answer to the question: “What did you do during the campaign?” This is the code of the samurai, not the intellectual, and it privileges the campaign loyalist over the expert.

I have a hard time, for example, imagining that I would be called to Washington today in the way I was back in 1990, when the senior Senator from Pennsylvania asked a former U.S. Ambassador to the UN whom she might recommend to advise him on foreign affairs, and she gave him my name. Despite the fact that I had no connection to Pennsylvania and had never worked on his campaigns, he called me at the campus where I was teaching, and later invited me to join his personal staff.

Universities, without doubt, have to own some of this mess. The idea of telling students that professors run the show and know better than they do strikes many students as something like uppity lip from the help, and so many profs don’t do it. (One of the greatest teachers I ever had, James Schall, wrote years ago that “students have obligations to teachers,” including “trust, docility, effort, and thinking,” an assertion that would produce howls of outrage from the entitled generations roaming campuses today.) As a result, many academic departments are boutiques, in which the professors are expected to be something like intellectual valets. This produces nothing but a delusion of intellectual adequacy in children who should be instructed, not catered to.

The confidence of the dumb

There’s also that immutable problem known as “human nature.” It has a name now: it’s called the Dunning-Kruger effect, which says, in sum, that the dumber you are, the more confident you are that you’re not actually dumb. And when you get invested in being aggressively dumb…well, the last thing you want to encounter are experts who disagree with you, and so you dismiss them in order to maintain your unreasonably high opinion of yourself. (There’s a lot of that loose on social media, especially.)

All of these are symptoms of the same disease: a manic reinterpretation of “democracy” in which everyone must have their say, and no one must be “disrespected.” (The verb to disrespect is one of the most obnoxious and insidious innovations in our language in years, because it really means “to fail to pay me the impossibly high requirement of respect I demand.”) This yearning for respect and equality, even—perhaps especially—if unearned, is so intense that it brooks no disagreement. It represents the full flowering of a therapeutic culture where self-esteem, not achievement, is the ultimate human value, and it’s making us all dumber by the day.

Thus, at least some of the people who reject expertise are not really, as they often claim, showing their independence of thought. They are instead rejecting anything that might stir a gnawing insecurity that their own opinion might not be worth all that much.

Experts: the servants, not masters, of a democracy

So what can we do? Not much, sadly, since this is a cultural and generational issue that will take a long time to come right, if it ever does. Personally, I don’t think technocrats and intellectuals should rule the world: we had quite enough of that in the late 20th century, thank you, and it should be clear now that intellectualism makes for lousy policy without some sort of political common sense. Indeed, in an ideal world, experts are the servants, not the masters, of a democracy.

But when citizens forgo their basic obligation to learn enough to actually govern themselves, and instead remain stubbornly imprisoned by their fragile egos and caged by their own sense of entitlement, experts will end up running things by default. That’s a terrible outcome for everyone.

Expertise is necessary, and it’s not going away. Unless we return it to a healthy role in public policy, we’re going to have stupider and less productive arguments every day. So here, presented without modesty or political sensitivity, are some things to think about when engaging with experts in their area of specialization.

  1. We can all stipulate: the expert isn’t always right.
  2. But an expert is far more likely to be right than you are. On a question of factual interpretation or evaluation, it shouldn’t engender insecurity or anxiety to think that an expert’s view is likely to be better-informed than yours. (Because, likely, it is.)
  3. Experts come in many flavors. Education enables it, but practitioners in a field acquire expertise through experience; usually the combination of the two is the mark of a true expert in a field. But if you have neither education nor experience, you might want to consider exactly what it is you’re bringing to the argument.
  4. In any discussion, you have a positive obligation to learn at least enough to make the conversation possible. The University of Google doesn’t count. Remember: having a strong opinion about something isn’t the same as knowing something.
  5. And yes, your political opinions have value. Of course they do: you’re a member of a democracy and what you want is as important as what any other voter wants. As a layman, however, your political analysis has far less value, and probably isn’t — indeed, almost certainly isn’t — as good as you think it is.

And how do I know all this? Just who do I think I am?

Well, of course: I’m an expert.

Tom Nichols is a professor of national security affairs at the U.S. Naval War College and an adjunct at the Harvard Extension School. He claims expertise in a lot of things, but his most recent book is No Use: Nuclear Weapons and U.S. National Security (Penn, 2014). The views expressed are entirely his own.