Retired AF Guy
Army.ca Veteran
A few years old but still relevant today. And while directed at the US it can apply as much to Canada or any other country in the world.
Part 1 of 2.
From Foreign Affairs March/April 2017
How America Lost Faith in Expertise and Why That’s a Giant Problem
By Tom Nichols
In 2014, following the Russian invasion of Crimea, The Washington Post published the results of a poll that asked Americans whether the United States should intervene militarily in Ukraine. Only one in six could identify Ukraine on a map; the median response was off by about 1,800 miles. But this lack of knowledge did not stop people from expressing pointed views. In fact, the respondents favored intervention in direct proportion to their ignorance. Put another way, the people who thought Ukraine was located in Latin America or Australia were the most enthusiastic about using military force there.
The following year, Public Policy Polling asked a broad sample of Democratic and Republican primary voters whether they would support bombing Agrabah. Nearly a third of Republican respondents said they would, versus 13 percent who opposed the idea. Democratic preferences were roughly reversed; 36 percent were opposed, and 19 percent were in favor. Agrabah doesn’t exist. It’s the fictional country in the 1992 Disney film Aladdin. Liberals crowed that the poll showed Republicans’ aggressive tendencies. Conservatives countered that it showed Democrats’ reflexive pacifism. Experts in national security couldn’t fail to notice that 43 percent of Republicans and 55 percent of Democrats polled had an actual, defined view on bombing a place in a cartoon.
Increasingly, incidents like this are the norm rather than the exception. It’s not just that people don’t know a lot about science or politics or geography. They don’t, but that’s an old problem. The bigger concern today is that Americans have reached a point where ignorance—at least regarding what is generally considered established knowledge in public policy—is seen as an actual virtue. To reject the advice of experts is to assert autonomy, a way for Americans to demonstrate their independence from nefarious elites—and insulate their increasingly fragile egos from ever being told they’re wrong.
This isn’t the same thing as the traditional American distaste for intellectuals and know-it-alls. I’m a professor, and I get it: most people don’t like professors. And I’m used to people disagreeing with me on lots of things. Principled, informed arguments are a sign of intellectual health and vitality in a democracy. I’m worried because we no longer have those kinds of arguments, just angry shouting matches.
When I started working in Washington in the 1980s, I quickly learned that random people I met would instruct me in what the government should do about any number of things, particularly my own specialties of arms control and foreign policy. At first I was surprised, but I came to realize that this was understandable and even to some extent desirable. We live in a democracy, and many people have strong opinions about public life. Over time, I found that other policy specialists had similar experiences, with laypeople subjecting them to lengthy disquisitions on taxes, budgets, immigration, the environment, and many other subjects. If you work on public policy, such interactions go with the job, and at their best, they help keep you intellectually honest.
In later years, however, I started hearing the same stories from doctors and lawyers and teachers and many other professionals. These were stories not about patients or clients or students raising informed questions but about them telling the professionals why their professional advice was actually misguided or even wrong. The idea that the expert was giving considered, experienced advice worth taking seriously was simply dismissed.
I fear we are moving beyond a natural skepticism regarding expert claims to the death of the ideal of expertise itself: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, teachers and students, knowers and wonderers—in other words, between those with achievement in an area and those with none. By the death of expertise, I do not mean the death of actual expert abilities, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors and lawyers and engineers and other specialists. And most sane people go straight to them if they break a bone or get arrested or need to build a bridge. But that represents a kind of reliance on experts as technicians, the use of established knowledge as an off-the-shelf convenience as desired. “Stitch this cut in my leg, but don’t lecture me about my diet.” (More than two-thirds of Americans are overweight.) “Help me beat this tax problem, but don’t remind me that I should have a will.” (Roughly half of Americans with children haven’t written one.) “Keep my country safe, but don’t confuse me with details about national security tradeoffs.” (Most U.S. citizens have no clue what the government spends on the military or what its policies are on most security matters.)
The larger discussions, from what constitutes a nutritious diet to what actions will best further U.S. interests, require conversations between ordinary citizens and experts. But increasingly, citizens don’t want to have those conversations. Rather, they want to weigh in and have their opinions treated with deep respect and their preferences honored not on the strength of their arguments or on the evidence they present but based on their feelings, emotions, and whatever stray information they may have picked up here or there along the way.
This is a very bad thing. A modern society cannot function without a social division of labor. No one is an expert on everything. We prosper because we specialize, developing formal and informal mechanisms and practices that allow us to trust one another in those specializations and gain the collective benefit of our individual expertise. If that trust dissipates, eventually both democracy and expertise will be fatally corrupted, because neither democratic leaders nor their expert advisers want to tangle with an ignorant electorate. At that point, expertise will no longer serve the public interest; it will serve the interest of whatever clique is paying its bills or taking the popular temperature at any given moment. And such an outcome is already perilously near.
A LITTLE LEARNING IS A DANGEROUS THING
Over a half century ago, the historian Richard Hofstadter wrote that “the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and comprehendingly perform for himself.”
In the original American populistic dream, the omnicompetence of the common man was fundamental and indispensable. It was believed that he could, without much special preparation, pursue the professions and run the government. Today he knows that he cannot even make his breakfast without using devices, more or less mysterious to him, which expertise has put at his disposal; and when he sits down to breakfast and looks at his morning newspaper, he reads about a whole range of vital and intricate issues and acknowledges, if he is candid with himself, that he has not acquired competence to judge most of them.
Hofstadter argued that this overwhelming complexity produced feelings of helplessness and anger among a citizenry that knew itself to be increasingly at the mercy of more sophisticated elites. “What used to be a jocular and usually benign ridicule of intellect and formal training has turned into a malign resentment of the intellectual in his capacity as expert,” he noted. “Once the intellectual was gently ridiculed because he was not needed; now he is fiercely resented because he is needed too much.”
In 2015, the law professor Ilya Somin observed that the problem had persisted and even metastasized over time. The “size and complexity of government,” he wrote, have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.” Despite decades of advances in education, technology, and life opportunities, voters now are no better able to guide public policy than they were in Hofstadter’s day, and in many respects, they are even less capable of doing so.
The problem cannot be reduced to politics, class, or geography. Today, campaigns against established knowledge are often led by people who have all the tools they need to know better. For example, the anti-vaccine movement—one of the classic contemporary examples of this phenomenon—has gained its greatest reach among people such as the educated suburbanites in Marin County, outside San Francisco, where at the peak of the craze, in 2012, almost eight percent of parents requested a personal belief exemption from the obligation to vaccinate their children before enrolling them in school. These parents were not medical professionals, but they had just enough education to believe that they could challenge established medical science, and they felt empowered to do so—even at the cost of the health of their own and everybody else’s children.
DON'T KNOW MUCH
Experts can be defined loosely as people who have mastered the specialized skills and bodies of knowledge relevant to a particular occupation and who routinely rely on them in their daily work. Put another way, experts are the people who know considerably more about a given subject than the rest of us, and to whom we usually turn for education or advice on that topic. They don’t know everything, and they’re not always right, but they constitute an authoritative minority whose views on a topic are more likely to be right than those of the public at large.
How do we identify who these experts are? In part, by formal training, education, and professional experience, applied over the course of a career. Teachers, nurses, and plumbers all have to acquire certification of some kind to exercise their skills, as a signal to others that their abilities have been reviewed by their peers and met a basic standard of competence. Credentialism can run amok, and guilds can use it cynically to generate revenue or protect their fiefdoms with unnecessary barriers to entry. But it can also reflect actual learning and professional competence, helping separate real experts from amateurs or charlatans.
Beyond credentials lies talent, an immutable but real quality that creates differences in status even within expert communities. And beyond both lies a mindset, an acceptance of membership in a broader community of specialists devoted to ever-greater understanding of a particular subject. Experts agree to evaluation and correction by other experts. Every professional group and expert community has watchdogs, boards, accreditors, and certification authorities whose job is to police its own members and ensure that they are competent and live up to the standards of their own specialty.
Experts are often wrong, and the good ones among them are the first to admit it—because their own professional disciplines are based not on some ideal of perfect knowledge and competence but on a constant process of identifying errors and correcting them, which ultimately drives intellectual progress. Yet these days, members of the public search for expert errors and revel in finding them—not to improve understanding but rather to give themselves license to disregard all expert advice they don’t like.
Part of the problem is that some people think they’re experts when in fact they’re not. We’ve all been trapped at a party where one of the least informed people in the room holds court, confidently lecturing the other guests with a cascade of banalities and misinformation. This sort of experience isn’t just in your imagination. It’s real, and it’s called “the Dunning-Kruger effect,” after the research psychologists David Dunning and Justin Kruger. The essence of the effect is that the less skilled or competent you are, the more confident you are that you’re actually very good at what you do. The psychologists’ central finding: “Not only do [such people] reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it.”
To some extent, this is true of everybody, in the same way that few people are willing to accept that they have a lousy sense of humor or a grating personality. As it turns out, most people rate themselves higher than others would on a variety of skills. (Think of the writer Garrison Keillor’s fictional town of Lake Wobegon, where “all the children are above average.”) But less competent people overestimate themselves more than others do. As Dunning wrote in 2014,
A whole battery of studies . . . have confirmed that people who don’t know much about a given set of cognitive, technical, or social skills tend to grossly overestimate their prowess and performance, whether it’s grammar, emotional intelligence, logical reasoning, firearm care and safety, debating, or financial knowledge. College students who hand in exams that will earn them Ds and Fs tend to think their efforts will be worthy of far higher grades; low-performing chess players, bridge players, and medical students, and elderly people applying for a renewed driver’s license, similarly overestimate their competence by a long shot.
The reason turns out to be the absence of a quality called “metacognition,” the ability to step back and see your own cognitive processes in perspective. Good singers know when they’ve hit a sour note, good directors know when a scene in a play isn’t working, and intellectually self-aware people know when they’re out of their depth. Their less successful counterparts can’t tell—which can lead to a lot of bad music, boring drama, and maddening conversations. Worse, it’s very hard to educate or inform people who, when in doubt, just make stuff up. The least competent people turn out to be the ones least likely to realize they are wrong and others are right, the most likely to respond to their own ignorance by trying to fake it, and the least able to learn anything.
SURREALITY-BASED COMMUNITY
The problems for democracy posed by the least competent are serious. But even competent and highly intelligent people encounter problems in trying to comprehend complicated issues of public policy with which they are not professionally conversant. Most prominent of those problems is confirmation bias, the tendency to look for information that corroborates what we already believe. Scientists and researchers grapple with this all the time as a professional hazard, which is why, before presenting or publishing their work, they try to make sure their findings are robust and pass a reality check from qualified colleagues without a personal investment in the outcome of the project. This peer-review process is generally invisible to laypeople, however, because the checking and adjustments take place before the final product is released.
Outside the academy, in contrast, arguments and debates usually have no external review or accountability at all. Facts come and go as people find convenient at the moment, making arguments unfalsifiable and intellectual progress impossible. And unfortunately, because common sense is not enough to understand or judge plausible alternative policy options, the gap between informed specialists and uninformed laypeople often gets filled with crude simplifications or conspiracy theories.
Conspiracy theories are attractive to people who have a hard time making sense of a complicated world and little patience for boring, detailed explanations. They are also a way for people to give context and meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty of either an uncaring universe or an incomprehensible deity.
And just as individuals facing grief and confusion look for meaning where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them, as the Canadian writer Jonathan Kay has noted, become especially seductive “in any society that has suffered an epic, collectively felt trauma.” This is why they spiked in popularity after World War I, the Russian Revolution, the Kennedy assassination, the 9/11 attacks, and other major disasters—and are growing now in response to destabilizing contemporary trends, such as the economic and social dislocations of globalization and persistent terrorism.
At their worst, conspiracy theories can produce a moral panic in which innocent people get hurt. But even when they seem trivial, their prevalence undermines the sort of reasoned interpersonal discourse on which liberal democracy depends. Why? Because by definition, conspiracy theories are unfalsifiable: experts who contradict them demonstrate that they, too, are part of the conspiracy.
The addition of politics, finally, makes things even more complicated. Political beliefs among both laypeople and experts are subject to the same confirmation bias that plagues thinking about other issues. But misguided beliefs about politics and other subjective matters are even harder to shake, because political views are deeply rooted in a person’s self-image and most cherished beliefs. Put another way, what we believe says something important about how we see ourselves, making disconfirmation of such beliefs a wrenching process that our minds stubbornly resist.
As a result, unable to see their own biases, most people simply drive one another crazy arguing rather than accept answers that contradict what they already think about the subject—and shoot the messenger, to boot. A 2015 study by scholars at Ohio State University, for example, tested the reactions of liberals and conservatives to certain kinds of news stories and found that both groups tended to discount scientific theories that contradicted their worldviews. Even more disturbing, the study found that when exposed to scientific research that challenged their views, both liberals and conservatives reacted by doubting the science rather than themselves.