The Tyranny of (Alleged) Experts

Think again if you believe draconian controls recommended by a few (but far from all) medical experts are saving many lives from COVID-19. Facts reported by mathematician Yitzhak Ben Israel of Tel Aviv University don't support such beliefs.

Professor Ben Israel found that no matter how much or how little politicians quarantined the population, "coronavirus peaked and subsided in the exact same way." Whether a country relied on politicians shutting it down (the US and UK, for example) or on private voluntary actions (Sweden), Prof. Ben Israel's work shows that "all countries experienced seemingly identical coronavirus infection patterns, with the number of infected peaking in the sixth week and rapidly subsiding by the eighth week."

In short, coercive measures imposed to protect the public from COVID-19 are as effective as throwing magical "tiger dust" in Central Park to keep tigers at bay in Manhattan.

Why are we so enamored with experts and their tiger dust? Simply put, we don't understand the inherent fallibility of human beings. Well-meaning experts can be as destructive as authoritarian politicians.

Nobel laureate Daniel Kahneman is a behavioral economist and psychologist. In his book Thinking, Fast and Slow, he writes, "Every policy question involves assumptions about human nature, in particular about the choices that people may make and the consequences of their choices for themselves and for society." Mistaken or unexamined assumptions corrupt decision-making.

Experts Have Cognitive Biases

In Thinking, Fast and Slow, Kahneman catalogs the many cognitive biases impairing human beings. Kahneman and his late research partner Amos Tversky "documented systematic errors in the thinking of normal people, and [they] traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion." In short, "severe and systematic errors" in cognition prevent us from being the rational thinkers we'd like to think we are.

Reporting on the work of psychologist Paul Slovic, Kahneman writes, "[Slovic] probably knows more about the peculiarities of human judgment of risk than any other individual."

Slovic found "an affect heuristic" leads people to "let their likes and dislikes determine their beliefs about the world." If you are biased towards strong government action to combat the coronavirus, you will believe the "benefits are substantial and its costs more manageable than the costs of alternatives," such as relying more on voluntary adjustments by businesses and the public.

Kahneman writes, "The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?)."

Media coverage of the impact of the coronavirus is "biased towards novelty and poignancy." The resulting emotional reaction shapes our estimates of risk; many fear the consequences if draconian actions are not imposed by government.

If you're thinking that's why we need to entrust these decisions to experts, you would be wrong: experts have the same cognitive biases as the rest of us.

Because of expert bias, Slovic "strongly resists the view that the experts should rule" and dissuades us from believing "their opinions should be accepted without question when they conflict with the opinions and wishes of other citizens."

Kahneman is clear about the implications of Slovic's work. The public and expert policy-makers often have conflicting values. The public distinguishes between types of deaths, such as that of a 90-year-old man with a heart condition and that of a 30-year-old mother in good health. Such nuances are lost in aggregate statistics.

Kahneman concludes, "Slovic argues from such observations that the public has a richer conception of risks than the experts do."

Risk is not objective. Kahneman writes, "In his desire to wrest sole control of risk policy from experts, Slovic has challenged the foundation of their expertise." Kahneman quotes Slovic:

"Risk" does not exist "out there," independent of our minds and culture, waiting to be measured. Human beings have invented the concept of "risk" to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as "real risk" or "objective risk."

Our evaluation of risk "may have been guided by a preference for one outcome or another." "Defining risk is thus an exercise in power," explains Slovic.

In other words, no matter how sincere the experts are, their preference for decisive action on the part of the government will warp their conception of risk and guide their policy recommendations. 

Don't be fooled by the appearance of confidence on the part of the experts. Their confidence is no reason to trust them. Kahneman warns: "Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion."

The Arrogant Can't Solve Problems

"Our ignorance is sobering and boundless," philosopher Karl Popper famously observed.

The world is full of challenging problems, such as the coronavirus, and individuals have boundless ignorance. It is not surprising that Popper believed, "There are no ultimate sources of knowledge." We can only "hope to detect and eliminate error." We detect and eliminate error by allowing criticism of others' theories and of our own.

Popper provided us with what could be a credo for humble individuals willing to admit the limits of individual knowledge:

With each step forward, with each problem which we solve, we not only discover new and unsolved problems, but we also discover that where we believed that we were standing on firm and safe ground, all things are, in truth, insecure and in a state of flux.

The arrogant canโ€™t solve problems because they are blind to their limits. Decision-makers, consumed by hubris, ignorant of their limited understanding, do not tap into knowledge held by others.  

In truth, each of us knows very little. As individuals, we are fallible decision-makers. We don't know where solutions will emerge or, as economist William Easterly instructs us in his book The Tyranny of Experts, "what will be the solution" or "who will have the solution."

Cognitive Diversity

In his book The Wisdom of Crowds, James Surowiecki explains "there's no real evidence that one can become expert in something as broad as 'decision-making' or 'policy.'"

Among scientists who face a "torrent of information" each day, Surowiecki points out, "reverence for the well-known tends to be accompanied by a disdain for the not so well known."

Surowiecki is not arguing "that reputation should be irrelevant," yet reputation "should not become the basis of a scientific hierarchy." Instead, a "resolute commitment to meritocracy" fuels discovery. In the current crisis, where dissenting voices are being shut out, it is hard to see how commitment to meritocracy is being upheld.

Surowiecki points to "cognitive diversity" as a key to forming teams that are more than the sum of their members. His conclusions are counterintuitive for those who believe in decision-making by elite experts:

If you can assemble a diverse group of people who possess varying degrees of knowledge and insight, you're better off entrusting it with major decisions rather than leaving them in the hands of one or two people, no matter how smart these people are.
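
Surowiecki's claim can even be illustrated numerically. Here is a minimal sketch in Python (my own illustration, not code or figures from the book; every parameter is an assumption chosen for the example): a group of individually noisy estimators, averaged together, outperforms a single, better-calibrated expert.

```python
import random
import statistics

# Hypothetical illustration of the "wisdom of crowds" effect: many
# independently erring estimators, averaged, can beat one expert who
# is far more accurate individually. All parameters are assumptions.

random.seed(42)

TRUE_VALUE = 100.0
TRIALS = 10_000
GROUP_SIZE = 25
GROUP_ERROR_SD = 20.0   # each group member is individually quite noisy
EXPERT_ERROR_SD = 8.0   # the lone expert is much more accurate per person

group_errors, expert_errors = [], []

for _ in range(TRIALS):
    # Diverse group: independent errors scattered around the truth.
    estimates = [random.gauss(TRUE_VALUE, GROUP_ERROR_SD)
                 for _ in range(GROUP_SIZE)]
    group_errors.append(abs(statistics.mean(estimates) - TRUE_VALUE))

    # Single expert: one draw with a much smaller individual error.
    expert_errors.append(abs(random.gauss(TRUE_VALUE, EXPERT_ERROR_SD)
                             - TRUE_VALUE))

print(f"group average error: {statistics.mean(group_errors):.2f}")
print(f"single expert error: {statistics.mean(expert_errors):.2f}")
# Independent errors average out: the group's error scales roughly like
# GROUP_ERROR_SD / sqrt(GROUP_SIZE) = 20 / 5 = 4, beating the expert's 8.
```

The catch, and it is exactly Surowiecki's point about cognitive diversity, is that the individual errors must be independent; a group that shares one bias cannot average it away.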

Surowiecki warns of the groupthink that occurs when, in groups, we emphasize "consensus over dissent."

Examining decision-making at the National Aeronautics and Space Administration (NASA) during the space shuttle Columbia catastrophe, Surowiecki found an object lesson: "Instead of making people wiser, being in a group can actually make them dumber."

In that crisis, "team members were urged on many different occasions to collect the information they needed to make a reasonable estimate of the shuttle's safety."

The evidence of the debris strike's potential consequences for Columbia was there.

Yet, during the group meeting after debris struck Columbia, there was "the utter absence of debate and minority opinions."

The team could have made "different choices that could have dramatically improved the chances of the crew surviving," yet they "never came close to making the right decision on what to do about the Columbia."

Why? "First, the team started not with an open mind but from the assumption that the question of whether a foam strike could seriously damage the shuttle had already been answered." Surowiecki explains:

Rather than begin with the evidence and work toward a conclusion, the team members worked in the opposite direction. More egregiously, their skepticism about the possibility that something might really be wrong made them dismiss the need to gather more information.

Today, during the coronavirus crisis, decision-makers have begun with the conclusion that the human consequences of shutting down the economy matter less than their preferred methods of containing the virus.

In the case of Columbia, the "conviction that nothing was wrong limited discussion and made them discount evidence to the contrary." All of us, including experts, are subject to confirmation bias, "which causes decision-makers to unconsciously seek those bits of information that confirm their underlying intuitions."

In policy-making teams, Surowiecki writes, "the evidence suggests that the order in which people speak has a profound effect on the course of a discussion." He adds, "Earlier comments are more influential, and they tend to provide a framework within which the discussion occurs. As in an information cascade, once that framework is in place, it's difficult for a dissenter to break it down."
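
The cascade dynamic Surowiecki describes can be sketched in a toy model. The sketch below is my own construction, loosely patterned on the classic information-cascade setup of Bikhchandani, Hirshleifer, and Welch rather than anything in Surowiecki's book: each speaker holds a private signal that is usually correct, but defers to the visible votes of earlier speakers once those votes clearly outweigh the signal.

```python
import random

# Toy information cascade; the specific rules are illustrative
# assumptions. The true answer is 1. Each speaker's private signal is
# right with probability 0.7, but speakers talk in sequence and can see
# all earlier public statements. A speaker follows the earlier majority
# whenever it leads by two or more; otherwise they voice their signal.

random.seed(7)

def run_discussion(n_speakers=20, signal_accuracy=0.7):
    statements = []
    for _ in range(n_speakers):
        signal = 1 if random.random() < signal_accuracy else 0
        yes = sum(statements)
        lead = yes - (len(statements) - yes)   # "yes" votes minus "no"
        if lead >= 2:        # early "yes" voices set the frame
            statements.append(1)
        elif lead <= -2:     # early "no" voices set the frame
            statements.append(0)
        else:                # no frame yet: speak your own signal
            statements.append(signal)
    return statements

trials = 10_000
wrong = sum(
    1 for _ in range(trials)
    if sum(run_discussion()) < 10   # most of the 20 speakers said "no"
)
print(f"discussions settling on the wrong answer: {wrong / trials:.1%}")
```

Under these assumptions, roughly one discussion in six locks onto the wrong answer even though every private signal is 70 percent reliable: once the first two speakers agree, everyone after them echoes the frame. That is why Surowiecki treats speaking order and dissent as structural problems rather than personal failings.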

Dr. Anthony Fauci, Dr. Deborah Birx, and others recommending policy are fallible human beings. Like us, these experts are not readily able to see what they do not know. Kahneman and Slovic would tell us they are subject to the same cognitive biases as other human beings. They and other members of their team are not exempt from human frailties, such as the desire for power. They tend to be overconfident. Their expertise is likely "spectacularly narrow."

Surowiecki asks us to consider why we "cling so tightly" to the false belief "that the right expert will save us." Why do we believe that the right experts somehow charge forward and demonstrate their superior expertise? Surowiecki points us to a series of studies that found "experts' judgments to be neither consistent with the judgments of other experts in the field nor internally consistent."

When everyone looks in only one direction, groupthink is the result. Surowiecki cautions that groupthink "works not so much by censoring dissent as by making dissent seem somehow improbable."

There is an alternative to politicians and experts deciding when to reopen the country. Decisions made in the market, by businesses and individuals, are naturally cognitively diverse. Provided with uncensored information and free to use their knowledge and wisdom, people will make better decisions than the experts and politicians.


