We Need Better Experts- and a Better Public

Experts are often wrong. Non-experts are even more often wrong.

These two statements are both true and important, but it is hard to keep both of them in mind at the same time. Experts and members of the public alike struggle with this, and each ‘side’ can err by being either too humble or too arrogant:

[Figure: ExpertFailureModes — experts and the public can each be too humble or too arrogant]

COVID has provided many examples of all four failures. The expert institutions that were supposed to handle this best, the WHO and CDC, kept being arrogantly wrong throughout the first quarter of 2020- saying that there was no human-to-human transmission, that travel bans were unnecessary, that masks don’t work, and, in the CDC’s case, producing a botched test kit while preventing the use of alternative tests. Throughout this time, much of the public, and even experts outside the institutions, humbly deferred.

At the same time, many people who had figured out what was really happening were too humble to speak out, or too scared, or simply didn’t think of it. In the second quarter of 2020, expert institutions gradually figured things out, but much of the public still arrogantly dismissed their advice.

There are a few reasons this isn’t easy to get right.

Generalizability- Sure, in early 2020 you would have been better off getting COVID information from the Bay Area technologists I follow on Twitter than from the WHO and CDC- but is that generally the best way to get health advice? How many people are capable of listening to both groups and evaluating their specific arguments to figure out who is right?

Bias- Experts really are biased, but it isn’t always obvious in which directions this matters, and members of the public have their own biases. For instance, are the public health agencies overstating the risk of COVID in order to get more funding and power, or understating it so as not to embarrass the governments that fund them? Both sound plausible in the abstract, and you could say either in order to justify your own biases. In this case I do think they understated risks due to political pressure, but it’s also possible they simply weren’t as smart as we, or they, thought they were.

Expert, compared to whom?- For quasi-experts, it’s important to keep in mind what your audience knows and where their information will come from if not from you. Many economists get annoyed and don’t answer when regular people ask them about the stock market- “I’m not an expert, that’s not what economics is really about”- but I think this is a bad response. We’re not true experts, but we should know much more than the average person, and be glad for the opportunity to share the basics (save through your 401(k) and put it in low-fee diversified index funds) and point them to the true experts. On the other hand, if you think your quasi-expertise means you can beat the market day-trading, you’re likely to have a bad time.
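The “basics” here rest on simple compounding arithmetic, and it’s worth seeing why the low-fee part matters. A toy sketch (all the dollar amounts and return figures below are assumptions for illustration, not advice):

```python
# Toy illustration of fee drag on compounded retirement savings.
# Assumed numbers: $6,000/year contributions, 30 years, 7% gross return,
# comparing a ~0.05% expense-ratio index fund to a ~1% active fund.

def final_balance(annual_contribution, years, gross_return, expense_ratio):
    """Compound yearly contributions at the net-of-fees return."""
    net = gross_return - expense_ratio
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_contribution) * (1 + net)
    return balance

low_fee = final_balance(6000, 30, 0.07, 0.0005)
high_fee = final_balance(6000, 30, 0.07, 0.01)

print(f"low-fee fund:  ${low_fee:,.0f}")
print(f"high-fee fund: ${high_fee:,.0f}")
print(f"fee drag:      ${low_fee - high_fee:,.0f}")
```

A roughly one-percentage-point fee difference, compounded over decades, ends up costing a large fraction of a year’s final balance- which is exactly the kind of basic point a quasi-expert can usefully pass along.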

How can members of the public know who the real relevant experts are? This is often far from obvious. Should I trust epidemiologists at the CDC more or less than those at universities? When evaluating potential treatments, should I most trust virologists, epidemiologists, medical doctors, or someone else? If I want to know how COVID will affect the economy, should I ask epidemiologists, economists, or someone else? If economists, who, or which subfield? For forecasting COVID cases, is the relevant expertise domain knowledge like epidemiology, or is it general forecasting ability?

Getting experts to be less biased and to have the appropriate level of confidence in their abilities and predictions is vital. So is getting members of the public to know who the relevant experts are, and how much to defer to their judgement in various situations- yet none of this is explicitly taught.

A parting thought, paraphrasing Garett Jones’ twist on an old William Buckley quip: “I’d rather be governed by the first 2,000 names in the Boston phone book than by the faculty of Harvard, but I’d rather be governed by the faculty of MIT than either.”

Related reading:

Inadequate Equilibria, by Eliezer Yudkowsky, is all about this issue: “the single most obvious notion that correct contrarians grasp, and that people who have vastly overestimated their own competence don’t realize: It takes far less work to identify the correct expert in a pre-existing dispute between experts, than to make an original contribution to any field that is remotely healthy.”

“inside a civilization that is often tremendously broken on a systemic level, finding a contrarian expert seeming to shine against an untrustworthy background is nowhere remotely near as difficult as becoming that expert yourself. It’s the difference between picking which of four runners is most likely to win a fifty-kilometer race, and winning a fifty-kilometer race yourself. Distinguishing a correct contrarian isn’t easy in absolute terms. You are still trying to be better than the mainstream in deciding who to trust. For many people, yes, an attempt to identify contrarian experts ends with them trusting faith healers over traditional medicine. But it’s still in the range of things that amateurs can do with a reasonable effort, if they’ve picked up on unusually good epistemology from one source or another.”

The Myth of the Rational Voter, by Bryan Caplan: Argues that the public is rationally ignorant of issues (like public policy and politics) where more knowledge would not personally benefit them. But it also shows that experts have systematically different political views than the general public, and showcases a statistical method to “correct” for this and estimate what economists would think on various policy issues if their income level and general political views matched those of a typical person.
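That kind of “correction” can be sketched as a regression: predict each respondent’s policy view from expertise and demographics, then evaluate the fitted model at expert-level knowledge but a typical person’s demographics. A toy version with entirely synthetic data (the coefficients, sample, and “view” scale are all invented; this is not Caplan’s actual specification):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic survey: economists are richer on average, and both
# economics training and income independently shift the "policy view" score.
is_economist = rng.integers(0, 2, n)
income = rng.normal(50 + 30 * is_economist, 10, n)
view = 2.0 * is_economist + 0.03 * income + rng.normal(0, 1, n)

# Fit view ~ b0 + b1 * is_economist + b2 * income by least squares.
X = np.column_stack([np.ones(n), is_economist, income])
b, *_ = np.linalg.lstsq(X, view, rcond=None)

# "Corrected" view: economist-level knowledge at a typical non-economist income.
typical_income = np.median(income[is_economist == 0])
corrected = b[0] + b[1] * 1 + b[2] * typical_income
raw_public = view[is_economist == 0].mean()

print(f"raw public view: {raw_public:.2f}")
print(f"corrected view:  {corrected:.2f}")
```

The point of the exercise: if the corrected view still differs from the raw public view, the gap is attributable to expertise rather than to experts’ different incomes or demographics.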
