Of Course We Should Trust Experts
But who are the experts?
You Trust Experts
In January, The Happy Wanderer published an excellent piece about the problem with listening to experts. I will comment on that piece here and reframe the debate around experts.
I’ll start with the basic question: should we listen to experts? Yes, duh. In fact, almost all our beliefs rely on trusting experts. Although The Happy Wanderer brought nuance and curiosity to the discourse, I can’t say the same for most discussions on this topic. I might earn more subscribers if I embraced the sophomoric “don’t believe the experts” posture, though I’d need to sacrifice any philosophical integrity to do so. Popular critiques often follow the trend I discussed in my article about lazy arguments. Smart, thoughtful people highlight a meaningful critique (e.g. “correlation isn’t causation”) in a small number of contexts. Less smart, less thoughtful people then misunderstand the critique and repeat it mindlessly.
Let me run through some examples where most of us trust the experts. Since I’ve shoe-horned this into every conversation in the past three months, I’ll start with my volunteering at the local animal shelter. When teaching dogs to properly walk on a leash, I learned to use my legs rather than my arms. If a dog pulls, we provide a verbal cue, “uh-uh,” and walk backward. This contrasts with the common tactic of pulling on the leash, which just encourages the dog to continue pulling. I also learned how to approach shy and anxious dogs. I try to keep small (as in, move forward on my knees rather than my feet) and greet them from the side. I try to encourage eye contact, as this makes the dogs more likely to be adopted. The best way to teach a behavior, by the way, is to reward the dogs with treats. Finally, I memorized how to interpret body-language cues, keep myself safe if a dog shows aggression, and encourage a dog that seems uninterested in walking.
How did I gain this information? Did I run my own randomized controlled trials on dogs? Did I teach myself neurology and study the canine brain? No. A couple of people at the shelter told me this information, handed me a packet with further details, and sent some links to YouTube videos that visualized a couple of concepts. I thought “seems legit” and accepted it. In other words, I trusted the experts.
Take an even more trivial example. In my last article, I argued that Astro Knights and Nusfjord are easier to set up and explain than the games that inspired them. I imagine that most readers accepted this. No one demanded raw data showing a statistically significant difference between the setup times that controlled for exogenous factors. This even applies to my articles about linguistics. I’ve received a positive reception from my last two pieces about the history of English. Why, though, did you believe any of the information in those articles? Simply put, you trusted the expert. Yes, I cited my sources, but Substack tracks the number of clicks on each link. Despite my articles earning hundreds of views, I can always count the number of citation clicks on one hand, and, more often than not, on one finger.
We can go deeper: where did I get the information from? I haven’t fact-checked John McWhorter, David Crystal, or Guy Deutscher. I just think they’re experts, so I trust them. Heck, a lot of that information comes from Wiktionary and Etymonline. These sources don’t even pretend to boast a rigorous academic pedigree. I believed them anyway, and, if you liked those articles, it seems that you did as well.
We can go on and on here. Do you like history? Maybe you studied a primary source, but you’re probably just trusting the experts you’ve read or heard in books and videos. How did you learn math? Did you prove the theorems yourself from first principles? No, you trusted experts. What about a foreign language? Experts. You ever talk to someone online and ask how the weather is in their city? Congrats, you just trusted an expert. Do you ever ask someone for restaurant recommendations? Expert. Experts experts experts.
Of course, we sometimes ignore experts. The obvious case for ignoring experts occurs when they contradict each other. If one of my commenters stated that Astro Knights took much longer to set up than Aeon’s End, you’d have to distrust at least one of us. More often, though, the expert we trust is ourselves. If the animal shelter leaders taught me to train dogs by reading poetry to the animals, I would dismiss the advice. That’s because, in some sense, I have my own expertise in dogs. I’ve interacted with enough dogs in my life to know that, if we’re being perfectly honest, these guys aren’t terribly cultured.
The second reason to reject experts is when we have reason to believe that they’re acting in bad faith. I have no doubt that O.J. Simpson knows more about the death of his wife than anyone. I just think he has a bit of a reason to lie about the matter. I’ve also run into this issue when trying to find the number of combat deaths in the Russia-Ukraine war. The Russian and Ukrainian militaries probably have better information than anyone else, but they have even better reasons to misrepresent the facts.
The third and most interesting reason isn’t, in some sense, a rejection of the experts at all. Rather, it’s a skepticism that the self-proclaimed experts actually are experts. Earlier in the article, I denounced lazy rejections like “correlation isn’t causation.” However, I want to avoid committing the same sort of sophomoric philosophy here. I think I’ve made a compelling case against a strong “we shouldn’t listen to experts” posture, but I need to address the more nuanced version: “we shouldn’t trust the credentialed.” Here, the discussion gets more fruitful.
The Cynical Perspective
Credentials don’t guarantee expertise. Political scientist Richard Hanania made this case in the context of the Afghanistan War.
The American-led coalition had countless experts with backgrounds pertaining to every part of the mission on their side: people who had done their dissertations on topics like state building, terrorism, military-civilian relations, and gender in the military….
Meanwhile, the Taliban did not have a Western PhD among them. Their leadership was highly selected though. As Ahmed Rashid notes in his book The Taliban, in February 1999, the school that provided the leadership for the movement “had a staggering 15,000 applicants for some 400 new places making it the most popular madrassa in northern Pakistan.” Yet they certainly didn’t publish in or read the top political science journals.
The Taliban, Hanania argues, chose their leadership based on a demonstrable skill set. On our side, meanwhile, the people in charge couldn’t produce much empirical evidence of their mettle. Hanania extends this critique to the rest of academia. You’re probably familiar with the replication crisis, where ridiculous findings (like viewing Rodin’s The Thinker turning people into atheists) didn’t stand up to repeated scrutiny. If you think of psychology professors as “experts in psychology,” this doesn’t make much sense. However, maybe these individuals honed a different skill set. These professors might have gained expertise in how to publish interesting, eye-catching findings in prestigious journals. That’s a difficult skill to master (I’m sure I couldn’t do it), but it’s not the same as being an expert in the human psyche. After all, there’s no God of Human Psychology descending from Mount Olympus to congratulate people for their accurate discoveries. The pay and prestige come from publishing, so that’s the skill that professors need.
Where I differ from Hanania, I imagine, is that I don’t have much faith in the market or private sector to fix these issues. Your company’s SVP of Strategic Operations could be one of the world’s leading experts in strategic operations. More likely, though, he’s an expert in politicking himself up the ranks of an organization. One common piece of career advice is that visibility and attitude matter more than productivity. I can’t say whether a tree falling in an empty forest makes a sound, but I do know that if you produce a ton of valuable work and never present it to the relevant executives, it absolutely does not make a sound. There’s less overlap between “things that make the company money” and “things that get people promoted” than one might expect. The people who reach the upper echelon, thus, have often honed a particular talent for gaining promotions, rather than a talent that matches their literal job descriptions.
That’s just promotions. When it comes to getting a job, one can find no greater cliche than “it’s not what you know, it’s who you know.” Even if you have the connections, interview skills often misalign with job skills. For example, software developers practice brain-teaser problems from places like LeetCode. I know plenty of coders, and they’ve often told me that these problems bear little resemblance to their actual workflow.
That said, I want to avoid the temptation of adolescent cynicism. Yes, we see a lot of institutions and individuals that deserve some skepticism. Since I’ve opened my mind a little bit, though, I’ve found that most people have something to offer on the intellectual front. I’m never doing a power pose, but I imagine that even the professors behind that research could offer some meaningful insight about psychology. There is a mismatch between programming interviews and job skills, but most of the people who land these jobs seem pretty competent. Everybody knows something; it just might not align with their official credentials. Unfortunately, this opens an even bigger problem for the notion of expertise.
The Human Perspective
I remember visiting the Van Gogh Exhibit, which surrounds viewers with dynamic versions of his paintings and accompanying music. It was… cool, I guess. But one small part struck me a bit harder than the rest. On a seemingly unrelated note, I found the soundtrack of Paul Thomas Anderson’s Phantom Thread deeply chilling and moving. I don’t usually remember movie soundtracks, so this one must have hit a bit deeper to stand out all these years later. In each case, I stayed through the credits. The Van Gogh Exhibit listed one of the musicians as Thom Yorke, while Jonny Greenwood scored Phantom Thread. That probably explains my reaction: they’re both members of Radiohead.
When I first discovered Radiohead in my teens, I became obsessed. I needed more, so I bombarded Google with “bands like Radiohead” dozens of times. Much to the chagrin of my teenage self, and, to be honest, my adult self, there aren’t any. Shoegaze? Psychedelic? Math rock? Post-rock? They’re all pale imitations with the edges sawn off. I never found any band that elicits the same feelings that Radiohead does, and I’ve accepted that I never will.
One might consider Yorke and Greenwood to be “experts in music.” In a role-playing game, the members of Radiohead might receive “10 out of 10 dots” in music or “+5 on all dice-related rolls.” That’s not quite true, though, is it? I’m not sure I could fill an inspiring workout playlist with Radiohead songs, and I don’t think Greenwood should score the next Marvel adventure. Sure, they have some forays into more popular music, but they’re not great. They wrote “Exit Music (For a Film)” for Romeo + Juliet, and it’s… ok. “Optimistic” and “Nude” aimed for a more popular flair, and they’re pretty forgettable. Yes, they’re good at “music,” but they’re really just good at… “Radiohead music,” as difficult as that is to define.
The Radiohead example illustrates a more general idea: most expertise is specific. From what I’ve heard, professors of biology, physics, and chemistry specialize in a sub-field (if not a sub-sub-sub-field) and don’t keep up with the latest research in the rest of them. I can speak with more confidence in my own field: economics. Professors of microeconomics, macroeconomics, and econometrics usually stay in their lanes, outside the cases when their research requires knowledge of the other sub-fields. As a result, we don’t see many people who can proclaim expertise in biology or economics as a whole.
Other times, people’s expertise seems to defy a single category. Consider a man who earned a Ph.D. in biochemistry, became a tenured professor in the subject, and published several popular science non-fiction books. Most of us would consider such a person to be an expert in biochemistry. Yet, chances are, if you’ve heard of Isaac Asimov, you don’t know much about his biochemistry career. Noam Chomsky is known as the founder of modern linguistics and a left-wing critic of US foreign policy. Linus Pauling earned Nobel Prizes in both Chemistry and Peace. It’s hard to reduce any of these men to a single field of expertise.
I can apply this to myself. I’ve worked as a data analyst for several years, but I don’t consider myself an expert in “data.” I’ve seen those lists of “15 things you need to know to become a data analyst.” I don’t know half of them, and neither do the best analysts at your company. I can’t remember much about linear algebra, and I’ve never understood many of the details of the more complicated models. No, no, no, keep that “imposter syndrome” crap out of my face. I am good at something; it’s just not the sort of something that leads to a Google Cloud Certification. What I excel at is, well, this: keeping things simple and getting the basics right. Consider the following example of my “expertise.”
At a board game convention, a publisher was demoing one of their games to four players. One player did not understand the game after a couple of explanations. I jumped in and re-framed the game’s concept, after which he figured out what was going on. This left an impression on another player, as he asked me to help re-write the rulebook for his prototype. I did, and I think I did a damn good job at it.
I keep this at the front of my mind when interviewing for roles. My sales pitch is that, if the business picks its top five needs, it can find 20 other people in that resume pile who could fix one of them much better than I ever could. If you need someone who can manage the ins and outs of Collaborative Filtering, don’t waste your time talking to me. Meanwhile, I can put together some B-minus work on all five items and present that B-minus work in a comprehensible fashion. I’m betting that my five B-minuses will provide more value than everyone else’s mixtures of A-pluses and “Did not completes.” That’s the Klaus promise.
I imagine that all my readers would offer a similar narrative about their own expertise, and that’s my frustration with credentialism. I see too many firms looking for someone who performed pretty much the same role in a different company. The skills that make someone a quality Customer Success Analyst at their old job probably have little to do with some narrow expertise in Customer Success analytics. It’s tough to nail down an individual’s precise areas of knowledge and wisdom. I understand the temptation to reduce this arena to measurables like credentials, titles, experience, and education. We all need maps, I get it. We just need to remember that the map isn’t the territory, and we definitely shouldn’t produce fake maps for the areas we struggle to chart.