Based on my life experience so far, it seems evident to me that there is very little epistemic justification for trusting a thought-provoking viewpoint or piece of advice purely because an expert stated it. Of course, expert opinions are valuable; I'm simply saying that if you hear a bold claim that you have very little background knowledge to reason about, it is a bad idea to be swayed by expert status alone.
Context matters; most math experts can probably be trusted to give accurate facts about elementary school math. The intention of this post, therefore, is to highlight overconfidence in expert opinion in the contexts where that confidence is most likely to be misplaced. For example, life advice, predictions about the future, and certain subject areas such as the soft sciences tend to be areas where experts frequently express wrong conclusions. Not only do experts come to wrong conclusions, but you are also disproportionately likely to be hearing from a mistaken expert rather than a sound one, relative to their share of the expert population. Perhaps this is a more accurate version of the post title: in your lifetime you will hear many wrong expert opinions (even in areas where wrong expert opinions are a minority).
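To make that selection effect concrete, here is a minimal sketch in Python with entirely made-up numbers: both `p_mistaken` and `visibility_boost` are illustrative assumptions, not measurements. It only shows how a small minority of mistaken experts can dominate what you actually hear once visibility is skewed toward them.

```python
# Toy selection-effect model with purely hypothetical numbers.

p_mistaken = 0.10      # assumption: 10% of experts are mistaken
visibility_boost = 15  # assumption: mistaken experts get 15x the airtime
                       # (controversy and overconfidence attract platforms)

# Relative volume of opinions you actually hear from each group
heard_mistaken = p_mistaken * visibility_boost
heard_sound = (1 - p_mistaken) * 1.0

share_mistaken = heard_mistaken / (heard_mistaken + heard_sound)
print(f"Share of heard expert opinions that are mistaken: {share_mistaken:.0%}")
# -> 62%: a 10% minority supplies the majority of what you hear
```

Under these toy numbers, roughly 62% of the expert opinions you encounter come from the mistaken 10%. The point is only that visibility skew, not the base rate, largely determines what reaches you.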
A practical demonstration that wrong expert opinions are common: Google any binary question that has significant political, cultural, financial, or emotional implications, and you can almost always find at least one expert supporting each side, even where there is an overwhelming scientific consensus. E.g. “Are human activities the primary driver of global warming?”
Why we shouldn’t expect experts to come to correct conclusions
I find it odd to have to argue this point. Experts make mistakes, as all humans do (and are often incentivized to say things that aren't completely true), so there is no inherent logical reason to believe that a population of experts must exceed any particular threshold of reliability that we would consider high. I argue that this holds for any colloquially sensible definition of expert: any pool of experts, under any sensible criteria, is subject to general human fallibility, and the assumption many people hold that experts are trustworthy has very little causal basis.
However, instead of just claiming an absence of evidence for expert reliability, I'll highlight some key reasons why the ways in which experts differ from non-experts sometimes do not increase reliability at all.
Examples of experts who are frequently wrong about multiple ideas within their public communications
Richard Dawkins, Michio Kaku, Neil deGrasse Tyson, David Buss, Jordan Peterson, and Andrew Huberman are famous influencers who talk about science and are respected in the public eye, while their reputations among fellow academics in their fields are far more suspect. One illustration comes from a study that randomly sampled scientists in the UK: “Although the researchers did not ask questions about Dawkins, 48 scientists mentioned him during in-depth interviews without prompting, and nearly 80 percent of those scientists believe that he misrepresents science and scientists in his books and public engagements.”
Can my thesis be disputed by claiming that these academics are not “true experts”? I think the answer is no, and I have three counterarguments against denying their expert status:
If we deny their expert status on the basis that they express wrong ideas, that implies we need absolute knowledge of which ideas are wrong, which no one has; “experts are people who are right most of the time” rests on an unknowable criterion, which renders it arbitrarily subjective in the eye of the beholder. If we instead exclude on the basis of expressing ideas that are “known” to be disproven, this still does not resolve the subjectivity issue, and we are no longer talking about experts: we're talking about experts who don't believe in star signs AND flat earth AND some other arbitrarily and subjectively chosen list of filters. Besides, people can be experts in pseudoscientific fields, and I propose that any attempted definition which excludes that possibility would have even less defensible implications.
Another issue is that if none of these people are experts, then we inevitably end up excluding legitimate experts using the same disqualifying factors. For example, it may be completely reasonable for an epigenetics or nutritional science expert to reach a lot of wrong conclusions: because those fields are so tentative and fast-evolving, in 20 years we might look back on their earlier work and conclude that all of the novel theories they believed were wrong, yet they had valuable reasons for proposing those hypotheses.
My last counterargument is that, on a functional level, expertise can be recognized but is much harder to “un-recognize”. Think of someone who you believe does qualify as an expert; now imagine that they start misrepresenting science for personal financial gain, or develop new ideas that are clearly wrong while remaining reliable in their earlier areas of knowledge. This happens to real people, and unless their memory and fundamental skills get wiped, they already met the bar for expertise; adding bad opinions on top does not “cancel” their existing expertise. At best, you can call them a “bad” or “irrelevant” expert, but an expert nonetheless.
Expertise from knowledge, education, and experience
A possible definition of an expert from ChatGPT: Someone who has comprehensive and authoritative knowledge or skill in a particular area, usually gained through significant experience, education, and practice. Expertise is characterized by deep understanding, the ability to apply knowledge in complex or novel situations, and the capacity to solve problems that others cannot.
Some people are experts on the basis of having a ton of knowledge, education, and experience in their topic. I argue that one can satisfy all three criteria and still be wildly misled in their professional opinions; I'd be surprised if you haven't encountered experts like that. My case here is that all human beings are susceptible to cognitive biases, and if you take a person with strong prejudices and inject them with decades of knowledge, education, and experience, there is no universal systematic process that would reliably force them to shed their crude biases. Such processes 1) mostly don't exist, 2) aren't very effective where they do exist, and 3) can easily be avoided or ignored throughout one's professional career.
Smart/educated people do NOT naturally reach better conclusions
When people say I must be smart because I did a PhD, I often reply that the reason people don't do PhDs isn't that they're not smart enough; it's that you have to commit 4+ years of your life to one. And even if all PhD graduates were highly intelligent, the current body of research does not support the idea that intelligence protects against cognitive biases, or even that deliberately trying to be mindful of cognitive biases reduces them. What about peer review and the PhD examination process? These are essentially the only systematic checks in place, and they only weed out the worst of the worst. When you go through them, the reviewers ultimately want to be able to accept your work. Even when your reviewers are experts in the field, they will often lack the competence or motivation to assess your work thoroughly, because an expert in one subtopic of a large field inevitably has far more that they don't know about the field than that they do know; two experts can easily have very little overlapping knowledge or competence. Similarly, if you made statistical errors, or simply made up data in your research, who is going to check and find out? Not every academic expert has passable competence in statistics, let alone in the specific statistical framework you used, and even those who do mostly cannot be bothered to check.
In summary, it can and does happen that ordinary people with preconceived, flawed beliefs pass through decades of knowledge, education, and experience, attaining expertise while holding onto those beliefs, even in the face of opposing material. This is just something that you and I and all humans do to some extent: protect our beliefs from opposing views.
Being an expert doesn’t require critical thinking
Another angle worth mentioning is that there are different types of expertise. Some experts are technically competent only at regurgitating information, and then try to use those facts to argue for an idea even when the facts don't support it. Other experts have shaky knowledge yet can solve complex problems that most academics can't. Some experts literally do science in reverse: they start from the world view they want to support and then gather evidence for it while discarding anything contrary. In contexts where no one has definitive answers (e.g. nutritional science, evolutionary psychology, neuroscience), this means no one can disprove these experts even when their claims are completely made up with no supporting evidence.
When we think of traditional experts, we may picture them coming from reputable institutions. But here lie two pitfalls. First, institutional knowledge is often just a set of beliefs that were passed on systematically, regardless of whether those beliefs have current supporting evidence; this is rampant in medicine and education, and this post provides some captivating examples: Of Experts and Echo Chambers. Second, the bar for being an expert is often just knowledge, not correct knowledge, and not epistemically sound ways of interpreting knowledge or forming conclusions and predictions from it. There aren't really any social standards for the latter; epistemic awareness is unpopular.
Epistemically aware experts
I think plenty of positive examples of experts exist, but such people are naturally less visible than experts who are influencers making a name for themselves (where being controversial or overconfident is often rewarded). I have one example noted somewhere that I cannot find (it was a podcast episode where an expert was being interviewed about their subject), and I'd be interested to hear your examples. One litmus test for publicly visible experts is how many subjects they claim to have prescriptive knowledge of. For example, when someone spends 4 years working on a PhD topic, what they gain is an epistemically comprehensive understanding of an extremely narrow topic, plus a wider awareness of closely related topics (mostly “knowing of” related research) held with relatively low epistemic confidence. Within their extremely narrow topic, they can be forgiven for making absolute claims; but ask a seemingly related yet distinct question, and suddenly the epistemically aware response resembles: “I know of that topic and the opinions that exist on it, but since I haven't personally surveyed dozens of papers on that exact question, I can't really express any impression I may have with scientific conviction.”
In an interview broadcast to a large audience, some of whom may be tempted to take advice based on a mere “vibe”, it is wise to simply say, “That's not my niche; I don't feel qualified to answer that here. All I can do is be clear about what I know and what I don't.” Back to the litmus test: if an expert regularly gives authoritative advice across numerous distinct niches (or worse, multiple fields), they are almost certainly overstating the breadth and confidence of their expertise. It takes a very long time to gain narrow scientific expertise in a robust sense. Someone can gain narrow expertise in multiple scientific fields, but the amount anyone can say with scientific conviction isn't going to span hundreds of podcast episodes; it's more like 1-2 episodes (or many episodes saying almost the same thing each time) before anything further is just informed armchair opinion outside their narrow expertise. Could it be that some science influencers really are so prolifically productive or gifted that they genuinely have mastery of so many areas? I think the answer is no, because the knowledge and experience they present themselves as having is often realistically equivalent to thousands of lifetimes' worth, compared to the rate at which epistemically aware experts gain confident knowledge.
Status from social recognition can be easily gamed
In practice, the status of being recognized as an expert is heavily based on social impression. (If there's an expert in the forest and nobody knows it, are they really an expert?) You can pick whatever definition of expert you want, but there are many ways to be incompetent in a field and still be regarded by society as an expert:
Simply calling yourself an expert until people believe you
Appearing publicly on a popular TV show, news channel, podcast, etc.
Being an influencer
Writing a best-selling book
Starting a business or cult
Even if such people are not genuine experts, the point stands that bad actors have a better shot at being heard, because fame and reputation favor charm, overconfidence, controversial opinions, narrative, political justification, and so on. Being epistemically cautious isn't going to win you massive fame.
As a side note, it's easy to distort the truth in contexts and media where the audience is likely to be more trusting. Take science documentaries: I don't naturally start watching a documentary expecting multiple deliberate misrepresentations, prepared to track which statements come from which sources and what the epistemic status of each idea is. Yet it's actually quite common for documentaries to contain multiple errors, and sometimes the whole production is propaganda. My BS radar does go off in many situations, especially when the narration leaps from “Fact X” to “We know that Interpretation Y” without any bridging evidence. Often Fact X is a recent discovery with minimal corroborating studies, which means there are likely no existing studies verifying that Interpretation Y is the only possible interpretation; often it isn't even possible to design a study that would robustly confirm Interpretation Y over alternative hypotheses.
For example, Sir David Attenborough once polled as the most trusted man in Britain. You wouldn't be alone in instinctively trusting what he narrates, despite the fact that he doesn't write the scripts he's given, he doesn't have to be an expert in the content, and you probably don't know him closely enough to gauge his relationship with epistemics. And Attenborough has in fact been criticized multiple times for his participation in documentaries that turned out to have serious flaws.
Closing note
In summary, there are many everyday contexts where you might be exposed to expert opinions, and in some of those contexts, truthfulness is at a heavy disadvantage relative to factors such as popularity, social appeal, controversy, and motivated reasoning. I've argued that such contexts arise frequently because the common paths to expertise (decades of learning, education, or experience) often lack any effective mechanism for “course correcting” the critical biases that an ordinary person may hold, and may still carry in spite of becoming an expert and being exposed to opposing views.