Writing this essay took quite a bit of research, but this is a topic I have wanted to dig into for several years now. I went and read the original 1999 Dunning & Kruger paper as well as several follow-up papers. I also went and found some criticisms of the Dunning-Kruger theory.
This essay is not a restatement or regurgitation of the Dunning-Kruger Effect or how it can (supposedly) lead you astray. If you have seen short explainer videos or infographics about the Dunning-Kruger Effect and think you understand it, then this essay will offer you a much better understanding.
This will lead into a broader discussion of experts and advisors.
Preface
I have a general rule: the more people who think some counter-intuitive idea is true, the more likely it is to be false.
When I was a kid, I was told that daddy longlegs are the most poisonous spiders in the world, but that their mouths are too small to bite humans, so there was no real danger. I heard this over and over.
This is false.
To this day, I am often told that baby snakes are the most dangerous snakes because they “can’t control their venom like adult snakes.”
This is false.
Why does blood look blue when you look at your veins? A lot of people told me, sincerely, that blood is blue until it is exposed to air, at which point it immediately turns red.
This is false.
It was posited consistently that a penny dropped from a skyscraper would easily kill someone if it landed on them.
This is false.
Several years ago, many people would often say, “we only use 10% of our brains. If we could use our entire brain then we would be nearly superhuman.”
This is false.
I have been told over and over that if you know a little bit about a topic, then you will overestimate how much you know. But the more you learn, the more you realize what you don’t know. Only as you become an expert does your knowledge come into alignment with your confidence about your knowledge. This is known as the Dunning-Kruger Effect (DKE).
This is false.
I can’t remember the first time I learned about the Dunning-Kruger Effect, but the more I heard people talking about it or read what people wrote about it, the more skeptical I became. I would never invoke the Dunning-Kruger argument, and whenever I heard someone else bring it up, I would always say something like, “I don’t think we really understand that, and I am not sure you understand it as well as you think you do.”
It is often pointed out that if anyone suffers from the Dunning-Kruger Effect, it’s people who think they understand the Dunning-Kruger Effect. If you think the DKE is solid and irrefutable then I invite you to open your mind and let me introduce some doubt.
Random
First, the technical argument.
Imagine I show you two sets of graphed data. One graph shows Dunning and Kruger’s original data, and the other shows randomly generated data using the same parameters. If I asked you, “which of these graphs shows the DKE?” what would you say? The answer is that you would not be able to tell which one is the DKE, because the two graphs will look the same. This article points out that you cannot distinguish the DKE from randomly generated data. This is essentially because of a concept called “regression to the mean.”
Dunning and Kruger address this problem in their original paper, but their counterargument is basically just to assert that it is not merely regression to the mean. This is hardly convincing. To demonstrate the existence of something like the DKE, you would have to do it in a way that avoids regression to the mean. I am sure someone has attempted this, but I’ve not seen anything convincing. On top of that, you would need to demonstrate that if that overconfidence exists, it exists in real-world scenarios, not just on pen-and-paper tests.
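To make the regression-to-the-mean point concrete, here is a minimal sketch in Python. The uniform distributions and sample size are assumptions chosen purely for illustration (this is not Dunning and Kruger’s dataset); the point is only that data with zero real relationship between skill and self-assessment, presented the way the original paper presents it, produces the familiar DKE pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000  # hypothetical sample size, for illustration only

# Actual test scores and self-assessed scores drawn independently,
# i.e. zero real relationship between skill and self-perception.
actual = rng.uniform(0, 100, n)
perceived = rng.uniform(0, 100, n)

# Mimic the original presentation: group people by quartile of their
# *actual* score, then compare group means of actual vs. perceived.
quartile = np.digitize(actual, np.percentile(actual, [25, 50, 75]))

for q in range(4):
    mask = quartile == q
    print(f"Quartile {q + 1}: mean actual = {actual[mask].mean():5.1f}, "
          f"mean perceived = {perceived[mask].mean():5.1f}")
```

Because the perceived scores are pure noise, every quartile’s average self-assessment hovers around 50: the bottom quartile looks wildly overconfident and the top quartile looks underconfident, which is exactly the crossing pattern in the famous DKE chart, even though no one in this simulation has any self-insight at all.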
Let’s keep digging.
The 200th hour
When is a pilot most likely to crash? There is a general rule in the aviation world that a pilot is most likely to crash (or make a big mistake) around their 200th hour flying, though some have said that the 500th hour is more dangerous. Why is this? The idea is that once they get to the 200th hour (or whatever hour), their confidence and their knowledge are high, but their experience is not. This is not Dunning-Kruger. There is no initial spike in confidence followed by a trough of thinking they don’t know anything.1 No Army Aviation officer gets into the seat of a Black Hawk after two weeks of classroom instruction and says, “I paid attention in class…I am pretty sure I can do backflips in this thing now.” More likely, their confidence and their knowledge grow together over time. In fact, it is often the very seasoned pilots with vast knowledge who are supremely overconfident in their skills.
The guy who crashed that plane was a Lieutenant Colonel with two decades of experience. He was an expert flyer, but he was reckless. It wasn’t Dunning-Kruger that killed him; it was his personality.
The point here is twofold. First, in dangerous fields that have direct contact with reality, no one in their right mind would be overconfident. If you don’t know anything about plumbing, watching a few YouTube videos will not imbue you with the confidence to think you know everything there is to know about plumbing. The same is true of horse-training, or NASCAR pit crew operations.
Second, sometimes, maybe even oftentimes, it is not the ignorance and overconfidence of the laity, but the reckless overconfidence of experts that can cause enormous damage. There is a saying that goes, “the strongest swimmers are the most likely to drown.” The logic is that if you can’t swim, you won’t go near water, and if you do, you’ll stay near the shore or wear a life jacket.
The potential danger posed by experts means that experts should always be questioned by the laity and should be happy to explain what they know. The more resistant experts are to skepticism, the more suspicious the laity should be.
Poor Charlie
I recently (re)listened to Charlie Munger’s “24 Standard Causes of Human Misjudgment.” I’ve probably listened to it 15 times or more because it is quite good. About 40 minutes in, he starts talking about how to deal with different kinds of advisors, all of whom have their own incentives and subconscious psychological tendencies driving their behavior.
Munger points out that George Bernard Shaw wrote, “in the last analysis, every profession is a conspiracy against the laity.”
Munger’s point in quoting Shaw here is that people who are experts want you to pay them for their expertise or, more generally, they want to be paid for their expertise. And to get paid they need to seem valuable. And to seem valuable they will make it clear that you cannot do what they can do. After all, if you can do what they can do, why would you need them? And, according to them, you might think you know what they know, but you are clearly suffering from a cognitive bias of overconfidence.
And so, the Dunning-Kruger Effect is often used by “experts” as a cudgel to silence “non-experts” so that they can maintain the perception of their value as experts.
But don’t you find it funny that the “experts” who feel the need to invoke the DKE tend to be experts in fields that do not get rapid feedback from the environment? When a plumber tells you that he needs to do X, Y, and Z to fix the pipes, most people don’t say, “Well ACK-SHEW-UH-LEE, I went on Google and it said that I can just use duct tape.” But if you do say that, the plumber will just say, “okay, have fun with that, and please call me back when it doesn’t work.” He won’t launch into a tirade about how you are suffering from a cognitive bias. He doesn’t need to convince you; reality is reality with pipes and gravity, and no one does a better job convincing people than cold, hard reality. He knows what needs to be done to fix your leaky pipes, and you can either do it or not. If you want to be a know-it-all, that’s on you; have fun with your leaky pipes. At the same time, knowing a little about plumbing will enable you to ask good questions and get a sense of whether or not you should seek a second opinion. This is a crucial point. It’s not that you know better than the plumber; it’s that you learn just enough to prevent being snowed.
People are not only less likely to be overconfident in fields with direct contact with reality, but for those who are overconfident, it is very easy to demonstrate their ignorance. For example, people who think they would fare well in a physical fight, but who are untrained, can very quickly be shown that they haven’t the slightest chance. Put an untrained fighter against a Jiu Jitsu black belt, and it will take about 10 seconds for that person to realize that they have no idea what they are doing.
But imagine experts in fields that don’t come into direct contact with reality, where feedback from the environment is not even close to instantaneous. As Taleb says, “it is easier to macro-bullsh*t than micro-bullsh*t.” Taleb gives a whole list of both kinds of professions in The Black Swan, but macro-bullsh*tters are people like geopolitical strategists, (most) leadership coaches, people who talk about quantum computing (looking at you, Michio Kaku), public health experts (No masks! Wait, masks for all! Also, no milk! Wait, drink milk!), and economic forecasters (a favorite Taleb whipping boy).
So, when you hear someone invoke the DKE as a weapon in a conversation, it is more of a signal about the accuser than about the accused.
Follow Munger’s advice when he says, “Learn the basic elements of your advisor’s trade. You don’t have to learn very much, by the way. Because if you learn just a little, you can make him explain why he’s right.”
An expert shouldn’t accuse you of suffering from Dunning-Kruger. They should be able to explain to you, factually, why they disagree. If they can’t, then I would call their expertise into question. It may be the expert who is overconfident in their field and the lay person (or lesser expert) who has discovered an obvious flaw. ← follow this link to watch a clip from The Big Short.
Remember, Burry was a medical doctor who had a finance blog. He was a relative newcomer to finance as a hedge fund manager. And, yes, he did know more than Alan Greenspan and Hank Paulson.
Green Lumber
If you’ve never heard of the Green Lumber Fallacy, it comes from Nassim Taleb, who recounts a story from a book called What I Learned Losing a Million Dollars.
In the story, a guy knows everything there is to know about green lumber (freshly cut lumber). He knows where it is sourced, how it is made, how many people work at the lumber mills, how many trucks are dedicated to carrying the lumber, and so on. He knows all there is to know about green lumber. He decides to use this unparalleled knowledge to try to make a fortune trading green lumber commodities contracts on a financial exchange. He ends up losing all of his money to a veteran commodities trader who, he later finds out, thought that green lumber was lumber that was painted green. In other words, the veteran trader didn’t know anything about green lumber.
The point here is that someone can claim to have expertise in a certain area, but it may not be knowledge that actually matters.
Don’t Overdo It
If you don’t know something, then you don’t know it. No need to pretend. Overconfidence is never good, whether it is coming from a layperson or from an expert.
Tolstoy said, “A man is a fraction whose numerator is what he is and whose denominator is what he thinks of himself. The larger the denominator the smaller the fraction.”
Be humble and know enough to dare to ask stupid questions.
It should be noted that here I am responding to the typical U-Chart that is supposed to demonstrate the DKE, not the actual graph that shows the DKE.
This was very interesting and thought-provoking, but I remain unconvinced. For me, the main point of DKE is that when people learn a little about something they often tend to overestimate their expertise and insight. That's certainly true, and I've observed it many times (and been guilty of it as well!). Alexander Pope was riffing on this in the early 18th century, so it's hardly an original idea. Of course experts can be overconfident as well -- human foolishness knows no bounds! But the important point of DKE is not that we should trust experts blindly but that we should be wary of people with limited experience and strong opinions. A few minutes spent listening to any sports radio call-in show will provide you with all the evidence you might want.
And I bet every plumber has loads of stories of homeowner do-it-yourself disasters.
I think you're conflating competence with expertise. DKE looks at the relationship between confidence and competence, not expertise. This makes it consistent with all of your other points here, especially experts invoking DKE to dismiss criticism. This type of behavior is indicative of incompetence. The issue in wider society is that we have a competence crisis: expertise and competence have diverged as the technocratic managerial elite have maneuvered themselves into positions where reality testing is scarce and where achieving accountability, when it does occur, is nearly impossible.