13 Comments
Michelle Gálvez

Hi, I’m actually writing my thesis on the Dunning-Kruger effect, and I found your article very interesting. It’s a very controversial topic (everyone has a different opinion on it), so it’s hard to assess where the effect is coming from. Some say it’s a metacognitive problem, others say it’s a narcissistic response, and there are many other possible roots it could be linked to. And (as with almost every topic), the experts disagree. But I thought I’d share this for anyone wanting to learn more about it.

The “regression toward the mean” (RTM) effect is a natural movement of data: a case that sits near an extreme on one measurement tends to land closer to the mean on a second measurement. Imagine a ceiling and a floor: if you’re already near the ceiling on one measurement, there is little room left to go higher and plenty of room to come down. The same logic applies to people near the floor.
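
To make RTM concrete, here is a minimal simulation sketch (my own illustrative numbers, not data from any of the papers discussed): each person gets a fixed “true skill” plus random test noise, and the lowest scorers on one test drift back toward the mean on a retest even though nothing about them changed.

```python
import random
import statistics

random.seed(0)
n = 10_000
true_skill = [random.gauss(50, 10) for _ in range(n)]   # fixed ability per person
test1 = [s + random.gauss(0, 10) for s in true_skill]   # ability + measurement noise
test2 = [s + random.gauss(0, 10) for s in true_skill]   # independent retest noise

# Select the bottom decile on test 1 and see where those same people land on test 2.
cutoff = sorted(test1)[n // 10]
bottom = [i for i in range(n) if test1[i] <= cutoff]

print("bottom decile, test 1 mean:", round(statistics.mean(test1[i] for i in bottom), 1))
print("bottom decile, test 2 mean:", round(statistics.mean(test2[i] for i in bottom), 1))
print("overall test 2 mean:", round(statistics.mean(test2), 1))
# The group picked for being extreme on test 1 scores noticeably closer to the
# overall mean on test 2 -- pure regression toward the mean, no learning involved.
```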

So, the authors argued in their first paper that yes, RTM clearly affects the data, but it cannot explain everything. They reasoned that if RTM were the only factor at work, then training people in the skill or topic and asking them to re-evaluate how they had performed on the test shouldn’t change their self-assessments. However, if increasing their competence does change how they evaluate their performance, then something else must be at play. And indeed, when people actually learned the skill being tested, they assessed their past performance more accurately than those who remained incompetent. This was among other arguments for why it’s not just RTM.

Of course, other papers have shown different results, and it’s still up for debate whether the Dunning-Kruger effect is a real phenomenon. But nothing exists in a vacuum—there’s always the context in which we’re analyzing a subject.

Love this topic! Greetings.

Austin Caroe

Thanks for the thoughtful response!

I think there is still a lot more room for research here. For example, if you were to have the study participants rank themselves *relative to other participants*, I think you would likely see different results. I also think it would be revealing to have participants *describe* their performance with questions like "What did you like about this test?" or "Are there any questions you struggled with?" The deeper problem is that pen-and-paper tests do not sufficiently mimic behavior in the real world; Gerd Gigerenzer has written about that extensively.

Thanks again!

Theodore Whitfield

This was very interesting and thought-provoking, but I remain unconvinced. For me, the main point of DKE is that when people learn a little about something they often tend to overestimate their expertise and insight. That's certainly true, and I've observed it many times (and been guilty of it as well!). Alexander Pope was riffing on this in the early 18th century, so it's hardly an original idea. Of course experts can be overconfident as well -- human foolishness knows no bounds! But the important point of DKE is not that we should trust experts blindly but that we should be wary of people with limited experience and strong opinions. A few minutes spent listening to any sports radio call-in show will provide you with all the evidence you might want.

And I bet every plumber has loads of stories of homeowner do-it-yourself disasters.

Austin Caroe

I made a number of claims in the essay. Which were you unconvinced by?

The main claims are four-fold.

1. The original DKE research is flawed and doesn’t show what people think it shows. (https://theconversation.com/debunking-the-dunning-kruger-effect-the-least-skilled-people-know-how-much-they-dont-know-but-everyone-thinks-they-are-better-than-average-195527)

2. Experts in domains with little feedback are more dangerous than laypersons who are overconfident in that same domain. (DKE, if real, isn’t the real problem.)

3. Experts who use DKE as a weapon deserve *more* scrutiny. I agree plumbers see people do stupid stuff! But they don’t go around accusing people of suffering from DKE! They just fix the problem.

4. You should not be afraid to learn a little bit about a subject so that you can ask stupid questions.

As far as the actual academic side of DKE, check out this video: https://youtu.be/pWiQzKBSvE0?si=khnNQb0wkpdpWVeg

Grant Smith

I think you're conflating competence with expertise. DKE looks at the relationship between confidence and competence, not expertise. That makes it consistent with all of your other points here, especially experts invoking DKE to dismiss criticism; that type of behavior is itself indicative of incompetence. The issue in wider society is that we have a competence crisis: expertise and competence have diverged as the technocratic managerial elite have maneuvered themselves into positions where reality testing is scarce and, when it does occur, accountability is nearly impossible to achieve.

Steve the sailor

Very good, but I missed too much on the first go. I will listen again when I am done putting the salmon up in brine!

Jason

I never thought to even question the Dunning-Kruger effect and its...effects. Thanks!

Austin Caroe

Thanks for reading!

Eugine Nier

Wasn't the original Dunning-Kruger study notorious for using highly subjective measures, like which jokes were funnier, for its questions?

Austin Caroe

That was one of several assessments they used; the others were a test of English grammar and a test of logical reasoning. The flaws are less with the assessments themselves and more with how they interpreted the self-assessments by comparison. One thing I didn’t include in the essay, which I wish I had, is the most obvious flaw: people gravitate toward 7 on 10-point scales or 70 on 100-point scales because it’s non-committal. If I ask you how you think you did on something on a scale of 1-10, saying higher than 7 sounds braggy and saying lower than 7 is probably too humble. So it’s no surprise that in the self-assessments people gave themselves scores that hovered around 70.
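
A quick back-of-the-envelope sketch (made-up illustrative numbers, not the study’s data) shows how that anchoring alone can reproduce the familiar plot: if nearly everyone self-rates near the 70th percentile, the bottom quartile “overestimates” and the top quartile “underestimates” by construction.

```python
# Illustrative quartile averages and anchored self-estimates (assumed values).
actual_percentile = [12, 37, 62, 87]   # average actual percentile per quartile
self_estimate = [68, 70, 71, 73]       # nearly everyone anchors near 70

for actual, guess in zip(actual_percentile, self_estimate):
    print(f"actual {actual:2d} -> self-estimate {guess} (gap {guess - actual:+d})")
# Bottom quartile looks wildly overconfident, top quartile slightly underconfident --
# the classic Dunning-Kruger shape, produced by anchoring alone.
```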

Michael Woudenberg

I think the focus on DK is half right. There are folks who greatly overestimate their competence in matters where they lack knowledge. Where it's half wrong, which is what you're poking at, is that experts speaking outside their domain (green lumber) suffer DK much MORE than the average bloke, especially if their being wrong has no direct consequence (as Thomas Sowell talks about a lot). Bottom line: we can all suffer from hubris if we don't embrace humility.

Aanya Dawkins

Great post and reminder about confidence.

Diana Compton

So what do we call people who are supremely confident in their expertise but who are actually incompetent? Those are the folks I want to stay away from.
