
The people who admit what they don’t know aren’t being modest. They’ve crossed a threshold of competence that most people never reach.

Written by Marcus Rivera, Friday, 10 April 2026 04:07

The popular Dunning-Kruger narrative says incompetent people can't recognize their incompetence. But recent research suggests the real story is different: nearly everyone overestimates themselves, and the rare people who accurately admit what they don't know have crossed a metacognitive threshold that correlates with genuine competence.

This post appeared first on Space Daily.

When someone in a meeting admits they don’t know enough about a topic to have a strong opinion, the room goes quiet for a beat, and then the conversation moves on as if nothing important just happened. Yet what that person demonstrated is a form of competence that years of psychological research suggest is far rarer, and more valuable, than most organizations realize.


The Dunning-Kruger Story You Were Told

The popular version goes like this: stupid people don’t know they’re stupid. It’s clean, funny, and satisfying. It became one of psychology’s most shareable ideas. It spread through TED talks, Twitter threads, and workplace culture. It gave everyone a framework for dismissing people they disagreed with.

The original 1999 paper by David Dunning and Justin Kruger at Cornell University tested undergraduates on logic, then asked them to estimate both their raw scores and how they performed relative to peers. The headline finding: people in the bottom quartile dramatically overestimated their abilities, while skilled people slightly underestimated theirs. Incompetence, the argument went, robs you of the very tools you’d need to recognize your incompetence.

That story became cultural shorthand. It was deployed to explain everything from bad management to election outcomes. But the version of it most people carry around in their heads may be fundamentally wrong.

What the Math Actually Shows

Eric C. Gaze, a math professor, and his colleagues published work arguing that the Dunning-Kruger effect is largely a statistical artifact rather than a real cognitive phenomenon. Their argument is specific and worth understanding, because it changes the implications of the research considerably.

Here’s the core issue. In the original study, both the lowest-scoring and highest-scoring groups estimated similar performance levels, but the bottom group actually scored significantly lower while the top group scored significantly higher. That’s an overestimate for the low scorers and an underestimate for the high scorers. Not great, but not catastrophic either. Both groups were somewhat wrong about their raw performance.

Where the dramatic effect shows up is in the peer-comparison measure. The lowest-scoring students estimated they did better than most test-takers, when by definition they outperformed only a small percentage on average. That gap looks devastating.

But Gaze and his colleagues demonstrated that this gap is mathematically inevitable. They generated fictional people with randomly assigned test scores and randomly assigned self-assessments. When they applied Dunning and Kruger’s same analytical protocol, the “effect” appeared in the random data too. The bottom quartile showed substantial overestimation with zero human cognition involved. The research design itself produces the pattern.
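Gaze’s point is easy to see for yourself. The following is a minimal, illustrative sketch, not the authors’ published code: the sample size, uniform percentile scores, and analysis details are my assumptions. Both the test score and the self-assessment of each fictional person are drawn independently at random, so no one has any genuine self-knowledge at all, yet the classic quartile pattern still appears:

```python
import random

random.seed(0)

# Fictional people: test score and self-assessed percentile are BOTH
# assigned at random and independently of each other.
N = 10_000
people = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(N)]

# Apply the same analytical protocol as the original study: rank everyone
# by actual score, split into quartiles, and compare each quartile's
# average actual percentile with its average self-estimate.
people.sort(key=lambda p: p[0])
quartiles = [people[i * N // 4 : (i + 1) * N // 4] for i in range(4)]

for q, group in enumerate(quartiles, start=1):
    actual = sum(score for score, _ in group) / len(group)
    guessed = sum(guess for _, guess in group) / len(group)
    print(f"Quartile {q}: actual ~{actual:.0f}th pct, self-estimate ~{guessed:.0f}th pct")
```

Because every group’s random self-estimates average out near the 50th percentile, the bottom quartile appears to overestimate itself by roughly 37 points and the top quartile appears to underestimate itself by a similar margin, with zero cognition involved. The sorting step alone manufactures the “effect.”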

The real finding, Gaze argues, is simpler and less dramatic: most people think they’re better than average. That’s the better-than-average effect, and it applies across the skill spectrum. Studies consistently show that the vast majority of people rate themselves above their peers, whether as drivers or in professional settings. It’s a universal human tendency, not something unique to the incompetent.

So What Actually Separates the People Who Know?

If the popular Dunning-Kruger narrative overestimates how clueless low performers are, and if the real story is that nearly everyone overestimates themselves, then the interesting question flips. It’s not “why are unskilled people so unaware?” It’s “what’s different about the people who break through the universal bias?”

Research has shown that among students who scored in the bottom quarter on assessments, the majority were fairly accurate at estimating their real ability. Only a small minority significantly overestimated themselves. The unskilled-and-unaware archetype exists, but it describes outliers, not the default state of less-skilled people.

This reframing matters because it points to something the pop-culture version of Dunning-Kruger obscures: the ability to admit uncertainty isn’t just humility. It’s a specific cognitive skill called metacognition, the capacity to think about your own thinking, to monitor the quality of your reasoning in real time.

And while the popular narrative treats poor metacognition as a symptom of being dumb, it’s actually independent of general intelligence. Smart people fall into the trap too, especially when they wander into domains where their intelligence doesn’t translate into actual knowledge. Being smart and being knowledgeable are not the same thing. Someone can be brilliant at systems thinking and completely wrong about immunology. I know this from personal experience: my knowledge runs deep on institutional politics, budget mechanics, and how policy gets made in Washington. It doesn’t extend to spacecraft engineering or hard physics, and I’ve learned that being clear about that boundary is more useful than pretending it doesn’t exist.

This intelligence-knowledge distinction is where a specific population gets hit especially hard: people who grew up being identified as gifted or intellectually advanced. If your identity is built on being the smartest person in the room, admitting ignorance feels existentially threatening. It’s not just a knowledge gap. It’s an identity crisis. So the former smart kid fakes it. They project confidence about topics they only partially understand. They offer opinions on subjects they haven’t studied because silence feels like failure.

The real risk isn’t that low-skilled people are too dumb to know they’re incompetent. The real risk is that everyone, smart and otherwise, defaults to overconfidence, and that our social systems reward that overconfidence instead of punishing it.


Why Admitting Ignorance Is a Competence Signal

The conventional wisdom treats confidence as the primary signal of competence. We promote confident people. We hire confident people. We listen to whoever sounds most certain in a meeting, regardless of whether their certainty is earned.

But the research points in a different direction. If the universal tendency is to think we’re better than average, then the person who breaks that pattern by admitting uncertainty or acknowledging the limits of their expertise is doing something cognitively unusual. They’re overriding a default setting.

This is the threshold the title refers to. It’s not modesty. Modesty is a social performance, a way of managing how others perceive you. What I’m describing is a functional capability: the ability to monitor your own cognition, to notice when you’re operating on assumption rather than knowledge, and to flag that distinction out loud. That’s a skill. And it correlates, in study after study, with actual competence.

We’ve explored the loneliness of competence before, and this connects directly. The person who knows what they don’t know often finds themselves isolated. They can’t participate in the collective confidence game that defines so many professional environments. When everyone else in the room is projecting certainty, the person who pauses and wants to examine the data before committing reads as hesitant, not rigorous.

What distinguishes them is that they’ve somehow broken through the universal tendency toward overestimation. They’ve developed the internal monitoring system that lets them distinguish between genuine knowledge and assumed knowledge. That distinction is subtle. Acting on it takes a form of courage that looks nothing like the boldness our culture celebrates. It looks like pausing before answering, double-checking information, and asking questions that reveal uncertainty to a room full of people performing confidence. It’s quiet. It’s functionally invisible. And it is the single most reliable predictor of someone who actually knows what they’re doing.

What This Looks Like in Practice

I spent five years on Capitol Hill working on space policy through the Senate Commerce Committee, and the thing that struck me most about effective policymakers was not how much they knew. It was how precisely they understood the borders of their knowledge. The best senators on the committee could tell you exactly which parts of a NASA budget proposal they understood and which parts they needed a staffer or expert to interpret. They didn’t pretend propulsion engineering was their specialty. They asked pointed questions.

The less effective ones were often the most confident. They’d make sweeping claims about program viability or technical readiness based on a single briefing. Not because they were dumb. Most of them were extremely smart. They simply hadn’t developed the metacognitive habit of distinguishing between understanding something and having heard about it once.

Metacognition is trainable. That’s one of the most practically useful findings in this body of research. As an analysis in Psychology Today notes, the Dunning-Kruger effect is not permanent. People who receive targeted training in a skill don’t just get better at the skill; they get better at recognizing the limits of their ability. Improving competence improves self-assessment. The two develop together.

This suggests a counterintuitive approach to professional development. Instead of training people to be more confident, organizations might benefit from training people to be more accurate about what they know and don’t know. That’s a harder sell in a culture that treats confidence as currency. But it produces better decisions.

What Organizations Get Wrong

Most institutional cultures are built to reward the opposite behavior. Promotion systems, hiring interviews, leadership assessments: all of them are biased toward people who project certainty. The person who hedges, qualifies, and admits limitations reads as weak in a system calibrated to detect strength.

This creates a selection effect. Over time, organizations accumulate leaders who are good at seeming certain and filter out people who are good at being accurate. The consequences compound. Decisions get made with false confidence. Mistakes don’t get flagged early because the culture punishes uncertainty. And the people who could have caught the problem stay quiet because they’ve learned that admitting uncertainty costs more than being wrong does.

I’ve watched this dynamic play out in congressional budget negotiations. The staffers who were most useful were the ones who would mark up a briefing document with handwritten notes about which claims they could verify and which ones they couldn’t. The ones who caused problems were the ones who summarized everything with equal confidence, flattening the distinction between what they’d confirmed and what they’d assumed.

The best leaders I’ve worked with understood this instinctively. They asked questions that probed for areas of uncertainty in the analysis. “What don’t we know yet?” is a metacognitive prompt. It forces the room to distinguish between knowledge and assumption. And it creates space for the people who know what they don’t know to speak without penalty.

Organizations that want better outcomes don’t need louder voices in the room. They need systems that treat the admission of uncertainty as signal rather than noise. That means restructuring how they evaluate talent, how they run meetings, and how they define leadership itself. It means promoting the person who says “I’d need to check on that” over the person who improvises an answer that sounds authoritative. It means building cultures where the most valuable thing someone can say is “here’s where my knowledge runs out.”

The Threshold Most People Never Cross

Crossing the metacognitive threshold requires three things simultaneously. First, the cognitive ability to monitor your own reasoning. Second, the emotional security to tolerate not knowing. Third, the social willingness to express uncertainty in environments that punish it.

Most people can develop the first capacity with training. The second is harder because it’s tangled up with identity, ego, and the terror of appearing incompetent. The third is the most difficult of all because it requires you to accept a real social cost for being honest about your limitations.

That’s why the people who do all three aren’t being modest. They’re demonstrating a form of competence that integrates cognitive skill, emotional regulation, and social courage. It’s one of the rarest combinations in professional life. It looks like nothing from the outside. From the inside, it’s the hardest thing.

The research, properly read, doesn’t say that dumb people think they’re smart. It says that almost everyone overestimates themselves, and the people who don’t are doing something psychologically distinctive. They’ve crossed a line that most people never approach because the incentives all point the other way.

When someone in your next meeting admits uncertainty, pay attention. That quiet moment when the room goes still and then moves on? That’s the sound of the most competent person in the room being ignored. The organizations that learn to hear it will make better decisions. The ones that don’t will keep promoting the loudest voice and wondering why things keep going wrong.


