There’s a kind of confidence that only develops after you’ve been publicly wrong about something you cared about deeply and chose to stay in the room anyway

Written by Dr. James Whitfield, Friday, 03 April 2026 08:42

The most reliable form of confidence doesn't come from an unbroken track record of success. It develops after you've been publicly wrong about something that mattered and chose to stay engaged with the people who witnessed it.


The confidence most people recognise, the kind that comes from repeated success, from never stumbling publicly, is the thinnest version of the thing. It breaks under weight. The version that actually holds, the kind you can build a career or a relationship or a crew dynamic on, comes from a different place entirely. It develops after you’ve been wrong, visibly, about something that mattered to you, and you stayed present for the aftermath.

I’ve spent years studying what happens to humans under pressure, including time working with teams in high-stress environments where mistakes carry real consequences and there’s nowhere to hide from them. Watching how people perform when everything they know is being tested, in confined spaces where every error is witnessed by the same small group of people you’ll eat breakfast with tomorrow, gave me a particular view of confidence. The individuals who performed best over long-duration missions weren’t the ones who’d never failed. They were the ones who had failed publicly and processed it without retreating.

[Image: astronaut isolation chamber]

What Public Failure Actually Does to the Brain

When you’re wrong about something privately, your brain treats it as a small correction. A recalibration. The cognitive cost is minimal because the social stakes are low. But public failure, especially about something you invested in emotionally, triggers a cascade of responses that are fundamentally different.

The shame response activates. Your threat detection systems fire as if you’re in physical danger. Research on psychological resilience and stress responses suggests that perceived threats to social standing trigger anxiety patterns, and physiological stress reactions, remarkably similar to those produced by genuine threats to physical safety.

Now consider what this means in a space-analogue environment, or on an actual mission. The room you want to leave is the same room you’ll be in for the next six months. The people who witnessed your error are your entire social world. The neurological impulse to flee has nowhere to go. This is what makes these environments such powerful laboratories for understanding what I’m describing: the staying isn’t optional, which means the psychological work that follows failure is compressed and unavoidable.

What happens next is where the divergence occurs. In ordinary life, most people leave the room. They switch fields, change subjects, quietly redirect their energy somewhere where nobody remembers the mistake. That’s a rational response. It reduces pain efficiently.

But it doesn’t build anything.

The People Who Stay

During my years working with high-performance teams, I observed something repeatedly in selection and training that never made it cleanly into the published research because it was hard to quantify. The candidates who had experienced significant professional failure before entering the programme, and had stayed in their field anyway, displayed a qualitatively different kind of composure under pressure.

They weren’t calmer. They weren’t more stoic. They were more honest about uncertainty. They said “I don’t know” faster. They flagged their own errors before others noticed. This is a specific behavioural pattern that emerges when someone has already survived the worst-case scenario of being publicly wrong and discovered that it didn’t actually destroy them.

In crew environments, this pattern was unmistakable. During simulation exercises where a team decision led to a systems failure, the members who had prior experience of professional setback were measurably quicker to say “that was my call, and it was wrong” without spiralling into self-recrimination or defensive justification. They treated the error as navigational data. The ones without that prior experience were more likely to freeze, deflect, or quietly disengage, all responses that in a confined operational environment can cascade into larger failures.

Research into psychological resilience and growth mindset supports this observation. Individuals who develop what researchers call a growth mindset, the belief that abilities can develop through effort and learning, show measurably different stress responses when confronted with setbacks. They process failure as information rather than as identity.

But here’s what the growth mindset literature often misses: it’s not just about believing you can improve. It’s about having stood in a room where people saw you fail and choosing to remain available for the next conversation. The belief is secondary. The behaviour is primary.

Why Staying in the Room Is the Operative Phrase

I use the phrase deliberately because it describes something physical, not just psychological. When you’ve been wrong publicly, there is a literal impulse to leave. To stop attending the meetings. To find reasons not to be present where the people who witnessed your failure might be.

In my recent piece on the fear of improvisation, I explored how the need for constant planning often masks a deeper avoidance of situations where you might be caught unprepared. The same mechanism operates after public failure, except now the avoidance has a specific memory attached to it. You’re not avoiding a theoretical risk. You’re avoiding the recurrence of a thing that already happened and already hurt.

Staying anyway is not brave in the dramatic sense. Nobody writes songs about the person who showed up to the next department meeting after their project proposal was torn apart. But it is the exact point where a different kind of confidence begins to form.

Cognitive flexibility, the ability to shift strategies and adapt to new information, plays a direct role here. Research has found that cognitive flexibility is significantly associated with higher performance: individuals who made fewer perseverative errors (essentially, who stopped repeating failed strategies) achieved better outcomes. The key word is perseverative. It means repeating the same approach despite evidence that it isn’t working.

Staying in the room after public failure is the opposite of perseveration. You’re not repeating the failed strategy. You’re showing up with updated information about yourself and your approach.

The Difference Between Resilience and Stubbornness

This distinction matters, and it’s one I’ve seen confused repeatedly in both organisational psychology and ordinary life. Stubbornness is staying in the room and insisting you were right. Resilience is staying in the room knowing you were wrong and being willing to engage with that reality.

In high-pressure team environments, the individuals who struggled most during challenging simulations were not the ones who lacked technical skill. They were the ones who couldn’t sit with having been wrong. When a team decision turned out badly, some people would immediately begin rewriting the narrative: claiming they had predicted the outcome, insisting the data had suggested a different approach, or pointing to limitations in what was known. These are normal human protective responses. But in a confined environment where you cannot escape the people who know the truth, they corrode trust rapidly. There is no corridor long enough to avoid your crewmate’s memory of what actually happened.

The team members who could acknowledge their error directly and explain their updated thinking without excessive self-flagellation or defensive explanation were invariably the ones other members wanted beside them when the next hard decision came. This is not a minor preference. In environments where your life may depend on the judgment of the person next to you, knowing that they can be honest about their mistakes is not a soft skill. It is a survival criterion.

Research on resilience as a mediating factor in emotional and behavioural outcomes suggests this isn’t just anecdotal. Psychological resilience appears to function as a buffer between stressful experiences and negative outcomes, but only when individuals actively engage with the stressor rather than avoiding it. Avoidance removes the immediate pain. Engagement builds the capacity to handle future pain.

[Image: crew teamwork space station]

What This Kind of Confidence Actually Looks Like

It doesn’t look like what most people expect. It’s not louder. It’s not more assertive. If anything, it’s quieter.

People who have survived public failure and stayed engaged develop a specific set of behaviours. They ask more questions. They qualify their statements more carefully, not out of uncertainty but out of earned respect for the complexity of the problems they’re working on. They’re less likely to perform certainty they don’t feel.

There’s a useful exploration of emotional legibility that Space Daily published recently, looking at how some people learn early to become unreadable because being understood made them vulnerable. The post-failure confidence I’m describing is different. These people aren’t hiding. They’re simply no longer performing.

When you’ve been publicly wrong and stayed present, the need to project confidence diminishes because you’ve already survived the thing that projected confidence was designed to prevent. You no longer need to appear infallible because you’ve been fallible, publicly, and the world continued.

My Own Education in Being Wrong

I spent years studying these dynamics in other people before I experienced them myself in a way that cut through my intellectual understanding. Early in my career, I published a model of team cohesion under stress that I was deeply invested in. It was elegant. It accounted for variables other models had missed. I presented it at conferences with genuine conviction. And then a longitudinal study, one I’d helped design, produced data that contradicted a central prediction of my own framework.

The error was public. Colleagues I respected had read my papers. Students had cited my work. And the data was clear: I had been wrong about something I’d staked professional credibility on.

The impulse to reframe, to argue that the study conditions were anomalous, to quietly pivot to adjacent questions, was powerful. I understood that impulse intimately because I’d watched it in the people I studied. Knowing about it didn’t make it weaker. That gap between understanding a thing and being subject to it is something I’ve come to see as the central challenge of applied psychology. We can study resilience beautifully. We can measure it, model it, predict it. But the version that matters, the kind that changes how you show up, only forms through exposure. Through being the one who got it wrong.

What I did instead of retreating was stay. I revised the model publicly, acknowledged what the data showed, and continued working in the same field with the same colleagues. It was not dramatic. It was deeply uncomfortable. And it taught me more about the phenomenon I’d been researching than any of the studies I’d designed to investigate it from the outside.

The Metacognitive Dimension

What makes staying in the room after failure transformative rather than merely painful is the cognitive work you do while you’re there. Research with university students found that regulation of cognition, rather than mere knowledge of cognition, was the component that predicted better outcomes. Knowing about your thinking processes wasn’t enough. What mattered was actively managing them: monitoring your strategies, evaluating their effectiveness, adjusting when they weren’t working.

This maps precisely onto what I observed in team dynamics. The individuals who knew they might be wrong (knowledge of cognition) but didn’t act on that awareness performed identically to the ones who never considered the possibility at all. The ones who actively monitored their own reasoning, who checked their assumptions against incoming data, who adjusted their approach based on real-time feedback, were the ones who built genuine trust with their teams.

And the starting point for that active regulation was almost always a previous failure. Not a simulated one. A real one, with real witnesses.

Why Organisations Get This Backwards

Most institutional cultures punish public failure and reward the appearance of consistency. Promotion criteria favour unbroken track records. Annual reviews emphasise what went right. The implicit message is clear: don’t be the one who was wrong.

This creates a specific kind of fragility. Individuals who have never been publicly wrong, or who have been but left before the consequences could teach them anything, carry a hidden vulnerability. They perform confidence fluently. They’re often promoted quickly. But they’re operating without the calibration that real failure provides.

Studies on school climate and psychological resilience indicate that environments which support creative risk-taking predict stronger resilience outcomes. The mechanism isn’t mysterious: when the cost of being wrong is manageable, people are more likely to stay engaged after failure. When the cost is career-ending, they leave. And when they leave, they take with them the possibility of developing the deeper confidence that only post-failure engagement produces.

Space agencies understand this better than most, though imperfectly. Simulation failures are designed into training precisely because the organisation needs crew members who have practised being wrong. But there’s a limit to what simulation can teach. The emotional weight of being wrong in front of people whose respect you need cannot be fully replicated in a training environment. This is why the best crews are not assembled from people with flawless records. They’re assembled from people whose records include failures they didn’t walk away from.

The Specific Confidence That Emerges

I want to be precise about what I mean because “confidence” is used so loosely that it covers everything from public speaking ease to delusional self-regard.

The confidence that develops after public failure and continued engagement has three characteristics I’ve observed consistently.

First, it is calibrated. People who have been publicly wrong develop a better sense of what they actually know versus what they assume. Research on mindfulness and mental resilience shows that individuals with higher resilience demonstrate greater awareness of their own cognitive states. Being wrong teaches you to distinguish between conviction and evidence, a distinction that untested confidence rarely makes.

Second, it is portable. Because it’s not based on success in a specific domain, it transfers across contexts. Someone who was publicly wrong about a research hypothesis and processed that experience can bring the same groundedness to a relationship disagreement or a career pivot.

Third, it includes vulnerability. People with this kind of confidence are often willing to express uncertainty or acknowledge the possibility of being wrong without it diminishing their authority. Because they’ve already been wrong. Saying it out loud isn’t a confession. It’s just an accurate description of reality.

What This Means for How We Think About Growth

The popular framing of personal growth tends to be additive. You learn new skills. You gain experience. You accumulate knowledge. But the kind of confidence I’m describing develops through subtraction. Through losing certainty you didn’t realise was unearned. Through having something stripped away publicly and discovering what remains.

As I explored in my recent piece on conversational fearlessness, many of the behaviours we admire as courage are really just the residue of someone having already faced the thing most people are still avoiding. The person who speaks uncomfortable truths isn’t necessarily brave. They’ve simply already experienced the consequences of staying silent, and they know which discomfort they prefer.

Post-failure confidence operates on the same principle. You don’t choose it because it feels better than ordinary confidence. You develop it because ordinary confidence already failed you, publicly, and you needed something that would hold.

This is ultimately what my years of studying humans in extreme environments taught me, not through the data but through the pattern that emerged beneath it. The crews that functioned best were not the ones who avoided failure. They were the ones where every member had already been wrong about something that mattered, had stayed present for the fallout, and had come out the other side still willing to make the next call. That willingness, earned rather than assumed, is the foundation of every meaningful form of confidence I’ve encountered.

If you’ve been wrong about something you cared about, in front of people whose opinions mattered to you, and you stayed, you already have the raw material. The confidence is forming whether you feel it or not. The staying was the hard part.

Everything after that is just showing up again tomorrow.

Photo by T Leish on Pexels

