The Uncomfortable Luxury of Not Knowing

Feb 1, 2026

An essay on humility, conversation, and the systems that punish uncertainty

Let's face it, we're all tired of pretending we know things we don’t.

Here's a test: You're at a dinner with some acquaintances. Someone brings up the latest on “AI is gonna take over the world,” or maybe it's something about rising sea levels, or the geopolitical implications of some breaking news event. Watch what happens. Within thirty seconds, everyone has an opinion. Confident, articulated, ready to defend. Very few would say, "I don't know enough about this to have an informed take." Nobody truly admits they scrolled past three headlines and retained maybe two sentences.

Why?

Because in the economy of social interaction, "I don't know" is expensive. It costs you attention, credibility, status. It marks you as unprepared, disengaged, or worse: intellectually inadequate.

So we perform certainty instead.

We cobble together half-remembered facts, retrofit our existing worldview onto new information, speak with a confidence that has no relationship to our actual understanding. 

I'm calling this conversational theater. 

This performance of certainty creates cascading problems: it promotes shallow thinking in relationships, selects for the wrong people in hiring, and costs us the very humility required for actual learning.

When people can't bring themselves to say, "I don't know," that's not a personal flaw. It's an understandable reaction to systems built to punish anyone who shows even a hint of uncertainty.

Why We Can't Shut Up

The Attention Economy of Conversation

Let's start with a fundamental reframe that may initially seem counter-intuitive: conversation is not a neutral exchange of information. 

It's a competitive marketplace for a scarce resource: attention.

Sociologist Charles Derber spent years analyzing over 1,500 informal interactions to understand how attention flows in everyday dialogue. His finding: in individualistic societies like the United States, where social support systems are fragmented, the attention of others becomes a primary form of social currency. It's essential for validating your ego and social standing. And because attention is finite in any given interaction, people compete for it, often unconsciously.

Derber called this "conversational narcissism" in his book, The Pursuit of Attention. And it's crucial to understand he's not talking about clinical narcissism or malicious intent. This is a pervasive social habit, something that happens subtly among friends, family, and colleagues who genuinely care about each other.

The mechanism works through two types of responses: (1) the "shift-response" and (2) the "support-response."

The shift-response is the primary tool of conversational narcissism. It acknowledges what someone said but immediately pivots to redirect attention back to the responder. It creates conversations where people are essentially waiting for their turn to talk rather than actually listening.

Here's what it looks like:

  • Person A: "I'm feeling really overwhelmed with this new project."

  • Person B: "I know, I'm so swamped too. I have three deadlines this week and my boss is breathing down my neck."

Person B has technically stayed on topic (work stress), but they've successfully stolen the spotlight. The conversation is now about them.

In contrast, the support-response maintains focus on the original speaker:

  • Person A: "I'm feeling really overwhelmed with this new project."

  • Person B: "That sounds tough. What part of it is causing the most stress?"

Why does the shift-response dominate? Research shows that talking about yourself activates the brain's reward centers, the same dopamine pathways triggered by food and other pleasurable experiences. The same research found that 30-40% of daily conversation is spent sharing personal, subjective experiences. To say "I don't know" or to ask a question (a support-response) is to voluntarily forgo this neurological reward. It's an act of conversational altruism that goes unrewarded in a system designed for self-promotion.

Think about the last group convo you had. Maybe it was friends deciding on a restaurant. Watch how quickly it goes from "Where should we go?" to a competition of "Well, I heard X place is good..." or "I went to this spot last week and..." Nobody says "I haven't tried enough restaurants in this area to have a strong opinion." The person who speaks first with confidence often wins, regardless of actual expertise or whether their suggestion is genuinely good.

Now layer onto this the fact that we're doing this constantly, in every interaction, often without realizing it. The person who consistently says "I don't know" or "Tell me more about that" is essentially opting out of the attention economy. They're choosing to be invisible.

The Dunning-Kruger Performance

If conversational narcissism explains why people speak, the Dunning-Kruger Effect explains what they often say: confidently incorrect assertions.

The Dunning-Kruger Effect describes a cognitive bias where individuals with low ability in a specific domain massively overestimate their competence. The original 1999 study by Cornell psychologists David Dunning and Justin Kruger found that participants scoring in the bottom quartile of performance on tests of logic, grammar, and humor routinely estimated they were above the 60th percentile. Those in the 12th percentile self-rated their expertise at the 62nd percentile on average.

This happens because of what the researchers call a "dual burden": the skills required to produce a correct answer are virtually identical to the skills required to recognize a correct answer. If you lack the knowledge to do something well, you also lack the knowledge to realize you're doing it badly. As Dunning and Kruger wrote, "Those with limited knowledge in a domain suffer a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it."

This creates what's sometimes referred to as "the Peak of Mt. Stupid": maximum confidence at minimum knowledge.

Meanwhile, people with intermediate knowledge enter what's called "the Valley of Despair." They've learned enough to realize how much they don't know. They see the complexity, the exceptions, the gaps in their understanding. So they hedge: "I think maybe..." "It could be..." "I'm not entirely sure, but..."

And here's the cruel irony: high performers often underestimate their relative competence due to the "false-consensus effect." They assume tasks easy for them are easy for everyone. This leads them to express more doubt, which gets socially misread as lack of knowledge.

Let me make this concrete. Imagine a job interview for a technical role. The interviewer asks about a complex methodology.

  • Candidate A (actually at "Peak of Mt. Stupid"): "Oh yes, I'm very familiar with that approach. I'd implement it by [proceeds to give a confident but fundamentally flawed explanation]."

  • Candidate B (actually more knowledgeable, in "Valley of Despair"): "I have some experience with the underlying principles, but I haven't used that exact methodology in production. Here's what I understand about the problem space, and here's what I'd want to investigate further..."

Most interviewers would probably rate Candidate A higher. The confidence signals competence, even though it's actually signaling the opposite. Candidate B's humility, which is really just accurate metacognitive awareness, gets penalized.

The social implication is devastating. "I don't know" is often a marker of intermediate-to-high competence. 

It's a sign you've progressed beyond unconscious incompetence and recognize the vastness of the subject. But in the conversational marketplace described by Derber, this hesitation gets punished while the unearned confidence of the novice gets rewarded with attention, jobs, and even social status.

The Digital Algorithm Demands a Take

If face-to-face conversation incentivizes performing certainty, social media has turned that incentive into an absolute requirement for visibility.

Renée DiResta, formerly at Stanford's Internet Observatory and now at Georgetown's McCourt School of Public Policy, has extensively researched how platforms like Twitter/X, Facebook, and TikTok are engineered to prioritize "high-arousal" content, specifically content that elicits outrage, validation, or shock. The algorithm effectively imposes a tax on uncertainty and nuance.

In her book Invisible Rulers, DiResta notes that the influencer economy rewards people who can "drive narratives" rather than those who offer careful analysis. Posts that express ambivalence ("I'm still thinking about this" or "It's complicated") generate lower engagement metrics—fewer likes, shares, comments—than posts expressing moral or intellectual certainty ("This is unequivocally wrong" or "Here is the truth").

This creates a quantitative feedback loop. Users learn, often unconsciously, to adopt more extreme and certain positions to maintain their audience. Over time, influencers and regular users alike become caricatures of their own opinions, trapped in a performance of certainty to satisfy the algorithmic demand for clarity.

Think about how this plays out in real time. Breaking news happens, a policy announcement, a celebrity scandal, a geopolitical event. Your social media feed immediately fills with confident takes, both in headlines and in the comments section. "This is obviously because of X." "Anyone who thinks Y is delusional." Within two hours, people have hardened positions on something they learned about two hours ago.

The person who says "I'm still gathering information" or "I need to learn more before I have an opinion" either gets ignored (no engagement = algorithmically buried) or gets accused of not caring, of being complicit through silence.

Like face-to-face conversation, the digital architecture actively suppresses the very humility required for genuine understanding.

Where Humility Goes to Die

Individual courage matters, but it's not enough. The inability to say "I don't know" isn't just a personal failing; it's a structural problem. We've systematically eliminated the environments where practicing humility is safe, and we've replaced them with spaces where every interaction carries professional stakes.

The Disappearance of Low-Stakes Spaces

Sociologist Ray Oldenburg coined the term "third places" to describe informal public gathering spots that are distinct from home (first place) and work (second place). Think: the English pub, the French café, the German beer garden, the American barbershop or corner store. (I wrote briefly about third spaces in my last essay, On Deep Conversation & Connection.)

In his book The Great Good Place, Oldenburg identified specific characteristics that made these spaces crucial for community and genuine connection:

  • Neutral ground: Nobody's obligated to play host. Everyone can come and go freely.

  • Leveling effect: Social and economic hierarchies are suspended. A doctor and a mechanic engage as equals.

  • Conversation is the main activity: The purpose is “riffing” and interaction, not productivity.

  • Regulars create continuity: You see the same people over time, building weak ties and familiarity.

In these spaces, the stakes were fundamentally lower. Because your livelihood and social standing weren't explicitly on the line, the pressure to perform certainty dissolved. You could ask "stupid questions." You could admit you didn't know something. You could be wrong without it threatening your professional reputation.

Here's what this looked like in practice:

1970s scenario

You join a woodworking hobby group at a community center. You're not very good. You ask basic questions about techniques. You make mistakes. Nobody cares because the stakes are literally zero. You're just there to learn and hang out with people who share an interest. Over time, you get better. More importantly, you develop comfort with the learning process, with not knowing, with being a beginner.

2025 scenario

You want to join a woodworking group. You find one on Meetup. Before going, you watch YouTube tutorials so you don't look incompetent. At the meetup, people are posting Instagram stories. You're careful about what questions you ask because someone mentioned they work in the same industry as you—this could be a networking opportunity. You catch yourself performing competence even in your hobby. The space that should be low-stakes has become another performance arena.

These "weak ties," or relationships that are less demanding but consistent, are crucial for social integration and exposure to diverse perspectives. Third places were the primary infrastructure for weak-tie formation. And we've let that infrastructure decay.

Where exactly are we supposed to practice the art of being uncertain, of not knowing, when every physical space has been optimized for individual productivity and every interaction might be a networking opportunity?

The LinkedIn-ification of Identity

The structural problem isn't just physical: it's also digital, and thus, psychological. We've entered an era of "personal branding" where every interaction is potentially portfolio-worthy, and humility has become a luxury most people can't afford.

Social media platforms, particularly LinkedIn, have created an environment where your professional identity is always on display, always being curated, always being judged. The "thought leadership" economy rewards people who have thoughts, specifically, confident thoughts, on everything.

Watch what happens when someone tries to express genuine uncertainty on LinkedIn:

  • What someone wants to post: "I tried a new approach to [problem] and it completely failed. I'm genuinely not sure what I should have done differently or what I'm missing. Has anyone dealt with something similar? I could use some perspectives."

  • What they actually post: "I tried a new approach to [problem] and initially faced setbacks, but here are 5 transformative lessons I learned that made me a better leader! 🚀💡 #GrowthMindset #Leadership #Innovation"

Every failure must be a learning experience. Every uncertainty must be reframed as strategic thinking. Every admission of not-knowing must be packaged as curiosity-driven innovation. 

This is audience capture at scale. Users learn that vulnerability without redemption doesn't perform well algorithmically. Posts expressing genuine confusion or ongoing struggle get less engagement than posts presenting a neat narrative arc from problem to solution. So people adapt. They save the messy, uncertain parts for private conversations (if they admit them at all) and present only the polished, certain version publicly.

The cost is that we're all becoming caricatures of professional competence. And newer workers, watching these performances, internalize them as the standard. They think everyone else has it figured out because that's all they see.

When every interaction is potentially "on the record" (thanks to digital permanence), when your professional identity is always being curated and judged, the option to say "I don't know" becomes structurally expensive in a way it wasn't before.

Aside: Humility in Hiring

A note for founders and hiring managers building teams

The consequences of our certainty obsession aren't just conversational awkwardness. They fundamentally distort how we identify and select talent. Having spent time building psychometric frameworks and talent assessments over the last 5 years in different environments, I've seen this up close: our hiring systems reward performance of confidence over actual competence, and we're systematically selecting for the wrong people.

The Interview as Theater

Traditional job interviews are auditions. You're performing the role of "ideal employee" by answering questions that relate to the job. The interviewer is evaluating your performance skills, not necessarily your ability to do the job (unless they are trained properly).

Research on personnel selection by Kausel and colleagues found that managers presented with unstructured interview information exhibited significantly more overconfidence than those presented with test scores alone. The interviews actually diluted the predictive validity of standardized assessments, making hiring decisions less accurate while making decision-makers feel more certain.

The standard interview format is almost perfectly designed to advantage people who are good at performing certainty. Behavioral questions ("Tell me about a time you failed") have become so predictable that there are entire industries built around coaching people to craft redemption narratives. Everyone has a pre-packaged story where their "failure" was really just a stepping stone to growth and success.

Nobody says: "I failed at that project and honestly, I'm still not entirely sure what I should have done differently. I have theories, but I'm genuinely uncertain." That answer, despite being potentially the most intellectually honest and self-aware, would probably disqualify you (unless, again, the interviewer is trained properly).

Confidence Wins Over Competence

A Stanford study by Margaret Neale and Peter Belmi provides particularly damning evidence. They found that people from upper-class backgrounds display unearned overconfidence, and (here's the sad part) observers consistently rated these confident individuals as smarter and more hirable despite there being no objective evidence to suggest this.

"Being overconfident generally pays off," the researchers concluded, "and it's those who already have the most advantages who often benefit."

This creates a compounding problem in startup hiring specifically. Early-stage companies often rely heavily on "culture fit" interviews and unstructured conversations, precisely the formats most susceptible to confidence bias. Founders, stretched thin and trusting their instincts, may overestimate their ability to "just get a feel for someone" in a casual chat.

The result? Teams that select for people who are good at seeming competent rather than people who are actually good at learning, adapting, and honestly assessing problems, the skills that matter most in early-stage environments where nobody has all the answers. (I wrote about this in a past article, Why Every Startup Needs “I-O Psych” (Wait, What’s That?!).)

What We Should Actually Be Selecting For

Research on intellectual humility, defined as the recognition that your beliefs might be incorrect and a willingness to revise them, suggests it's actually a better predictor of learning, adaptability, and long-term performance than confidence.

A 2020 study by Porter and colleagues found that intellectual humility predicts "mastery behaviors" when learning: seeking challenge, exerting effort, and persisting through setbacks. The effect size was significant even after accounting for growth mindset and other factors.

Workplace research further suggests that intellectually humble employees demonstrate greater information-seeking and openness to learning, and that leader humility has positive impacts on team performance—particularly among younger workers who value authentic leadership.

Think about it through a team-based scenario:

Team A is filled with confident "move fast and break things" types. When bugs emerge or problems surface, nobody wants to admit they might have caused it. Issues get hidden, blamed on others, or explained away. Information doesn't flow honestly because everyone's protecting their image. The team might move fast initially, but they break things (repeatedly) and don't learn from it.

Team B has a culture where, "I don't know if my approach is optimal," or, "I might have made an error here," is normalized. Bugs get surfaced early. Problems are treated as collective learning opportunities rather than individual failures. Information flows honestly because psychological safety is high. The team learns faster, iterates more effectively, and produces better outcomes.

This mirrors the findings from Amy Edmondson's research on psychological safety in hospital teams. She discovered that the "best" teams (as rated by supervisors) appeared to make more medication errors than lower-performing teams. Paradox? No. The better teams weren't making more errors: they were reporting more of them. In lower-performing teams, a climate of fear and judgment caused nurses to hide mistakes or uncertainties to avoid punishment. In high-performing teams, the safety to say "I made a mistake" or "I'm not sure" allowed for early detection and correction.

"I don't know" is a critical data point for collective intelligence. When uncertainty is suppressed, systems lose the ability to learn and correct themselves.

What Better Hiring Looks Like

So what should hiring look like if we actually wanted to select for learning and adaptability rather than performance?

Traditional interview approach: "Tell me about your experience with [specific technical skill]." Candidate feels pressure to exaggerate experience or fake knowledge.

Better approach

"Here's a problem in [domain you're hiring for]. You haven't seen it before. Walk me through how you'd think about solving it. What information would you want? What assumptions would you make? What would you be uncertain about?" [Candidate has permission to think out loud, ask questions, and admit knowledge gaps.]

The difference is that the first approach selects for people who are good at having answers. The second selects for people who are good at finding answers—a far more valuable skill in a world where information changes rapidly.

We should be evaluating the quality of questions candidates ask, not just the polish of their answers. How they handle uncertainty and ambiguity. Their metacognitive awareness: do they know what they don't know? Their curiosity and learning orientation. How they think, not just what they know.

But that requires fundamentally rethinking what we're selecting for. And it requires acknowledging that our current systems don't just fail to select for these qualities—they actively select against them.

The Beautiful Mess of Admitting Uncertainty

The picture I've painted is pretty bleak. But here's something that gives me hope.

Research on the "Beautiful Mess Effect" by Bruk, Scholl, and Bless reveals something counterintuitive: we systematically overestimate how negatively others judge our vulnerability and uncertainty. When you say "I don't know," you experience it from the inside: you feel the discomfort, the exposure, the anxiety. You're focused on all the visceral, negative details.

But others experience your admission from the outside. They view it more abstractly, and they tend to gloss over the "messy" parts while seeing the "beauty" of authentic human connection. Research shows they interpret your vulnerability as courage, authenticity, and relatability: the exact opposite of what you fear.

Your internal experience of saying "I don't know": Excruciating. You feel exposed. You're convinced everyone now thinks you're incompetent.

Actual research-based prediction of others' reaction: You've just demonstrated metacognitive accuracy, problem-solving orientation, and intellectual honesty. Most people will rate your credibility higher, not lower.

The discomfort you feel is much more intense than the actual social cost. (With the caveat that this is truer for some people than others based on identity and status, but that's a structural problem to fix, not a reason to abandon humility.)

The Uncomfortable Luxury

Let's return to where we started: that dinner party where someone asks about AI takeover or whatever the trending topic is.

What if one person said, "I don't know enough about this to have an informed opinion"?

And what if, instead of that person being dismissed or the conversation moving on without them, others felt permission to do the same? What if that admission opened space for actual curiosity instead of performative certainty?

"I don't know" could become an invitation: 

Let's figure it out together. What do we actually need to understand here? What information are we missing?

We should build toward that culture. But pretending we already have it, or pretending that individual courage alone can overcome structural incentives to perform certainty, is naive.

But maybe the work isn't to make everyone humble. It's to make humility “affordable” for everyone.

To build spaces, like physical third places, organizational cultures, interview processes, where not knowing is the beginning of learning, not the end of credibility.

To create systems where intellectual humility is actually valued, not just given lip service in mission statements while being penalized in practice.

To recognize that the person who says, "I don't know," is often more competent than the person who confidently asserts… and to design our selection systems accordingly.

(Sorry to be sappy, I wanted to be inspirational.)

Next time you're tempted to fill silence with certainty, pause. 

Ask yourself: Do I actually know this, or am I performing knowledge? And why do I feel like I have to?

The research is clear: the conversations where we admit we're lost are, paradoxically, the only ones that can help us find our way. The teams where uncertainty is safe are the teams that actually learn. The people who know what they don't know are the people who figure things out.

We just have to work towards a self-aware culture that stops punishing them for it.

All work my own. Reuse or distribution with permission only.

Copyright © 2026 Benjamin Wong. All Rights Reserved.