Omission Bias

aka Omission-Commission Asymmetry · Inaction Bias

Judging harmful actions as worse than equally harmful inactions, and preferring to do nothing even when acting would be better.

WHAT IT IS

The glitch, explained plainly.

Imagine two kids at a pool. One kid pushes another into the water; a second kid watches someone slip in and doesn't grab their hand to help. The kid in the water gets just as wet either way, but almost everyone calls the pusher the 'bad' one, even though the kid who stood by and did nothing let the same bad thing happen.

Omission bias describes the systematic preference for harm caused by inaction over equal or lesser harm caused by action. People evaluate someone who actively does something harmful as more blameworthy, more causal, and more immoral than someone who passively allows the same harm to occur by doing nothing. This asymmetry persists even when the outcomes are objectively identical or when inaction produces demonstrably worse consequences. The bias operates in both self-interested decision-making (where people avoid acting to escape potential regret) and in moral judgments of others (where observers assign less blame to those who failed to act than to those who acted).

SOUND FAMILIAR?

Where it shows up.

  1. A parent reads that a new vaccine prevents a disease that kills 10 in 100,000 children. The vaccine itself causes a fatal reaction in 1 in 100,000 children. Despite the vaccine being statistically safer, the parent refuses to vaccinate, saying 'I couldn't live with myself if my child died from something I chose to do to them.'
  2. A hiring manager knows that keeping an underperforming team member is dragging down productivity and morale for the entire department. She has a strong replacement candidate lined up. But she delays the termination month after month, telling herself 'at least I'm not the one ruining his career,' even though the team's output continues to deteriorate.
  3. A financial advisor has a client whose portfolio is heavily concentrated in a single declining stock. Rebalancing into diversified funds would statistically reduce risk. The advisor avoids recommending the switch, reasoning that if the diversified funds also drop, the client will blame him for actively making the change — whereas if the original stock keeps declining, it will feel like 'just the market.'
  4. A city council member votes against a flood barrier project that would save an estimated 200 homes but require demolishing 15 homes in the construction path. She argues that actively destroying 15 homes is unconscionable, even though failing to build the barrier will likely result in 200 homes being destroyed by the next major flood.
  5. A software engineering lead discovers a critical security vulnerability in production. Deploying a patch carries a small risk of causing 30 minutes of downtime for users. He decides to 'monitor the situation' rather than deploy the patch, because if the patch causes downtime he'll be blamed for breaking things, whereas if hackers exploit the unpatched vulnerability it will seem like an external attack no one could have prevented.
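
The vaccine scenario above can be made concrete with a quick expected-value comparison. A minimal sketch, using only the per-100,000 rates stated in the scenario (illustrative numbers, not medical data; the function name is my own):

```python
# Expected fatalities per 100,000 children, using the rates stated in
# the vaccine scenario above (illustrative figures, not medical data).

def expected_deaths(population: int, fatality_rate: float) -> float:
    """Expected number of deaths in a population at a given per-person rate."""
    return population * fatality_rate

POP = 100_000
vaccinate = expected_deaths(POP, 1 / 100_000)    # fatal vaccine reaction
do_nothing = expected_deaths(POP, 10 / 100_000)  # fatal disease, unprevented

print(f"vaccinate:  {vaccinate:.0f} expected death(s) per {POP:,}")
print(f"do nothing: {do_nothing:.0f} expected deaths per {POP:,}")
```

Acting is ten times safer by the scenario's own numbers, yet omission bias makes the 1 self-inflicted death loom larger than the 10 allowed ones.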
IN DIFFERENT DOMAINS

Where it shows up at work.

The same glitch looks different depending on the terrain. Finance, medicine, a relationship, a team — same mechanism, different costume.

Finance & investing

Investors hold onto declining assets rather than actively selling at a loss, because the realized loss from selling feels like a self-inflicted wound, while the paper loss from inaction feels like an external event happening to them. This contributes to the disposition effect and portfolio inertia.

Medicine & diagnosis

Physicians and patients systematically prefer watchful waiting over active treatment when both carry equivalent risk, because adverse outcomes from treatment are perceived as iatrogenic harm while adverse outcomes from non-treatment are perceived as the natural course of disease. This is particularly pronounced in vaccine hesitancy, where parents judge vaccine side effects as far worse than equivalent disease complications.

Education & grading

Teachers may avoid intervening when they notice a student being socially excluded, because actively restructuring group dynamics risks making things worse and being blamed for it, while passively allowing the exclusion to continue feels like a less culpable position.

Relationships

People avoid having difficult but necessary conversations — about boundaries, unmet needs, or dealbreakers — because actively raising the issue risks causing a fight they'd feel responsible for, while letting the relationship slowly deteriorate through inaction feels like something that 'just happened.'

Tech & product

Product teams resist removing underperforming features or sunsetting legacy products because actively discontinuing something users rely on feels more harmful than passively allowing technical debt and fragmented user experiences to accumulate. Default opt-in settings exploit this bias, knowing users rarely take the active step of opting out.

Workplace & hiring

Managers delay firing poor performers, restructuring teams, or canceling failing projects because taking action that visibly harms someone feels worse than allowing the broader harm of inaction to continue invisibly. Performance review systems that require active ratings ('rate this person') face more resistance than those with default categories.

Politics & media

Legislators vote against policies that would save more lives than they cost (e.g., mandatory safety regulations) because the visible victims of regulation feel like the legislator's fault, while the statistical victims of non-regulation are attributed to circumstance. Media coverage reinforces this by extensively covering harms caused by government action while under-reporting harms caused by government inaction.

HOW TO SPOT IT

Ask yourself…

  • Am I avoiding an action primarily because I'd feel more responsible if it went wrong, even though doing nothing also has consequences?
  • If the harm from my inaction were attributed to me just as directly as harm from my action, would I still choose to do nothing?
  • Am I treating 'not deciding' as if it were not a decision at all?
HOW TO DEFEND AGAINST IT

The playbook.

  • Reframe inaction as a choice: explicitly write down 'If I do nothing, the expected outcome is X' alongside 'If I act, the expected outcome is Y' to make the consequences of omission visible.
  • Apply the 'newspaper test' symmetrically: ask yourself whether you'd be comfortable if your inaction and its consequences were reported with the same scrutiny as your action would be.
  • Use a pre-mortem for both options: imagine both action and inaction have led to the worst possible outcome, then evaluate which failure you'd find more defensible.
  • Assign explicit accountability for inaction: in team settings, formally record the decision not to act and who made it, to counteract the illusion that inaction carries no ownership.
  • Ask: 'If someone else were doing nothing in my position, would I judge them as negligent?' to leverage the third-person perspective.
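
The first and fourth moves in the playbook can be combined into a tiny decision ledger that records 'do nothing' as an explicit, owned option. A minimal sketch; the field names and example entries are illustrative, not part of any tool described here:

```python
# A minimal decision ledger: "do nothing" is logged as an explicit option
# with an expected outcome and an owner, so omission carries ownership too.
# (Sketch only; field names and entries are illustrative.)

from dataclasses import dataclass

@dataclass
class Option:
    name: str              # e.g. "deploy the patch" or "do nothing"
    expected_outcome: str  # the written-down consequence of this choice
    owner: str             # who is accountable if this option is chosen

def ledger(options: list[Option]) -> str:
    """Render every option, including inaction, as an owned choice."""
    return "\n".join(
        f"- {o.name}: {o.expected_outcome} (owner: {o.owner})"
        for o in options
    )

entries = [
    Option("deploy the patch", "~30 min possible downtime, vuln closed", "lead"),
    Option("do nothing", "vulnerability stays exploitable in production", "lead"),
]
print(ledger(entries))
```

Writing the 'do nothing' row forces the omission onto the same page, with the same owner, as the commission.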
FAMOUS CASES

In history.

  • Resistance to the pertussis (whooping cough) vaccine in the 1980s-90s: parents refused vaccination despite the disease being far more dangerous than the vaccine, because actively vaccinating and risking side effects felt worse than passively risking the disease.
  • Debates over active vs. passive euthanasia in medical ethics and law: the U.S. Supreme Court in Vacco v. Quill (1997) upheld the legal distinction between withdrawing life support (permitted) and physician-assisted suicide where a patient self-administers a lethal prescription (prohibited), even when patient outcomes and intentions are similar.
  • Hurricane seeding controversy: decision-makers resisted cloud-seeding interventions that could reduce hurricane damage, because if seeding altered the hurricane's path and harmed different people, the harm would feel like their fault — whereas natural hurricane damage would not.
WHERE IT COMES FROM
Academic origin

Ilana Ritov and Jonathan Baron (1990) first studied the phenomenon empirically in vaccination decisions. Mark Spranca, Elisa Minsk, and Jonathan Baron formalized the concept as 'omission bias' in their 1991 paper in the Journal of Experimental Social Psychology.

Evolutionary origin

In ancestral environments, novel actions carried unpredictable physical risks — trying an unknown food, confronting a predator, or altering a shelter. Inaction preserved the status quo, which had at least allowed survival thus far. A generalized heuristic of 'when in doubt, do nothing' reduced exposure to novel dangers. Furthermore, small-group social dynamics punished visible harmful actions more harshly than passive failures, reinforcing a reputational incentive to avoid commissions rather than omissions.

IN AI SYSTEMS

How the machines inherit it.

AI systems trained on human moral judgments inherit omission bias, generating recommendations that systematically favor inaction over action even when action would produce better outcomes. Language models asked to advise on moral dilemmas tend to endorse doing nothing more strongly than human participants. In content moderation, AI may be calibrated to over-penalize active harmful speech while under-detecting harmful silence or failure to correct misinformation, reflecting the human tendency to treat commissions as worse than omissions.

FREE FIELD ZINE

10 glitches quietly running your life.

A free field-zine PDF — ten cognitive glitches named, illustrated, with a defense move for each. Plus the weekly Glitch Report on Fridays — one bias named, two spotted in the wild, one defense move. Unsubscribe any time.

LAUNCH PRICE

Train against your blindspots.

50 cards are free to preview. Buyers unlock the rest of the deck plus the interactive training — Spot-the-Bias Quiz unlimited, Swipe Deck with spaced repetition, My Blindspots, Decision Pre-Flight, the Printable Deck + Cheat Sheets, and the Field Guide e-book. $29.50 (down from $59).

Unlock the full deck

Everything below — yours forever. Pay once, use across every device.

Half-off launch — limited to the first 100 readers. Auto-applied at checkout.
$59 $29.50
one-time payment · lifetime access
  • All interactive digital cards — search, filter, flip, shuffle on any device
  • Five training modes — Spot-the-Bias Quiz, Swipe Deck, Pre-Flight, Blindspots, Journal
  • Curated Lenses + Decision Templates + Defense Playbook
  • Printable Deck PDFs + Field Guide e-book + Cheat Sheets + Anki Export
  • Every future improvement, included
Unlock  $29.50

30-day refund · no questions asked
