Normalcy Bias

aka Normality Bias · Negative Panic · Analysis Paralysis

Assuming life will continue as normal despite clear warning signs, underestimating both the likelihood and severity of disruption.

WHAT IT IS

The glitch, explained plainly.

Imagine you've walked the same path to school every single day and nothing bad has ever happened. Then one day, someone tells you a big scary dog is loose on that path. Instead of taking a different route, you think, 'Nah, nothing bad ever happens on my walk,' and you keep going the same way — because your brain thinks tomorrow will be just like yesterday.

Normalcy bias describes the deeply ingrained human tendency to default to the assumption that the future will resemble the past, even when confronted with compelling evidence of an impending disruption. People affected by this bias interpret warning signals through the lens of prior safe experience, dismissing or reframing alarming information to fit their existing model of normalcy. This leads to delayed evacuation, inadequate preparation, and a dangerous reliance on the idea that because something catastrophic has never happened before, it never will. Research consistently finds that approximately 70–80% of people display normalcy bias during emergencies, often entering a 'milling' phase where they seek confirmation from multiple sources before taking any protective action.

SOUND FAMILIAR?

Where it shows up.

  1. A coastal town receives an official tsunami warning after a major offshore earthquake. Maria, who has lived there for 30 years and survived several earthquake scares without any tsunami ever materializing, tells her family there's no need to drive to higher ground. 'They always overreact with these warnings,' she says, continuing to prepare dinner as the sirens blare outside.
  2. A cybersecurity team presents the board of directors with evidence that their company's outdated firewall has vulnerabilities identical to those exploited in three recent high-profile breaches at similar firms. The CEO tables the discussion, saying their systems have worked fine for years and a major overhaul isn't justified. 'We've never been breached before — let's revisit this next quarter.'
  3. During the early weeks of a novel respiratory virus outbreak in a distant country, Dr. Patel reads daily WHO updates showing exponential case growth and a concerning mortality rate. However, because no cases have yet appeared in his city, he does not alter his clinic's protocols or order additional PPE supplies. He reasons that international outbreaks rarely reach his region and continues with standard procedures.
  4. An investment analyst notices that the same overleveraged derivative structures that preceded the 2008 financial crisis are reappearing across major banks. She drafts a warning memo but ultimately softens it before sending, concluding that modern regulatory frameworks make a repeat collapse implausible. She reduces her hedging positions, reasoning that the system has been stable for over a decade.
  5. Residents of a valley downstream from an aging dam are informed that engineers have rated the structure as 'deficient' and that heavy spring rains could cause a failure. Most residents attend the town meeting but few make evacuation plans. One lifelong resident summarizes the prevailing sentiment: 'That dam has held for 60 years — my father grew up under it and his father before him. Engineers always find something to worry about.'

IN DIFFERENT DOMAINS

Where it shows up at work.

The same glitch looks different depending on the terrain. Finance, medicine, a relationship, a team — same mechanism, different costume.

Finance & investing

Investors and institutions underestimate the probability of market crashes, asset bubbles bursting, or systemic failures because extended periods of stability make catastrophic scenarios feel implausible. This leads to underhedging, overleveraging, and delayed responses to deteriorating economic indicators.

Medicine & diagnosis

Patients delay seeking medical attention for worsening symptoms because they have historically been healthy, interpreting alarming signs as benign. Clinicians may also underestimate the severity of emerging epidemics or rare diagnoses because their daily experience is dominated by routine cases.

Education & schools

School administrators underprepare for crises such as active shooter events, severe weather, or pandemics because such events have never occurred at their institution. Fire drills and lockdown procedures are treated as bureaucratic formalities rather than rehearsals for genuinely possible emergencies.

Relationships

People remain in deteriorating or abusive relationships far longer than warranted because they normalize escalating warning signs, assuming the relationship will return to how it was during better times. Red flags are reframed as temporary rough patches.

Tech & product

Development teams underinvest in disaster recovery, backup systems, and security hardening because their infrastructure has never experienced a catastrophic failure. Legacy systems remain unpatched because breaches have never occurred, creating mounting technical debt.

Workplace & hiring

Organizations fail to create contingency plans for key-person departures, supply chain disruptions, or regulatory shifts because operations have been stable. Safety protocols in industrial settings become lax when no accidents have occurred for extended periods.

Politics & media

Governments and publics underreact to intelligence warnings about terrorist threats, emerging pandemics, or geopolitical escalation because such events seem remote from daily experience. Media coverage of early warning signs is minimal until the crisis is already unfolding.

HOW TO SPOT IT

Ask yourself…

  • Am I dismissing this warning primarily because 'this has never happened before' rather than because I've evaluated the actual evidence?
  • Am I waiting for other people to react first before I decide this is serious?
  • Am I interpreting ambiguous signals in the most optimistic way possible to avoid disrupting my routine?
HOW TO DEFEND AGAINST IT

The playbook.

  • Pre-commit to action plans before a crisis: decide in advance what specific trigger (e.g., official evacuation order) will cause you to act, removing the need for real-time deliberation.
  • Conduct 'pre-mortems' regularly: vividly imagine the worst-case scenario has already happened and work backward to identify what you failed to do — this makes the threat cognitively available.
  • Actively seek disconfirming evidence: when you catch yourself thinking 'it'll be fine,' deliberately look for reasons it might not be.
  • Assign a 'devil's advocate' in group settings whose explicit role is to argue for the worst-case interpretation of ambiguous information.
  • Reduce the influence of social milling: decide on personal action thresholds independent of what others are doing, and rehearse acting on them.
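The first and last moves in the playbook amount to turning an in-the-moment judgment into a rule decided in advance. A minimal sketch of that idea (the trigger names here are illustrative, not from the original):

```python
def decide(signals: set, triggers: set) -> bool:
    """Act if any pre-committed trigger appears among the observed
    signals -- no real-time debate about whether this time is normal."""
    return bool(triggers & signals)

# Decided in advance, on a calm day:
triggers = {"official evacuation order", "sirens + official alert"}

# During the event, social milling ("the neighbors are staying put")
# is deliberately excluded from the rule -- only the pre-committed
# triggers count.
observed = {"sirens + official alert", "neighbors staying put"}
acting = decide(observed, triggers)  # True: a trigger fired, so act
```

The point of the set intersection is that reassuring signals cannot veto a trigger: once any pre-committed condition is met, the decision is already made.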
FAMOUS CASES

In history.

  • Hurricane Katrina (2005): Thousands of New Orleans residents refused to evacuate despite mandatory evacuation orders, having weathered previous hurricanes without major harm.
  • September 11 attacks (2001): Studies found that approximately 70% of WTC survivors spoke with others before evacuating, and many returned to offices to shut down computers before leaving.
  • Eruption of Mount Vesuvius (79 AD): Residents of Pompeii reportedly watched the eruption for hours without evacuating, as the volcano had not erupted in living memory.
  • Sinking of the Titanic (1912): Passengers and crew were slow to board lifeboats, partly because the idea of the ship actually sinking lay outside their mental model of what could happen. Early lifeboats launched only partially filled.
  • COVID-19 pandemic (2020): Despite early warnings from health authorities, many governments and citizens worldwide initially downplayed the severity, continuing normal activities as the virus spread exponentially.
WHERE IT COMES FROM
Academic origin

The concept emerged from disaster sociology research, notably Thomas E. Drabek's 1986 work 'Human System Responses to Disaster' and Enrico L. Quarantelli's research on disaster behavior in the 1980s. Haim Omer and Nahman Alon formalized the term in their 1994 paper 'The Continuity Principle' in the American Journal of Community Psychology. Earlier Japanese research by Okabe and Mikami (1982) described the phenomenon in the context of disaster cue interpretation.

Evolutionary origin

In ancestral environments, most days were genuinely normal and most perceived threats turned out to be false alarms. An organism that panicked at every rustling bush would waste enormous energy, attract predators through conspicuous flight behavior, and abandon valuable resources unnecessarily. The brain evolved to treat stability as the default prediction and to require strong, repeated evidence before shifting into a costly emergency response mode, conserving metabolic energy and maintaining group cohesion.

IN AI SYSTEMS

How the machines inherit it.

Machine learning models trained predominantly on historical data inherit a form of normalcy bias by learning that the statistical baseline is the most likely outcome. This makes models poor at predicting rare, unprecedented events (black swans) such as financial crashes, pandemics, or novel attack vectors. Anomaly detection systems calibrated on long stretches of normal data may have thresholds set too high, dismissing genuine emerging threats as noise because they deviate from the trained 'normal' distribution.
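A minimal sketch of that failure mode, using a simple z-score detector (the numbers and threshold values are illustrative assumptions, not from any real system):

```python
import statistics

def zscore_alarm(baseline, observation, threshold):
    """Flag `observation` as anomalous if it sits more than
    `threshold` standard deviations from the baseline mean."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(observation - mu) / sigma > threshold

# A long stretch of calm data: readings tightly clustered around 100.
baseline = [100 + 0.5 * ((i % 5) - 2) for i in range(200)]

# An emerging shift: a reading of 103, well outside the normal band
# (roughly 4 standard deviations from the baseline mean).
reading = 103

# A threshold tuned only on years of calm data can be set so high
# that the genuine shift is dismissed as noise ...
missed = zscore_alarm(baseline, reading, threshold=10)   # False: no alarm

# ... while a threshold stress-tested against imagined disruptions
# (a pre-mortem for the detector) catches it.
caught = zscore_alarm(baseline, reading, threshold=3)    # True: alarm
```

The bias lives in how `threshold` is chosen: calibrating it solely to keep false alarms low on historical data is the statistical equivalent of 'it has never happened before, so it won't.'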

FREE FIELD ZINE

10 glitches quietly running your life.

A free field-zine PDF — ten cognitive glitches named, illustrated, with a defense move for each. Plus the weekly Glitch Report on Fridays — one bias named, two spotted in the wild, one defense move. Unsubscribe any time.

LAUNCH PRICE

Train against your blindspots.

50 cards are free to preview. Buyers unlock the rest of the deck plus the interactive training: unlimited Spot-the-Bias Quiz, Swipe Deck with spaced repetition, My Blindspots, Decision Pre-Flight, the Printable Deck + Cheat Sheets, and the Field Guide e-book. $29.50 (regularly $59).

Unlock the full deck

Everything below — yours forever. Pay once, use across every device.

Half-off launch — limited to the first 100 readers. Auto-applied at checkout.
$29.50 (regularly $59)
one-time payment · lifetime access
  • All interactive digital cards — search, filter, flip, shuffle on any device
  • Five training modes — Spot-the-Bias Quiz, Swipe Deck, Pre-Flight, Blindspots, Journal
  • Curated Lenses + Decision Templates + Defense Playbook
  • Printable Deck PDFs + Field Guide e-book + Cheat Sheets + Anki Export
  • Every future improvement, included
Unlock  $29.50

30-day refund · no questions asked
