In what seems like a cruel twist of nature, we humans are wired to frequently deceive ourselves. Although there are reasons for this, which I’ll discuss later in this article, research in psychology, neuroscience, and even anthropology has identified dozens of ways in which our assessment of reality and our decision making are flawed, often seriously so. The reality is that most of us, at any given moment, are walking around wrong about something. These errors arise from many categories of processing, such as perception and attention, emotion and mood, memory, identity, values, fear, cognitive distortions (mistakes in thinking), social pressure, stress, language, psychoactive substances, and neurochemistry, and they lead most of us to reach incorrect conclusions, and to make decisions based on those conclusions, on a fairly regular basis. And because of a related vulnerability, we often don’t recognize that we are wrong: we attribute things that don’t “go right” to other factors rather than to our own faulty processing. In fact, some of our biases are so strong that we actually see contradictory evidence as supporting our incorrect positions!
It is also true that while everyone is susceptible to mistaken perceptions of reality and the flawed decision-making that follows, not everyone is equally susceptible. Some of us are, in fact, more wrong more often. Although this is a dicey topic, we know empirically that a number of factors or domains, such as cognitive style, identity, emotional needs, social environment, and mental health, can conspire to increase or decrease an individual’s capacity for careful, objective analysis and his or her vulnerability to being misled. For example, although one does not need extensive formal education to have common sense, formal education typically provides things like a more expansive vocabulary, media literacy, and background knowledge that support more effective analysis and decision making.
Relatedly, things like a fundamentalist view on a topic or a strong, identity- or morality-based political affiliation tend to filter out information that might contradict (and threaten) one’s view and/or affiliation, even if that view is empirically wrong. Moreover, certain personality types or mental health issues, such as narcissism and paranoia, by definition facilitate circular thinking and conclusions that are compelling to the person but not supported by objective evidence. Even things as simple as discomfort with ambiguity or conflict avoidance tend to increase the likelihood that a given individual will gravitate toward “facts” and conclusions that are not empirically supported but feel more comfortable. When this is combined with charismatic leadership that speaks from a place of high confidence, folks who are already susceptible to believing things that are not empirically supported become even more vulnerable to being misled and to misleading themselves. I will discuss this in more detail later in this article, but the short version is that humans prefer comfort, stability, and predictability over accurate (epistemically derived) perception. Also, importantly, highly “intelligent” people are susceptible to inaccurate perceptions and conclusions as well, particularly if they tie their identity to being “right” and/or use their abilities with language and debate to construct sophisticated, complicated rationales that are difficult to deconstruct and challenge but may be based on little empirical evidence.
So, before we launch into the dozens of ways that we humans deceive ourselves, what combination of factors tends to support more objectively accurate perception and subsequent decision making?
Across domains, the strongest protective factors represent some version of:
Intellectual humility + curiosity + willing exposure to alternative information and disagreement
The individuals most likely to be “correct” are intelligent people who also:
- Are comfortable with or even enjoy being proven wrong
- Have good capacity for evaluating misinformation
- Do not tie their identity or validity to being right
- Seek disconfirming evidence
- Separate beliefs from objective reality
Of course, a fundamental and, at some level, unresolvable element of this entire discussion is the notion of truth and objectivity. Do they even exist in an empirical way? I would argue yes, and…
The Nature of Truth
Most people would agree that things that are measurable can facilitate some level of objective truth. For example, if three calibrated thermometers measure an average temperature of 20.5 degrees Celsius, most people would agree that the temperature is very, very close to 20.5 degrees Celsius, particularly if nothing significant is riding on the interpretation of the temperature. Then there is truth related to the implications of the evidence we are evaluating. For example, if you are a partisan fan, your interpretation of replay evidence of whether or not a football player got both feet down in bounds, and thus made a critical first-down catch, may vary widely depending on which team is “your” team, regardless of how “clear” the video evidence is. Things get even more complicated with “judgment” calls such as roughing the passer or pass interference. If we apply this same paradigm to things such as presidential elections or the behavior of law enforcement, we find that humans not only have different perceptions of truth but, oftentimes, polemically opposed perceptions of truth. In such cases, can one version of truth be “more” objectively true? Yes, but that doesn’t mean that objective truth is more compelling than a person’s deeply held, but wrong, perceptions. And if a person’s perception of the truth is also tied to identity and acceptance in a broader social group, then there may be no “evidence” that can alter that perception.
A Long and Partial List of Things That Conspire to Deceive Us
The list below includes over three dozen ways that we humans deceive ourselves, which fall into a few very high-level domains such as how we process information, how we protect identity and emotion, how memory and attention fail, and how social incentives distort judgment. You can see a detailed taxonomy in Appendix 1. Figure 1 provides some visual context for how these factors can be organized.

Figure 1 – Perceptual Biases
Fundamental Attribution Error
The tendency to overemphasize personality-based explanations for others’ behaviors while underestimating situational factors. When someone cuts you off in traffic, you assume they’re a jerk rather than considering they might be rushing to an emergency. FAE also includes attributing our own mistakes to external factors but other people’s mistakes to their own shortcomings.
Naïve Realism
The belief that we see reality objectively as it truly is, while others who disagree are uninformed, irrational, or biased. We assume our perceptions are accurate and that reasonable people should see things the same way we do.
Confirmation Bias
The tendency to search for, interpret, favor, and recall information that confirms our pre-existing beliefs while giving disproportionately less consideration to alternative possibilities.
Doubt Bias
When we are not confident in our own perspective, i.e., when we doubt ourselves, we are susceptible to adopting someone else’s opinion or perspective even when it is not supported empirically, particularly if we have some affiliation with the other person and he or she projects a high level of confidence.
Intuition Bias
Working backwards from a gut feeling or conclusion to justify it, rather than examining evidence first and then forming a conclusion based on what the evidence shows.
Self-Referential Bias
The tendency to interpret events, information, and others’ behaviors in relation to what they mean for oneself, assuming things are about us or relevant to us when they may not be.
Cognitive Distortion
Systematic patterns of deviation from rational thinking, often involving exaggerated or irrational thought patterns such as all-or-nothing thinking, overgeneralization, or catastrophizing.
Availability Heuristic
Judging the likelihood or frequency of events based on how easily examples come to mind rather than on actual probability. We overestimate the risk of plane crashes because they’re memorable and widely reported.
Affect Heuristic
Making decisions based on emotional reactions rather than logical analysis. If something feels good, we judge it as having more benefits and fewer risks than it actually does.
Dunning–Kruger Effect
A cognitive bias where people with limited knowledge or competence in a domain overestimate their ability, while experts tend to underestimate theirs. Incompetence often prevents people from recognizing their own incompetence.
Hindsight Bias
The “I knew it all along” phenomenon where after an event occurs, we believe we predicted or could have predicted the outcome, making the past seem more predictable than it actually was.
Sunk Cost Fallacy
Continuing to invest time, money, or effort into something because of what we’ve already invested, rather than evaluating whether continued investment makes sense based on future prospects.
Recency Bias
Giving disproportionate weight to recent events or information while discounting earlier data, assuming current trends will continue indefinitely.
Negativity Bias
The tendency to give more psychological weight to negative experiences, information, or emotions than to positive ones. Bad events affect us more strongly than equivalently good events.
Representativeness Heuristic
Judging the probability of something by how much it resembles our mental prototype, often ignoring actual statistical probabilities. Assuming someone is a librarian because they’re quiet and like books, despite librarians being statistically rare.
Motivated Reasoning
Processing information in a way that suits our goals or desires, unconsciously applying different standards of evidence depending on whether we want to believe a conclusion.
False Memory
Remembering events that didn’t happen or remembering them differently from how they actually occurred, often with high confidence in the inaccurate memory.
Predictive Processing/Narrative Bias (Making Meaning at the Expense of Being Right)
The brain’s tendency to prioritize creating coherent narratives and patterns over accuracy, filling in gaps and constructing explanations even when evidence is incomplete or contradictory.
Patternicity
The belief or expectation that what is actually random follows patterns, e.g., “bad things happen in threes.”
Mood State
Current emotional conditions that color our perceptions, memories, and judgments. When depressed, we more easily recall negative memories; when happy, we interpret ambiguous situations more positively.
Anchoring Bias
Over-relying on the first piece of information encountered (the “anchor”) when making decisions. If a shirt is marked down from $200 to $100, we perceive it as a great deal even if it’s only worth $50.
Identity-Protective Bias
Rejecting information or evidence that threatens our sense of self or group identity, even when the information is accurate, because accepting it would require uncomfortable changes to how we see ourselves.
Self-Interest Bias
The tendency to interpret situations in ways that favor our own interests, often unconsciously, while believing we’re being objective and fair.
Cognitive Dissonance Reduction
The mental discomfort of holding contradictory beliefs drives us to change our attitudes, beliefs, or behaviors to reduce the inconsistency, often by rationalizing away contradictions rather than changing core beliefs.
Inattentional Blindness (Bias)
Failing to notice unexpected stimuli in plain sight when our attention is focused elsewhere. In the famous study, people counting basketball passes completely miss a person in a gorilla suit walking through the scene.
Source Monitoring Errors
Confusion about where information came from—whether we read it, imagined it, heard it from someone, or dreamed it. We might remember a fact but misattribute its source or reliability.
Exaggeration Bias
Amplifying or distorting certain aspects of information to make an argument or viewpoint seem more compelling, often unconsciously stretching facts to fit the narrative we’re promoting.
Pluralistic Bias
Believing that our private attitudes and behaviors are different from others’ when they’re actually similar, or believing others hold different views than they actually do, leading to misperceptions about social norms.
Social Desirability Bias
The tendency to present ourselves in a favorable light and give answers or behave in ways that will be viewed positively by others, rather than responding honestly.
Status Quo Bias
Preferring things to stay the same or stick with previous decisions, even when change would be beneficial. The current state is seen as a baseline, and alternatives are judged by how they deviate from it.
Fear Bias
Allowing fear to disproportionately influence decision-making and risk assessment, often leading to overestimation of threats and overly cautious choices that may not serve our best interests.
Loss Aversion
The tendency to prefer avoiding losses over acquiring equivalent gains. Losing $100 feels worse psychologically than gaining $100 feels good, making us overly risk-averse.
Confidence Bias
Excessive certainty in our beliefs, judgments, or abilities that isn’t justified by our actual accuracy or knowledge. Being more confident than our competence warrants.
Dopamine Salience Errors
Misattributing importance or meaning to stimuli based on dopamine responses that signal reward or novelty, causing us to focus on or pursue things that trigger these responses rather than things that are genuinely important.
Framing Effect
Drawing different conclusions from the same information depending on how it’s presented. A medical treatment with a “90% survival rate” seems more appealing than one with a “10% mortality rate,” though they’re identical.
Metaphor Lock-in
Becoming trapped by the implications of a metaphor used to understand a concept, limiting our thinking to what the metaphor suggests while ignoring aspects of reality the metaphor doesn’t capture.
Complex Reality—Simplified Reasoning
The tendency to apply simple, linear cause-and-effect thinking to complex systems with multiple interacting variables, feedback loops, and emergent properties, leading to oversimplified conclusions.
Illusion of Control
Overestimating our ability to influence outcomes, especially in situations that are determined by chance. Believing we can control random events through rituals, strategies, or sheer willpower.
—
Why would deceiving ourselves be so prevalent? What’s the upside?
Most of these distortions aren’t “bugs.” They’re features optimized for:
- Speed over accuracy
- Social cohesion over truth
- Assuming danger over assuming safety
- Survival over objectivity
- Meaning over randomness
- Comfort over dissonance
- Simplicity over complexity
Humans evolved over hundreds of thousands of years (and millions of years of primate evolution) in a world far less complex than the one that exists today. What had to be analyzed and understood involved far fewer variables and far fewer stimuli than our lives present today. In that less complex environment, there was a survival advantage, across populations, to things such as speed over accuracy and assuming danger over assuming safety. In our contemporary environment, these same processing and behavioral tendencies frequently produce distortions that may limit cognitive dissonance but are not advantageous in other ways.
A Quick Note on Perceptual/Cognitive Distortions and Politics
Politics uniquely combines:
- Identity threat
- Moral judgment
- Group belonging (tribalism)
- Fear and loss
- High complexity with simplistic messaging
That’s basically a perfect storm for cognitive distortion. You can see a detailed explanation of how our susceptibility to deception manifests in our politics in Appendix 2.
Summary
Humans are inherently prone to self-deception through dozens of cognitive biases and processing errors spanning perception, memory, emotion, identity, and social influences. While everyone is vulnerable, susceptibility varies based on factors like education, cognitive style, personality traits, mental health, and tolerance for ambiguity. People with fundamentalist views, strong identity-based affiliations, or certain personality disorders are particularly vulnerable to circular thinking and rejecting contradictory evidence.
Key Protective Factors Against Self-Deception
- Intellectual humility + curiosity + willingness to engage with opposing views
- Comfort with being proven wrong
- Separating identity from being right
- Actively seeking disconfirming evidence
Major Categories of Perceptual/Cognitive Biases
- Information processing flaws: confirmation bias, availability heuristic, anchoring bias
- Identity/emotion protection: identity-protective bias, cognitive dissonance reduction, motivated reasoning
- Memory/attention failures: false memory, inattentional blindness, source monitoring errors
- Social distortions: social desirability bias, pluralistic bias, fundamental attribution error
Why Self-Deception Persists
These aren’t flaws but evolutionary features optimized for speed over accuracy, social cohesion over truth, survival over objectivity, simplicity over complexity, and comfort over dissonance. Human brains evolved to handle far simpler environments than today’s world, making us ill-equipped for modern complexity while still relying on these ancient shortcuts.
Appendix 1 – Taxonomy of Perception Bias Mechanisms
1. Attribution, Perspective & Social Interpretation Biases
Errors in how we explain causes or interpret others’ behavior.
- Fundamental Attribution Error — Over-attributing behavior to character instead of situation
- Naïve Realism — Believing we see reality objectively while others are biased
- Pluralistic Bias — Mistakenly assuming others privately disagree with their public behavior
- Illusion of Control — Overestimating influence over outcomes
- Self-Referential Bias — Over-weighting information related to oneself
2. Evidence Filtering, Belief Defense & Motivated Cognition
Biases that protect prior beliefs, identity, or emotional comfort.
- Confirmation Bias
- Motivated Reasoning
- Identity-Protective Bias
- Self-Interest Bias
- Cognitive Dissonance Reduction
- Finding Evidence for Intuition Rather Than Creating Intuition from Evidence
- Exaggeration to Support a Perspective
- Making Meaning at the Expense of Being Right (Narrative Bias / Predictive Processing)
- Confidence Bias
- Metaphor Lock-in
3. Heuristics & Mental Shortcuts (Fast but Distorting)
Cognitive efficiency tools that trade accuracy for speed.
- Availability Heuristic
- Affect Heuristic
- Representativeness Heuristic
- Anchoring Bias
- Framing Effect
- Recency Bias
- Status Quo Bias
- Sunk Cost Fallacy
- Loss Aversion
- Complex Reality — Simplified Reasoning
4. Emotion, Mood, Threat & Reward Distortions
Biases driven by affective state and motivational neurochemistry.
- Mood State Bias
- Negativity Bias
- Fear Bias
- Dopamine Salience Errors — Mis-tagging importance or meaning
- Affect Heuristic (also fits heuristics)
5. Self-Assessment, Overconfidence & Competence Illusions
Distortions in evaluating one’s own knowledge or ability.
- Dunning–Kruger Effect
- Confidence Bias
- Self-Interest Bias (also motivational)
6. Memory Construction & Retrospective Errors
Biases arising from reconstructive (not playback) memory.
- False Memory
- Source Monitoring Errors
- Hindsight Bias (also listed under Causality & Temporal Reasoning Errors)
7. Attention, Perception & Awareness Failures
Limits in what we notice or encode.
- Inattentional Blindness
- Availability Heuristic (partly attention-driven)
8. Social Signaling & Reputation Management Biases
Distortions caused by group norms or impression management.
- Social Desirability Bias
- Pluralistic Bias
- Exaggeration to Support a Perspective
9. Cognitive & Narrative Distortions (Clinical / Broad Pattern Level)
Broader thinking styles that systematically skew interpretation.
- Cognitive Distortion (umbrella category)
- Doubt Bias (over-weighting uncertainty or undermining confidence)
- Making Meaning at the Expense of Being Right
- Complex Reality — Simplified Reasoning
10. Causality & Temporal Reasoning Errors
Misjudging cause-effect or time-based inference.
- Hindsight Bias
- Illusion of Control
- Sunk Cost Fallacy
Summary Map of Core Mechanisms
| Mechanism | Core Function |
| --- | --- |
| Belief defense | Protect identity & worldview |
| Heuristics | Save cognitive effort |
| Emotion regulation | Reduce fear, distress, or uncertainty |
| Social signaling | Maintain belonging & status |
| Memory reconstruction | Create coherent personal narrative |
| Attention limits | Reduce processing load |
| Narrative coherence | Make reality feel meaningful |
Appendix 2 – Perceptual Distortions and Politics
Politics uniquely combines:
- Identity threat
- Moral judgment
- Group belonging (tribalism)
- Fear and loss
- High complexity with simplistic messaging
That’s basically a perfect storm for cognitive distortion.
1. Identity-Protective Cognition
Core bias: Beliefs defend identity, not truth.
Politics
- Party affiliation becomes a social identity (not a policy preference).
- Evidence threatening “my side” is dismissed as biased or malicious.
Effect
- Facts don’t persuade; they polarize.
- Corrections can even increase certainty in false beliefs (the backfire effect).
2. Motivated Reasoning
Core bias: Reasoning aimed at a preferred conclusion.
Politics
- Same behavior judged differently depending on who does it.
- Scandals are minimized or maximized based on alignment.
Effect
- Moral standards become elastic.
- Hypocrisy feels invisible from the inside.
3. Naïve Realism (Central in politics)
Core bias: “I see reality objectively; those who disagree are ignorant or evil.”
Politics
- Opponents are seen as stupid, brainwashed, or malicious.
- Compromise feels like surrender to irrationality.
Effect
- Dehumanization
- Zero-sum thinking
4. Confirmation Bias + Media Ecosystems
Core bias: Seek, trust, and remember confirmatory information.
Politics
- Algorithmic feeds create epistemic bubbles (echo chambers).
- Different groups live in different “realities.”
Effect
- No shared factual baseline.
- Political disagreement becomes ontological, not ideological.
5. Availability Heuristic
Core bias: Salient examples feel common.
Politics
- Rare events dominate discourse.
- Singular anecdotes override statistical reality.
Effect
- Policy driven by fear, not prevalence.
- Emotional resonance beats actuarial truth.
6. Fear Bias & Threat Prioritization
Core bias: The brain overweights threats.
Politics
- Out-groups framed as existential dangers.
- “Emergency” rhetoric becomes constant.
Effect
- Authoritarian policies feel justified.
- Civil liberties become negotiable.
7. Loss Aversion
Core bias: Losses hurt more than gains help.
Politics
- Demographic or cultural change experienced as zero-sum loss.
- Policies framed as “taking away” provoke outrage.
Effect
- Reactionary movements gain momentum.
- Status preservation beats innovation.
8. Status-Quo Bias
Core bias: Existing systems feel legitimate because they exist.
Politics
- Structural inequities seen as natural or earned.
- Radical reform feels dangerous regardless of evidence.
Effect
- Slow response to real crises.
- Incrementalism even when systems are failing.
9. System Justification Bias
Core bias: People defend and justify existing systems, even at significant cost to themselves or others.
Politics
- Voters rationalize inequality as merit-based.
- “That’s just how things work” thinking.
Effect
- Collective action is dampened.
- Exploitation feels inevitable.
10. Illusion of Explanatory Depth
Core bias: We think we understand complex systems.
Politics
- Strong opinions on policies few could actually explain.
- Slogans replace mechanisms.
Effect
- Overconfidence in simplistic solutions.
- Policy debate becomes symbolic/ideological rather than technical.
11. Dunning–Kruger Effect
Core bias: Low knowledge but high confidence.
Politics
- Loud certainty dominates discourse.
- Experts perceived as elitist or untrustworthy.
Effect
- Expertise loses authority.
- Performative confidence wins elections.
12. Narrative Fallacy
Core bias: Humans prefer stories to epistemic data.
Politics
- Single villains or heroes blamed for structural problems.
- Complex causal chains reduced to morality plays.
Effect
- Misdiagnosis of problems.
- Cycles repeat because root causes aren’t addressed.
13. Framing Effects
Core bias: Wording changes judgment.
Politics
- “Tax relief” vs. “public investment.”
- “Undocumented” vs. “illegal.”
Effect
- Language becomes a primary battleground.
- Policy outcomes hinge on semantics.
14. Pluralistic Ignorance
Core bias: People misjudge group norms.
Politics
- People privately doubt extreme views but assume others endorse them.
- Silence amplifies perceived consensus.
Effect
- Extremes appear mainstream.
- Moderates disengage.
15. Dopamine & Outrage Cycles
Core mechanism: Novelty + anger = salience and dopamine high.
Politics
- Outrage-driven content spreads faster than nuance.
- Platforms reward emotional intensity.
Effect
- Escalation spiral.
- Politics becomes addictive rather than deliberative.
One quietly disturbing insight is that most political disagreement is not about values; it’s about:
- Which facts feel credible
- Which threats feel real
- Which identities feel under siege
Reason alone can’t address that framework. In fact, reasoned argument is typically rejected outright via several of the distortions identified above.
Appendix 3 – Biases Are Universal Across Left and Right
Their expression is asymmetric because:
- The left and right differ in core moral priorities
- They differ in threat sensitivity
- They differ in epistemic norms
- They differ in power/status position at a given moment
So asymmetry ≠ superiority; it means different failure modes.
1. Threat Sensitivity & Negativity Bias
Strong asymmetry (well-replicated)
Right (stronger)
- Higher physiological reactivity to threat, disgust, and norm violation
- Greater attentional bias toward danger, crime, outsiders
Left (weaker on physical threat, stronger on moral harm)
- Less sensitive to physical threat
- More sensitive to perceived harm, injustice, or exclusion
Political effects
- Right mobilizes around security, borders, order
- Left mobilizes around harm prevention, rights, equity
Failure modes
- Right: exaggerated threat perception → authoritarian drift
- Left: underestimating genuine security risks
2. Disgust Sensitivity & Moralization
Strong asymmetry
Right
- Higher disgust sensitivity (especially purity violations)
- Moral weight on sanctity, tradition, sexual norms
Left
- Lower disgust sensitivity
- Moralization shifts toward psychological and social harm
Failure modes
- Right: moral panic over norm-breaking
- Left: dismissal of others’ moral intuitions as mere bigotry
3. System Justification Bias
Contextual asymmetry (depends who holds power)
When right holds power
- Conservatives defend existing hierarchies as legitimate
- “That’s just how the system works”
When left holds institutional influence
- Progressive norms become moralized and enforced
- Dissent framed as harmful, not merely incorrect
Failure modes
- Right: rationalizing inequality
- Left: suppressing heterodox views via moral pressure
4. Epistemic Style (How knowledge is validated)
Clear asymmetry
Right
- More reliance on in-group trusted authorities, intuition, and tradition
- Skepticism of credentialed expertise
Left
- More reliance on institutional expertise, scientific consensus, and credentialed authority
Failure modes
- Right: vulnerability to charismatic misinformation
- Left: overconfidence in institutions, blind spots when institutions fail
5. Confirmation Bias & Media Sorting
Symmetric mechanism, asymmetric content
Right
- Fewer mainstream media outlets trusted
- More centralized alternative ecosystems
Left
- Wider media ecosystem but stronger norm enforcement within it
Failure modes
- Right: echo chambers with high misinformation density
- Left: illusion of consensus and moral certainty
6. Motivated Reasoning
Symmetric process, asymmetric triggers
Right
- Motivated to defend:
- National identity
- Hierarchy
- Tradition
Left
- Motivated to defend:
- Equality
- Marginalized groups
- Moral progress narratives
Failure modes
- Right: excusing abuses by authority figures
- Left: excusing excesses if framed as justice
7. Overconfidence vs. Moral Certainty
Asymmetric expression
Right
- Higher expressed certainty
- Confidence untethered from complexity
Left
- Lower expressed certainty but higher moral certainty (“this is beyond debate”)
Failure modes
- Right: factual errors defended stubbornly
- Left: shutting down debate via moralization
8. Illusion of Explanatory Depth
Different targets
Right
- Overconfidence in simple causal stories
- Underestimation of systemic complexity
Left
- Overconfidence in systemic explanations
- Underestimation of tradeoffs and second-order effects
Failure modes
- Right: policy naïveté
- Left: policy overreach
9. Narrative Fallacy
Different archetypes
Right narratives
- Decline from a lost golden age
- Threatened in-group
- Moral decay
Left narratives
- Linear moral progress
- Villainous systems
- Inevitable justice
Failure modes
- Right: nostalgia distortions
- Left: historical inevitability blindness
10. Pluralistic Ignorance
Different silence dynamics
Right
- Underestimate how unpopular extreme views are
- Loud minority appears larger
Left
- Moderates self-censor due to moral policing
- Silence interpreted as agreement
Failure modes
- Right: normalization of extremism
- Left: hollow consensus
The Big Asymmetry People Miss
Right-wing errors skew toward false positives
(seeing threats that aren’t there)
Left-wing errors skew toward false negatives
(missing risks, tradeoffs, or unintended harm)
This maps cleanly onto:
- Evolutionary threat detection
- Moral foundations theory
- Predictive processing models
One Uncomfortable Truth for Each Side
For the Right
Fear feels like realism, but it often isn’t.
For the Left
Moral certainty feels like progress, but it often blocks correction.
Both are convinced they’re the adults in the room.
A Final Note
Although all humans are vulnerable to self-deception, social science research suggests that folks aligned with political extremes, on both the right and the left, are more likely than mainstream individuals to experience perceptual distortions and to reach conclusions that are not empirically supported, because of:
- Higher levels of misinformation in their social and news media ecosystems (echo chambers)
- Greater threat perception (fear of change, of “the other,” and of “hidden” forces)
- More intense group identity (group acceptance/belonging is more important than accuracy)
- More amenable to “the ends justify the means,” which obviates the need to follow norms and laws or to avoid collateral damage
- A need for “cognitive closure,” which results in dismissing information that challenges preconceptions
- Susceptibility to patternicity (seeing random events as evidence of a preconception)
- Seeing complexity as a threat and gravitating to simplicity even when it cannot explain reality (this is more common on the extreme right)
- Moral conviction as a self-justifying force (also more common on the extreme right)

