Inspiration
Sparked by Erin Meryl’s video “this is how to improve your critical thinking according to neuroscience - THE ONLY VIDEO THAT WORKS”, specifically its framing that critical thinking is not vague skepticism but a specific set of prefrontal cortex functions that can be deliberately trained.
Observations
What do you notice about your own critical thinking? When do you find yourself slipping into System 1 rationalization instead of genuine evaluation?
Overview
Critical thinking is not skepticism for its own sake — it is a set of executive functions governed by the prefrontal cortex: identifying logical structure, evaluating evidence, detecting inconsistency, and updating beliefs. It requires System 2 thinking: slow, effortful, deliberate. The brain resists it because it is metabolically expensive and threatening to identity.
Two prerequisites must be in place before any technique helps:
- Sleep: the PFC is the most sleep-vulnerable brain region
- Low stress: cortisol suppresses PFC and upregulates the amygdala — high-stakes emotional arguments never resolve because both parties are in threat-detection mode, not evaluation mode
Key Concepts
System 1 vs System 2
System 1 is fast, automatic, and the default. Most of what people call “thinking” is System 1 finding justifications for conclusions it already reached. Critical thinking requires deliberately switching on System 2, which is slower and resource-intensive.
The 6 Techniques (Erin Meryl)
- Steelmanning — argue the strongest version of the opposing view; engages theory-of-mind networks (medial PFC + temporoparietal junction)
- Pre-mortem analysis — assume failure, work backwards to identify causes; counters overconfidence bias
- Bayesian confidence quantification — replace vague qualifiers with numerical confidence levels (e.g. “I’m 70% sure”); forces honest self-assessment
- Seek disproving evidence — actively feed yourself the opposing side; what superforecasters do
- Go analog — write or speak arguments out loud; forces slowing down and prevents extremist drift
- What would change my mind? — if the answer is “nothing,” it’s a conviction, not a belief; very few things should be immune to reason
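The Bayesian confidence quantification step can be made concrete by logging predictions and scoring them afterwards. A minimal sketch of what that tracking could look like — the prediction data here is invented for illustration, and the Brier score is a standard scoring rule, not something the video prescribes:

```python
from collections import defaultdict

# Each entry: (stated confidence that a claim is true, whether it turned out true).
# Illustrative data only.
predictions = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False), (0.7, True),
    (0.5, False), (0.5, True),
]

# Brier score: mean squared error between confidence and outcome.
# Lower is better; always guessing 50% scores 0.25.
brier = sum((c - (1.0 if hit else 0.0)) ** 2 for c, hit in predictions) / len(predictions)

# Calibration per confidence level: when you say "70% sure",
# you should be right roughly 70% of the time.
buckets = defaultdict(list)
for c, hit in predictions:
    buckets[c].append(hit)

for c in sorted(buckets):
    hits = buckets[c]
    print(f"stated {c:.0%} -> actually right {sum(hits) / len(hits):.0%} (n={len(hits)})")

print(f"Brier score: {brier:.3f}")
```

The point of the exercise is the feedback loop: vague qualifiers (“probably”, “I think”) can never be scored, but a number can be checked against outcomes and will expose systematic overconfidence.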
Superforecasters
Forecasting research (Philip Tetlock’s Good Judgment Project) finds that the most accurate forecasters are distinguished by their willingness to actively seek out evidence that challenges their existing views — the opposite of confirmation bias.
Synthesis
Critical thinking is trainable as a habit, not a trait. The techniques above work by counteracting specific failure modes: steelmanning counters strawmanning and shallow engagement; pre-mortem counters optimism bias; Bayesian quantification counters vague subjectivity; seeking disproving evidence counters confirmation bias; going analog counters extremist thinking spirals; “what would change my mind” counters conviction entrenchment.
The deeper insight is that critical thinking is a social skill as much as a cognitive one — heated arguments fail because both sides have abandoned it. When both parties are actually thinking critically, the tone of a disagreement shifts fundamentally.
In an AI era where plausible-sounding content is infinite, the ability to evaluate rather than just consume is a core survival skill.
Contradictions / Open Questions
- The backfire effect (the claim that corrections can strengthen the mistaken belief they target) has been partially disputed in recent replication studies — how robust is it really? Worth reading the original Nyhan & Reifler research.
- If sleep and stress are prerequisites, what’s the minimum viable threshold? Is partial sleep deprivation enough to meaningfully impair PFC function?
- Is Bayesian confidence quantification actually calibrating, or does it just feel more rigorous without improving accuracy?