
Eliezer Yudkowsky
Founder of MIRI; the original AI-extinction pessimist
Research fellow who spent two decades arguing that default paths to superintelligence kill everyone, and that the only sane response is an unconditional international halt to frontier training. His 2023 TIME op-ed shifted 'shut it down' from a fringe position into the public debate.
Profile
expertise
Deep technical
Sustained peer-reviewed contribution to ML, alignment, interpretability, or safety techniques. Could review a frontier paper.
Founded MIRI; originated or popularised much of the agent-foundations alignment vocabulary (orthogonality, instrumental convergence, mesa-optimisation framing). The Sequences and HPMOR are widely read foundational texts in the rationalist/safety community. Not a frontier ML researcher, but technically deep on alignment theory.
recognition
Household name
Name recognition outside the AI/CS community: featured in the mainstream press, a Wikipedia page in many languages, a published bestseller, or a position the lay public knows.
TIME op-ed (March 2023) calling for an indefinite halt to frontier training. Appearances on 60 Minutes and in NYT profiles. Name recognised well beyond the AI community.
vintage
Symbolic era
Career started in the GOFAI / expert-systems / early-rationalist period. Vinge's 1993 Singularity, MIRI founded 2000, Bostrom and Yudkowsky writing.
Founded the Singularity Institute (later MIRI) in 2000; wrote the Sequences in 2006–2009. His framing predates deep learning, and he engages it from a 2000s rationalist vantage.
Hand-classified. See the board for the criteria and the full grid.
p(doom)
>95% (2023)
Definition used: Probability that AI wipes out humanity; Yudkowsky has repeatedly said >95%, sometimes framed as 99%.
Source: PauseAI aggregated p(doom) list.
Strategy positions
Pause: endorses
Halt frontier training until alignment catches up
Wants an unconditional moratorium on frontier training, enforced internationally, with explicit willingness to destroy rogue data centres by airstrike.
“The most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die.”
“Shut it all down. Shut down all the large GPU clusters. Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system.”
“I think that humanity is on track to be killed.”
Context: Three-plus-hour interview on the Lex Fridman Podcast #368.
Closest strategy neighbours
by Jaccard overlap
Other people whose strategy tags overlap with Eliezer Yudkowsky's. Overlap is on tag identity, not stance; opposites can show up if they reference the same tags.
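To make the matching rule concrete, here is a minimal Python sketch of Jaccard overlap between strategy-tag sets. The tag names and example people below are placeholders for illustration, not the site's actual data or implementation.

    # Minimal sketch: Jaccard overlap between strategy-tag sets.
    # All tags and people here are invented placeholders.
    def jaccard(a: set, b: set) -> float:
        """Intersection size divided by union size; 0.0 if both sets are empty."""
        return len(a & b) / len(a | b) if (a or b) else 0.0

    people = {
        "Eliezer Yudkowsky": {"pause", "international-treaty", "compute-limits"},
        "Example Person A": {"pause", "compute-limits", "evals"},
        "Example Person B": {"open-source", "evals"},
    }

    target = people["Eliezer Yudkowsky"]
    neighbours = sorted(
        ((name, jaccard(target, tags))
         for name, tags in people.items()
         if name != "Eliezer Yudkowsky"),
        key=lambda pair: pair[1],
        reverse=True,
    )
    # Overlap is on tag identity only, so someone who opposes a pause but
    # carries the same tags still ranks as a close neighbour.
    print(neighbours)  # [('Example Person A', 0.5), ('Example Person B', 0.0)]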
Record last updated 2026-04-24.