Today I'm sharing my interview on Robert Wright's Nonzero Podcast, where we unpack Eliezer Yudkowsky's AI doom arguments from his bestselling book, "If Anyone Builds It, Everyone Dies." Bob is an exceptionally thoughtful interviewer who asks sharp questions and pushes me to defend the Yudkowskian position, leading to a rich exploration of the AI doom perspective. I highly recommend getting a premium subscription to his podcast.

Timestamps:
0:00 Episode Preview
2:43 Being a "Stochastic Parrot" for Eliezer Yudkowsky
5:38 Yudkowsky's Book: "If Anyone Builds It, Everyone Dies"
9:38 AI Has NEVER Been Aligned
12:46 Liron Explains "Intellidynamics"
15:05 Natural Selection Leads to Maladaptive Behaviors → AI Misalignment Foreshadowing
29:02 We Summon AI Without Knowing How to Tame It
32:03 The "First Try" Problem of AI Alignment
37:00 Headroom Above Human Capability
40:37 The PauseAI Movement: The Silent Majority
47:35 Going into Overtime

Get full access to Doom Debates at lironshapira.substack.com/subscribe