https://astralcodexten.substack.com/p/highlights-from-the-comments-on-acemoglu

Eugene Norman writes:

This…

"People have said climate change could cause mass famine and global instability by 2100. But actually, climate change is contributing to hurricanes and wildfires right now! So obviously those alarmists are wrong and nobody needs to worry about future famine and global instability at all."

…isn't a good analogy at all. Because nobody is arguing that climate change now doesn't lead to increased climate change in the future. They are the same thing but accelerated. However, there's no certainty that narrow AI leads to a superintelligence. In fact it won't. There's no becoming self-aware in the algorithms.

I'm against this for two reasons. First, self-awareness is spooky. I honestly have no idea what self-awareness is or what it even potentially could be. I hate having this discussion…