Machine Alignment Monday, 7/24/23

Intelligence explosion arguments don't require Platonism. They just require intelligence to exist in the normal fuzzy way that all concepts exist. First, I'll describe what the normal way concepts exist is. I'll have succeeded if I convince you that claims using the word "intelligence" are coherent and potentially true. Second, I'll argue, based on humans and animals, that these coherent-and-potentially-true things are actually true. Third, I'll argue that so far this has been the most fruitful way to think about AI, and people who try to think about it differently make worse AIs. Finally, I'll argue this is sufficient for ideas of "intelligence explosion" to be coherent.

https://astralcodexten.substack.com/p/were-not-platonists-weve-just-learned