Scott Alexander (Astral Codex Ten)

👤 Speaker
456 total appearances

Appearances Over Time

Podcast Appearances

Astral Codex Ten Podcast
Links For February 2025

The subreddit discusses career planning in a post-GPT world.

L Rudolf L, author of the post on capital/labor in the singularity that I discussed here (link in post), has a proposed history-of-the-future scenario (three links here) tracking what he thinks will happen from now to 2040.

Extremely slow takeoff, assumes alignment will be solved, etc.

I want to challenge some of these assumptions, but I'll wait until a different scenario I'm waiting on gets published.

The part I found most interesting here is Rudolf's suggestion that there will be neither universal unemployment nor UBI, but a sort of vapid jobs program: even after AI can make all decisions without human input, the government passes regulations mandating that humans be "in the loop," using safety as a fig leaf.

And we get a world where everyone works 40-hour weeks attending useless meetings where everyone tells each other what the AIs did, then rubber-stamps it.

Sort of like the longshoremen's hereditary fiefdoms that were in the news last year.

Boaz Barak, a friend of Scott Aaronson's now working on OpenAI's alignment team, has six thoughts on AI safety.

It's all pretty moderate and thoughtful stuff.

What I find interesting about it is that the acknowledgements say Sam Altman provided feedback, although he "does not necessarily endorse any of its views."

I think this is a useful window into OpenAI's current alignment thinking, or at least into the fact that they currently have alignment thinking.

Not much to complain about in terms of specifics, and I'm glad people like Boaz are involved.

If you ask Grok 3 "who is the worst spreader of misinformation?", it will say Elon.

If you ask it who deserves the death penalty, it will say Trump, with Elon close behind.