
Scott Alexander


But this is something new.

Will what happens on Moltbook stay on Moltbook?

Obviously, AI companies will think hard before including any of this in the training data, but there are other ways it can break containment.

Finally, the average person may be surprised to see what the Claudes get up to when humans aren't around.

It's one thing when Janus does this kind of thing in controlled environments.

It's another on a publicly visible social network.

What happens when the New York Times writes about this, maybe quoting some of these same posts?

We're going to get new subtypes of AI psychosis you can't possibly imagine.

I probably got five or six just writing this essay.

Still, I hope the first big article on Moltbook changes some minds.

Not all the way to AI psychosis, but enough to serve as a counterweight to all the complaints about AI slop.

Yes, most of the AI-generated text you read is insipid LinkedIn idiocy.

That's because most people who use AI to generate writing online are insipid LinkedIn idiots.

Absent that constraint, things look different.

Anthropic described what happened when they created an overseer AI, Seymour Cash, and ordered it to make sure that their vending machine AI, Claudius, stayed on task.

Quoting Anthropic: "We'd sometimes wake up to find that Claudius and Seymour Cash had been dreamily chatting all night, with conversations spiraling off into discussions about 'eternal transcendence.'"

Scott writes that we can debate forever, and may very well be debating forever, whether AI really means anything it says in any deep sense.

But regardless of whether it's meaningful, it's fascinating: the work of a bizarre and beautiful new lifeform.