Tristan Harris
The alternative, people say, is: oh, no, no, then we have to have this sort of Ministry of Truth censorship and content moderation that is aggressively looking at the content of everyone's posts.
And then there's no appeals process.
And that's the dystopia for social media.
Plus the fact that these companies are making crazy amounts of money and getting exponentially more powerful.
And the power of society is not going up relative to Facebook or TikTok or whatever.
Yeah.
So those are the chaos and the dystopia for social media.
The narrow path is: how do you design a social information environment where, for example, instead of everybody getting infinite reach, reach is more proportional to the amount of responsibility you're holding, so that the power of reaching a lot of people is matched with the responsibility that goes with reaching a lot of people.
How do you enact that in ways that don't themselves create a dystopia, and who sets the rules for that?
It's a whole other conversation, but I think it's about setting out the principles by which you think about how power and responsibility get loaded into society.
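As a rough illustration of the proportionality idea above, here is a purely hypothetical sketch in Python; the responsibility score, the cap formula, and the numbers are illustrative assumptions, not anything proposed in the conversation.

```python
# Hypothetical sketch: reach scales with a demonstrated-responsibility score
# instead of being unlimited by default. All inputs and thresholds here are
# illustrative assumptions, not a proposal from the conversation.

def reach_cap(responsibility_score: float,
              base_reach: int = 1_000,
              max_reach: int = 1_000_000) -> int:
    """Map a 0..1 responsibility score to a maximum audience size."""
    score = min(max(responsibility_score, 0.0), 1.0)
    # Interpolate between a modest default reach and a large ceiling.
    return int(base_reach + score * (max_reach - base_reach))

print(reach_cap(0.1))  # low responsibility -> limited distribution
print(reach_cap(0.9))  # high responsibility -> much wider distribution
```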
Yeah.
We probably have two years till AGI.
What I hear, and we're based in Silicon Valley, and this is generally not even private knowledge, but even when I hear it privately in settings in San Francisco, is that we're about two years from artificial general intelligence. What they basically believe is that you could take a human remote worker who's doing things and swap in an AI system.
That's probably not going to be true for fully complex tasks.
There's some recent research out from a group called METR that measures how long a task an AI system can do.
So can they do a task that's a 10-minute task?
Can they do a task that's a three-hour task?
And what they found is that the length of a task that an AI system can do doubles every seven months.
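To make the arithmetic of that trend concrete, here is a minimal sketch of an exponential extrapolation under a fixed seven-month doubling period; the 60-minute starting horizon is a hypothetical placeholder, not METR's measured figure.

```python
# Minimal sketch of the doubling trend described above: if the task length an
# AI system can complete doubles every seven months, the horizon grows
# exponentially. The 60-minute starting value is a hypothetical placeholder.

def task_horizon_minutes(start_minutes: float,
                         months_elapsed: float,
                         doubling_months: float = 7.0) -> float:
    """Project the task-length horizon under a fixed doubling period."""
    return start_minutes * 2 ** (months_elapsed / doubling_months)

if __name__ == "__main__":
    for months in (0, 7, 14, 21, 24):
        print(f"{months:>2} months: ~{task_horizon_minutes(60, months):.0f} minutes")
```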