Rob, Luisa, and the 80000 Hours team
One of the failure modes, possibly, is that we do tend to think in these extremes, where it's very hard to think of anything other than either the maximally hard scramble of competition, where all of the surplus is burned away, or a perfect hegemon in which everything is divided and nothing is wasted.
Do you think that there are middle grounds that are stable equilibria long-term?
Or maybe people are correct in thinking, well, actually there's a gravity well away from the middle ground, towards intense maximal competition or towards maximal coordination, because those states just tend to persist.
So I think you helped to organize a conference a couple of weeks back.
The title was something like, "Are there good post-AGI social equilibria?"
Yeah.
It may not be a good idea, but at least there's an idea.
Yeah.
Is there a way of summing up?
So maybe by this point in the conversation, people have some sense, but why is it hard to come up with a good post-AGI equilibrium?
I guess in my mind, there are many different failures, or many different bad directions, that you have to avoid.
And avoiding all of them simultaneously is really quite a difficult challenge to meet.
I guess in my mind, the things that we're trying to navigate between are: a situation in which humans end up having no control quite early, and a situation in which they dominate and treat machines and AIs poorly in the future.
Some people will think it's very bad.
Some people might not think it's such a problem, or might not think that people would actually do it.
But that's a possibility.
Then I guess there's locking in our current, kind of idiosyncratic values and ideas, such that we can't intellectually advance and reflect and realize that some of our ideas are mistaken even by our own lights.