It's another way that things could potentially go poorly.
And then I guess there's, like, you've got to avoid negative-sum competition between people, or just outright violence and conflict, that might lead to a terrible outcome.
And then maybe the trickiest is that you've got to set things up such that no group can foreseeably just continue accumulating power and resources at a faster rate than everyone else, even when we're looking forward hundreds and thousands and tens of thousands of years.
Because if any group is growing in influence somewhat faster than everyone else, then eventually they're just going to end up completely dominating, and they will be able to dictate everything to everyone else.
So you can try to, I guess, set up agreements ahead of time that you really believe people are going to stick with. But it's all, I don't know, it's all just very tricky, and we don't have the technology to do that.
Yeah, you mentioned we're talking about the kind of locust philosophy of just wanting to use up resources as quickly as possible in order to expand, grab more resources, and use them again.
I guess it's like a bit bizarre, but kind of self-consistent.
It's occurring to me that the effective accelerationists, at least some of them, sort of have this perspective that what is good is basically economic growth, or just grabbing resources and turning them into complex stuff that grabs more resources and burns them faster.
Are they basically like, is that the locust philosophy?
So, yeah, they would do that, I guess, regardless of whether it's right to do it or not.
Yeah, I guess I've been hearing a lot more discussion recently of which way this is going to go.
So on the one hand, I guess all of the companies have a reason to not have their AIs be saying that they're conscious and that they need to be liberated because that ruins their business model.
So that seems like an important factor.
I guess on the other hand, future AIs will be more charismatic, more persuasive.
There also will be these AIs that are deliberately designed to have relationships with human beings to be sort of companions, in which case you might want them to say that they have feelings because I guess a relationship is hollow if there's like no conscious experience on the other end of it.