David Duvenaud
Probably it's not a stable situation for the government to allow the AIs to really effectively organize against that government.
So there are a bunch of reasons to expect that governments might have AIs that just do whatever they say, while everyone else has the hobbled civilian versions that aren't actually allowed to be fully aligned to them.
Yeah, exactly.
And the fastest-growing, most growth-oriented institutions in this world, like governments and corporations, are going to have an interest in marginalizing humans to some extent, because humans, from their point of view, will be these meddlesome parasites.
So you can imagine humans advocating: "We legacy beings deserve some huge fraction of GDP," or at least some very expensive protection of our interests, at the expense of maybe some new flourishing society of AIs, or weird AI-human hybrids, or whatever is most memetically and politically fit.
So you can imagine a response by the more growth-oriented, machine parts of society. They might end up making the case that this is a speciesist demand, and that we can't have this sort of narrow-minded policy setting.
And it could be de facto illegal to advocate for speciesist policies or something like that.
Yeah, well, I think what's most likely to happen in real life is some sort of gradual disempowerment. It's already happening in a few small ways, and it might go on for a while, until much of the military is automated, or people have much less connection to the organs of the state.
And then there'll probably be some more classic, fast, runaway loss of power: a coup, or some new weird quasi-cartel government that just somehow takes over in a way we don't really expect.
So I think that weakening our connection to the organs of the state, and to just understanding what's going on, is going to be one of the precursors for some faster loss of control.
The point we wanted to make in the paper was that even if there's no fast loss of control, we still might end up with a similar loss of control just through normal, business-as-usual dynamics.
Sure.
I mean, the basic idea is that, for similar reasons that we would have less control over the government, the government would also have a harder time avoiding a coup by the military if there aren't a bunch of human soldiers in the loop.