David Duvenaud
If you measure moral progress by difference from our current moral standards, then the further back in time you go, the worse things get, roughly monotonically.
And so if you just extrapolate that, it looks like, oh, so if we go forward in time and we continue allowing this sort of moral progress to happen, things are going to get better.
But I think what's actually happening is we're just measuring difference from our current values.
So we should actually expect this line to have a kink exactly at the present day, and things, however moral standards evolve, to continue looking worse and more alien and wrong from our current point of view.
So I think from the point of view of future beings, whoever is alive is going to be glad that the past wasn't able to lock in their different values, but they're basically going to be happy that they're in power.
So in a sense, we don't have to worry about the future beings.
They kind of win no matter what.
The only thing that remains to worry about is locking in our current values in some sense.
And of course, we probably don't want to lock in all these short-term, maybe irrelevant details or local adaptations.
We probably want to lock in, at some point, some larger, more abstract, big-minded view. But I think we do, by definition, want value lock-in, because if you're okay with your values changing, it's not really clear in what sense they're actually values of yours.
I totally agree.
But I guess in terms of what we should do, I think it doesn't end up mattering because even if we're moral realists, the question is, do the AI successors that we build care about that morality?
And I think the default is, no, they care about growth just in the way that animals don't care about morality and evolution doesn't care about morality.
And so if we happen to somehow today care about the true morality, we need to preserve that.
And I don't think the natural course of history is going to preserve it by default.
That makes sense.
And maybe I would, perhaps unfairly, characterize that as a corner case, where it happens that taking our hands off the wheel is the morally best action. That feels like a bit of a happy coincidence, because the civilizations we've already evolved are, for instance, making AIs that maybe don't have rights, and they used to have slavery, and all sorts of oppressive governments have arisen naturally throughout history.