George Church
This is not like COVID-19, where millions of people were actually dying if we delayed the science.
This is something where if there ever is a crisis, it's because we created it.
It's not because we're trying to solve it.
And so I think we need to go very slowly on AGI and ASI and double down on
slightly narrower scientific goals.
And even that, we need to be very cautious about.
We need to have kind of an international consensus on what constitutes safe AI.
I think we'd slow it down.
I think it would eliminate it, because the first thing it would conclude is: biology is not relevant to me, because I'm not made out of biology.
I don't think we have anything close to the assurance that we need that that would be safe.
But let's put safety aside for a moment.
I think it's not only hard to calculate the bads, it's hard to calculate the goods.
So I think it could be a complete game changer.
But on the other hand, it's like if we said we could get instantaneous transport all over the earth, right?
Well, we could say, yes, that could be a game changer, but do we really need it, right?
Is that really important?
Maybe it would be more interesting to just have Zoom calls, and make them better, you know, or just learn how to get everything we want in our kitchen so we don't need to travel anymore, right?
So, you know, be careful what you ask for, right?