Stuart Russell
…said that creating superhuman intelligence is the biggest risk to human existence that there is.
You know, Elon Musk is also on record saying this.
So Dario Amodei estimates up to a 25% risk of extinction.
I mean, they all signed a statement in May of '23.
It's called the extinction statement.
It basically says AGI is an extinction risk at the same level as nuclear war and pandemics.
But I don't think they feel it in their gut.
Imagine that you are one of the nuclear physicists.
I guess you've seen Oppenheimer, right?
So you're there, you're watching that first nuclear explosion.
How would that make you feel about the potential impact of nuclear war on the human race, right?
I think you would probably become a pacifist and say, this weapon is so terrible.
We have got to find a way to keep it under control.
We are not there yet with the people making these decisions and certainly not with the governments, right?
You know, what policymakers do is they listen to experts.
They keep their finger in the wind.
You've got some experts dangling $50 billion checks and saying, oh, all that doomer stuff, it's just fringe nonsense.
Don't worry about it.
Take my $50 billion check.