Stuart Russell
think that there's a significant risk of extinction.
Almost all the leading AI researchers think there's a significant risk of human extinction.
So why is that the fringe, right?
Why isn't that the mainstream?
If these are the leading experts in industry and academia saying this, how could it be the fringe?
So we're trying to change that narrative
to say, no, the people who really understand this stuff are extremely concerned.
What I think is that we should have effective regulation.
It's hard to argue with that, right?
So what does effective mean?
It means that if you comply with the regulation, then the risks are reduced to an acceptable level.
So for example, we ask people who want to operate nuclear plants, right?
We've decided that the risk we're willing to live with is, you know, a one in a million chance per year that the plant is going to have a meltdown.
Any higher than that, you know, it's just not worth it, right?
So you have to be below that.
In some cases we can get down to a one in 10 million chance per year.
So what chance do you think we should be willing to live with for human extinction?
Me?
Yeah.
Right?