Stuart Russell
But the whole notion that you have stage one, stage two, stage three, and here are the criteria for what you have to do to pass a stage one trial, right?
We haven't even thought about what those would be for algorithms.
So, I mean, I think there are things we could do right now with regard to bias, for example.
We have a pretty good technical handle on how to detect algorithms that are propagating bias that exists in data sets, how to de-bias those algorithms, and even what it's going to cost you to do that.
So I think we could start having some standards on that.
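The bias standards mentioned here could be built on simple, auditable metrics. A minimal sketch, with illustrative toy data and a function name of my own choosing, of one widely used metric (demographic parity difference, the gap in positive-outcome rates between groups) that such a standard might cap:

```python
def demographic_parity_difference(predictions, groups):
    """Absolute gap in positive-prediction rates between two groups.

    predictions: list of 0/1 model decisions (e.g. loan approvals)
    groups: parallel list of group labels, exactly two distinct values
    """
    rates = {}
    for g in set(groups):
        members = [p for p, gg in zip(predictions, groups) if gg == g]
        rates[g] = sum(members) / len(members)
    a, b = rates.values()
    return abs(a - b)

# Toy example: group "a" gets a positive decision 75% of the time,
# group "b" only 25% of the time.
preds  = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
gap = demographic_parity_difference(preds, groups)
# gap == 0.5, the kind of disparity a regulatory threshold could flag
```

This is only one of several competing fairness criteria (equalized odds and calibration are others), and which one a standard should mandate is exactly the open policy question being discussed.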
I think there are things to do with impersonation and falsification that we could work on.
A very simple point.
So impersonation is a machine acting as if it was a person.
I can't see a real justification for why we shouldn't insist that machines self-identify as machines.
Where is the social benefit in fooling people into thinking that this is really a person when it isn't?
I don't mind if it uses a human-like voice that's easy to understand, that's fine, but it should just say, "I'm a machine," in some form.
Yeah, I mean, there is actually a law in California that bans impersonation, but only in certain restricted circumstances.
So for the purpose of engaging in a fraudulent transaction and for the purpose of modifying someone's voting behavior.
So those are the circumstances where machines have to self-identify.
But I think arguably it should be in all circumstances.
And then when you talk about deep fakes, we're just at the beginning, but already it's possible to make a movie of anybody saying anything in ways that are pretty hard to detect.