Nilay Patel
And the answer is that Intel and Apple and Qualcomm and Nvidia and AMD and every other chip maker have to prevent it somehow at the hardware level, which seems impossible. The only example I can think of where we have allowed that to happen is that Adobe Photoshop won't allow you to scan and print a dollar bill
which makes sense, like it broadly makes sense that Adobe made that deal with the government. But it's also like, well, that's about as far as you should let that go, right? Like there's a point where you wanna make a parody image of a Biden or a Trump, and you don't want Photoshop saying, hey, are you manipulating a real person's face? Like you're saying, that seems way too far.
So a total ban seems implausible. There are other things you could do at the creation step. OpenAI bans certain prompts that violate its terms of service. Getty won't let you talk about celebrities at all. If you type a celebrity's name or basically any proper noun into the Getty image generator, it just tells you to go away.
There's a lot of conversation about watermarking this stuff and making sure that real images have a watermark that says they're real images and AI images have a watermark that says they're AI images. Do any of those seem promising?
The part where you restrict the prompts. OpenAI restricts the prompts, Getty restricts the prompts. It's pretty easy to get around that, right? The Taylor Swift deep fakes that were floating around on Twitter, they were made in a Microsoft tool and Microsoft just had to get rid of the prompts. Is that just a forever cat and mouse game on the restrict the prompts idea?
That's the creation side. We need to take a quick break. When we come back, we'll get into the harder problem, distribution.
Welcome back. So we've talked about what the companies that make software and hardware can do about the creation of deepfakes. And it seems like the best answer we have right now is adding watermarks to AI-generated content. But now let's talk about the distribution side, which is, I think, where the real problem lies.
If you make a bunch of deepfakes of Donald Trump at your house and you never share them with anyone, what harm have you caused? But if you start telling lies about both presidential candidates and you share them widely on social platforms where they go viral, now you have caused a giant external problem. And so it feels like the pressure to regulate this stuff is going to come back to the platforms.