Adi Robertson
Podcast Appearances
The most promising argument I've heard for these is the idea that you can – and this is an argument that Adobe has made to me – train people to expect a watermark. And so if what you're saying is we want to make it impossible to make these images without a watermark, I think that raises the same problems that we just talked about, which if anyone can make –
a tweaked version of an open-source tool, they can just say, don't put a watermark in. But I think you could potentially get into a situation where you require a watermark, and if something doesn't have one, there are ways that its design, its spread, or people's trust in it are severely hobbled. That's maybe the best argument for it I've heard.
It does seem like the thing about a lot of generative AI tools is that there are just vast, vast numbers of ways to get them to do something, and people are going to find those. Software bugs and zero-day exploits have been a problem on computers for a very long time, and this feels like it kind of falls into that category.
So far, it feels like the consensus is: we're going to label this, and our job is mainly going to be making sure we catch it. There are cases where, say, maybe you get it taken down if you haven't disclosed it and you're a company or you're buying a political ad.
But broadly, the idea seems to be we want to give people information and tell them that this is manipulated and then they can make their own call.
I feel like the incentives for something like the music industry, for things that are basically aesthetic deepfakes, are very different than they are for political manipulated imagery. A lot of the question with YouTube is: okay, you're basically parodying someone in a way that may or may not legally be considered parody.
And we can make a deal where really, all that person wants is to get paid, right? And maybe they want something sufficiently controversial taken down, but if you give them some money, they'll be happy. That's just not really the issue at hand with political generated images. The problem there is around reputation. It's around people who do, at least in theory, care about