
The Last Show with David Cooper

Revenge Porn Image Removal and Google

13 Feb 2026

Transcription

Chapter 1: What is the main topic discussed in this episode?

0.031 - 25.307 David Cooper

For those who know that questioning everything includes questioning this show's existence. The Last Show with David Cooper. If you've ever Googled yourself and you didn't quite like what you found, maybe personal information on the internet that you wish wasn't there, your address, your phone number, maybe images of you.


25.347 - 39.741 David Cooper

Historically, Google wouldn't really take that stuff down unless you did some wacky legal stuff to them. But now it seems like they are allowing users to remove personal information from search results. Let's talk about this here with Carmi Levy, a tech expert


39.721 - 67.737 David Cooper

and analyst. It is time for Technology Time. Carmi, welcome in.

Good to be with you. Thanks so much for having me, David.

I kind of buried the lede: this also goes for explicit pictures. You know, average people, due to an unfortunate set of circumstances like revenge pornography, this kind of thing, can find awful stuff about themselves publicly available on the internet. And Google, which has historically in some cases not taken that stuff down, it looks like they're doing a 180 on this one.


67.717 - 91.075 Carmi Levy

Yeah, finally. I mean, yeah, non-consensual explicit images are rampant online. And in many cases, the victims did not intend for them to be online. They were posted by someone, perhaps a vengeful ex or someone else, a cyber criminal. And so they find when you go searching for your name, suddenly this thing pops up and you realize it's incredibly damaging. And so


91.055 - 95.522 Carmi Levy

They're building it into a feature called Results About You.

Chapter 2: How can Google help remove explicit images from search results?

95.582 - 121.475 Carmi Levy

And whenever you search for an image, if you feel that it qualifies as that, you can ask Google to remove it from its index, so that it will not show up in a Google search. Now, the fine line here is that the original image will still be online. It's just that Google will not find it when you search for it. That's the key difference.


121.495 - 134.528 Carmi Levy

You still have to go to the original website, the original platform, to have it removed. But at least it's a lot harder for somebody to find. It does a lot to protect you by making it far less likely that other people will see this content.


134.643 - 157.225 David Cooper

There was an irony, a perverse irony, that existed in the U.S. at least. There's something called the Digital Millennium Copyright Act, and it basically means that, as a musician, if my song ends up on YouTube, I can ask YouTube to take it down because it's my copyrighted creative work. So say I was an adult entertainer, let's say I had an OnlyFans or something, and I posted explicit photos of myself.


157.205 - 172.706 David Cooper

That's a copyrightable creative work. So if I was a professional adult entertainer, I could say, hey, Google, take down these copyrighted images and Google would do it. But if I was just like a private person whose photos ended up on the Internet, it would be so difficult to take down.


173.207 - 185.203 David Cooper

To me, that's so wild that someone who does it professionally, who I guess wouldn't care if the world saw it, they could take it down. But me as a private person couldn't before. At least now there's a little bit of fairness there.

185.183 - 200.157 Carmi Levy

Yeah, it shows just how much the internet has evolved since these original tools, since the DMCA, which I believe was enacted something like 20 years ago, first became a thing. And non-consensual imagery has become epidemic online.

200.197 - 215.673 Carmi Levy

And so, you know, I think Google is recognizing that a lot of the tools that are used to support DMCA takedown requests can also be redeployed, so to speak, for this kind of activity as well. It's just another layer of protection.

215.693 - 238.614 Carmi Levy

They're also going to include a feature for when you tag something: you see a photo, it's non-consensual, it's explicit, you click on the three dots, you flag it to remove it. Then, if someone else searches for it, or if similar imagery shows up in Google's index, you'll be notified.

238.674 - 259.7 Carmi Levy

And so there's automation that also protects you going forward. It's not just that specific image; it's others like it. And so I think it recognizes that the internet's a very dangerous place, search is a very powerful way to victimize other people, and Google is finally stepping up and ensuring that its platform, at the very least, can't be weaponized.
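Google has not disclosed how the similarity matching Levy describes actually works, but automated detection of re-uploaded copies of a flagged image is commonly built on perceptual hashing. A minimal sketch of one such technique, average hashing ("aHash"), assuming tiny grayscale grids stand in for real decoded images:

```python
# Illustrative sketch only: this is NOT Google's implementation, just the
# general perceptual-hashing idea. Shrink an image to a tiny grayscale grid,
# threshold each pixel against the mean brightness to get a bit string, and
# compare hashes by Hamming distance; near-duplicates land within a small
# distance even after re-encoding or light edits.

def average_hash(pixels):
    """pixels: 2-D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p >= mean else "0" for p in flat)

def hamming(a, b):
    """Number of bit positions where two equal-length hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def looks_similar(h1, h2, max_distance=2):
    """Flag a pair as near-duplicates if their hashes are close enough."""
    return hamming(h1, h2) <= max_distance

original = [[10, 200], [10, 200]]
near_copy = [[12, 198], [11, 201]]   # slight re-encoding noise
unrelated = [[200, 10], [200, 10]]

h0, h1, h2 = (average_hash(m) for m in (original, near_copy, unrelated))
print(looks_similar(h0, h1))  # near-duplicate -> True
print(looks_similar(h0, h2))  # different image -> False
```

Real systems use far larger grids (typically 8x8 or more) and more robust hashes, but the matching step is the same: a flagged image's hash is stored, and new index entries within a small Hamming distance trigger a review or a notification.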

Chapter 3: What measures are being taken to combat non-consensual imagery online?

913.887 - 919.733 David Cooper

Isn't that the great joy of getting in one of these things? You can get in the back and pick your nose, and no driver is going to see you through the rear-view mirror.


919.934 - 924.118 Carmi Levy

But the camera certainly will. Be careful there because we know that information is being stored somewhere.


924.158 - 943.4 David Cooper

We don't know where. Okay, we have just a few moments here. How bad have deepfakes gotten? If I'm just doom-scrolling through a random news site that isn't highly reputable, or in some cases is highly reputable, how likely am I to see a deepfake these days?


943.58 - 959.192 Carmi Levy

Funny you should ask. Very. So there's something that comes from MIT called the AI Incident Database, and what they're warning is that just over the last year alone, AI deepfakes have gotten so much better, and there's growing evidence that they're being used significantly more.


959.172 - 981.059 Carmi Levy

So impersonation for profit, you know: they're deepfaking elected leaders, celebrities, even doctors promoting skin creams, things like that. And people are being bilked out of, you know, not just small amounts of money, but significant amounts of money. And the technology is so good that, you know, you used to be able to say, oh, that's just a terrible video, right?

981.079 - 990.85 Carmi Levy

The lips don't align, you know, the syncing between the audio and the video. It's clear that it's a scam, this is a deepfake. Well, it's getting harder and harder to tell.

990.89 - 1003.027 Carmi Levy

And we're at the point now that, you know, if you see, for example, Elon Musk hawking cryptocurrency, it looks very much like Elon Musk, and a lot of people are being taken, in many cases for hundreds of thousands or even millions of dollars.

1003.348 - 1021.637 David Cooper

Great. That's great news. I'm so happy to hear that. The internet's fantastic. The internet sucks these days, man. I long for the days of dial-up, you know, where you'd wait five minutes for one image to load. I miss my AOL account, I truly do. Carmi Levy. He's a tech analyst. He joins us every week for Technology Time. Carmi, it's been a joy having you on the show.

1021.657 - 1022.579 David Cooper

Thank you so much for being here.
