Nilay Patel
It's a space that's been totally upended by generative AI in a huge variety of ways, with an equally huge number of responses from artists, creatives, and the people who consume all of that art and creative work out in the world.
Now, if you've been listening to Decoder or my other show, The Vergecast, or even just reading The Verge over these past few years, you'll know that we've been talking for years now about how the photos and videos taken by our phones are getting more and more processed and AI-generated.
And now, in 2026, we're in the middle of a full-on reality crisis, as ultra-believable fake and manipulated images and videos flood onto social platforms at scale, without regard for responsibility, norms, or even basic decency.
The White House is sharing AI-manipulated images of people getting arrested and defiantly saying it simply won't stop when asked about it.
We are just totally off the deep end now.
Whenever we cover this stuff, I get the same question from a lot of different parts of our audience.
Why isn't there a system to help people tell the real photos and videos apart from the fake ones?
Some people even propose systems to us.
And as it happens, Jess has actually spent a lot of time covering a few of these systems that exist in the real world.
The most promising is something called C2PA.
And her view is that so far, these systems have been almost entirely failures.
In this episode, we're going to focus on C2PA, since it's the one that has the most momentum.
It's a labeling initiative spearheaded by Adobe, with buy-in from some of the biggest players in the industry, including Meta, Microsoft, and OpenAI.
But C2PA, which is also sometimes referred to as Content Credentials, has some pretty serious flaws.
First, it was designed as more of a photography metadata standard, not an AI detection system.
And second, it's really been only half-heartedly adopted by a handful, but not nearly all, of the players you would need to make it work across the internet ecosystem.
We're at the point now where Adam Mosseri, who runs Instagram, is publicly posting that the default should shift and that you should not trust images or videos the way that you maybe could before.
Think about that for one second.
That's a huge, pivotal shift in how society evaluates photos and videos.
And it's an idea I'm sure we're going to come back to a lot this year.