
Decoder with Nilay Patel

A jury says Meta and Google hurt a kid. What now?

02 Apr 2026

Transcription

Chapter 1: What is the main topic discussed in this episode?

0.689 - 27.422 Unknown

Support for the show comes from MongoDB. If you're a developer stuck fixing bottlenecks, instead of building the next big thing, then you need MongoDB. MongoDB is the flexible, unified platform that gets out of your way. It's ACID compliant, enterprise ready, and built to ship AI apps fast. It's trusted by so many of the Fortune 500 for a reason. Ask any developer. It's a great freaking database.

27.402 - 35.269 Unknown

Start building at MongoDB.com slash build.

35.705 - 47.504 Nilay Patel

Hello and welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems. Today, we're talking about the landmark social media addiction trials that just resulted in two major verdicts against Meta and Google.

48.005 - 62.067 Nilay Patel

There's one case in New Mexico against Meta, and another in California against both companies, both of which have said they plan to appeal. These are complicated cases with some huge repercussions for how these platforms work and the very nature of speech in America.

62.047 - 87.97 Nilay Patel

So to help us work through it all, I've brought on two heavy hitters, my friend Casey Newton, the founder and editor of the excellent newsletter Platformer and co-host of the Hard Fork podcast, as well as Verge senior policy reporter Lauren Feiner, who's actually in that Los Angeles courtroom where executives like Mark Zuckerberg took the stand in the case of a 20-year-old woman named Kayleigh, who successfully argued that Meta and Google negligently designed their platforms in ways that contributed to her mental health issues.

87.95 - 106.408 Nilay Patel

These cases, the first in a wave of injury lawsuits targeting tech companies, are about the design decisions of platforms like Instagram and YouTube. They argue that the platforms have fundamental design flaws that harm users, especially teenagers, and that these companies knew about these problems and were negligent in shipping these features anyway.

106.428 - 127.462 Nilay Patel

These cases are part of a much larger set of moves that aim to fundamentally change the legal mechanisms that exist that might regulate social media. Now, harm in the context of these cases isn't just addictive design that brings users back compulsively. It's also features like algorithmic recommendations and camera filters that make issues like anxiety, depression, and body dysmorphia worse.

128.263 - 143.927 Nilay Patel

This emphasis on how the platforms work, as opposed to the content, is part of a movement that has been building for years, focused on the argument that social media is not and cannot be healthy. That, in fact, these products might be defective, the same way that cigarettes, when used as designed, cause cancer.

143.907 - 148.812 Nilay Patel

That's a lot of complicated ideas, and Casey and Lauren and I really spent some time working through them.

Chapter 2: What are the landmark verdicts against Meta and Google about?

303.437 - 309.329 Nilay Patel

Casey Newton, founder and editor of Platformer, and I would say forever Silicon Valley editor here at TheVerge.com.

309.65 - 313.538 Casey Newton

I do continue to identify as the Silicon Valley editor of The Verge, so I'm glad you feel the same way.

314.277 - 333.87 Nilay Patel

You can check out, but you can never leave, buddy. Welcome, both of you, to Decoder. I want to talk about these trials that a bunch of social media companies faced in California and New Mexico. Lauren, at a high level, you were in the room for at least the trial in California. I think Snap and TikTok settled that one. They were out. YouTube and Meta just lost a jury verdict.

334.251 - 338.378 Nilay Patel

Describe what happened in those trials and what you saw in the courtroom while you were there.

338.712 - 364.195 Lauren Feiner

At their core, these trials were about the design decisions that social media companies make, how users are going to interact with what comes across their feeds. It was really trying to get around a problem that has been going on with tech for a long time: can you separate design from content on these platforms? That's what these trials were trying to get at.

364.576 - 392.767 Lauren Feiner

And what came out at trial in the courtrooms were a lot of internal documents from these companies. In the LA case, it was Meta and YouTube. And in New Mexico, it was just Meta. And we saw lots of internal documents, lots of former Meta employees turned whistleblowers take the stand to discuss the decisions they made and the things they saw. So that was a lot of what we saw in the courtroom.

392.907 - 400.596 Lauren Feiner

And in L.A., we even saw the head of Instagram, Adam Mosseri, and the CEO of Meta, Mark Zuckerberg, take the stand.

400.998 - 417.657 Nilay Patel

Casey, everyone's calling these bellwether trials. We call them bellwether trials on The Verge. The whole industry has decided this is a word we're going to use. Can you just quickly explain what that means? You've been covering attempts to regulate these companies forever. And the idea that these trials are a bellwether seems particularly meaningful here.

417.637 - 433.047 Casey Newton

As you know, Nilay, for basically the past 20 years, companies have been able to use Section 230 as a shield, and whenever there is any remotely content-related challenge to any of these platforms in court, they just get dismissed out of hand.

Chapter 3: How do social media design flaws impact mental health?

536.766 - 555.566 Casey Newton

So the reason that that was important was all of a sudden the 230 shield isn't absolute, right? There have already been a couple of minor exceptions, like, you know, the platforms have to remove terrorism content and CSAM. But now we're saying, okay, you can't actually offer a filter like this because it might incentivize a terrible behavior.

555.946 - 573.927 Casey Newton

This is what sort of opens up the rest of the landscape for the plaintiff's attorneys. They're able to say like, well, what other design features are there of these platforms and what incentives are they creating? We're not going to talk about, you know, the actual messages that are being traded back and forth on Snapchat or, you know, the actual content of the post on the Instagram feed.

573.947 - 586.626 Casey Newton

But we are going to ask about things like infinite scroll and autoplay video and push notifications that arrive continuously throughout the night and might disrupt your sleep. And all of a sudden they were able to find purchase because they had that initial precedent.

586.808 - 599.401 Nilay Patel

The thing that really grabs me about that is Snapchat had made that filter. That was Snapchat's speech. They were the ones saying, well, if you drive fast, we'll generate a speedometer reading for you. And in this case, it's still not the platform's speech.

600.162 - 600.443 Unknown

Right.

600.603 - 616.68 Nilay Patel

You can make an infinite scroll. You can make autoplay videos. And that is just ways that they are managing the speech of others. Did they have to overcome that? Because that seems like where you would hit the 230 rocks over and over again and say, we're just managing the speech of others. It's still the First Amendment.

617.133 - 642.87 Casey Newton

I think that the plaintiffs were able to successfully argue infinite scroll is not the speech of others, right? There's no sort of liability of another person that gets involved here. It's you built a product and the product is defective, right? They were able to successfully liken these things to cars without seatbelts. And it just really resonated with jurors. And I think it's worth...

642.85 - 659.905 Casey Newton

Taking a minute to talk about why that might be, because I think this is something that the people that I talk to at the social media companies never seem to understand. Everybody knows someone who has a huge problem with Instagram. This person is probably in your immediate family. They have deleted it a hundred times off their phone and they always reinstall it.

659.925 - 675.781 Casey Newton

They've set the screen time limits, but they keep coming back over and over again and they hate themselves for it, right? This is a near universal experience in America now. And so when you sit a jury down and you say, there's something wrong with Instagram, it's pretty easy to find a lot of people who say, that sounds right to me.

Chapter 4: What role does Section 230 play in social media regulation?

959.054 - 973.555 Casey Newton

creating a platform whose only incentives will ever be to get you to look at it as much as humanly possible. So that's why I think the scrutiny is finally drifting over to those things, right? We don't want to get rid of the internet. We don't want to get rid of your right to be able to post your opinion online.

973.855 - 981.045 Casey Newton

We want to get rid of this kind of machine that just increasingly seems like it's taking more and more of your time and attention in ways that make you feel bad.

983.829 - 985.652 Nilay Patel

We need to take a quick break. We'll be right back.

988.298 - 1017.102 Unknown

Support for the show comes from Zapier. We cover a lot of trends on this show, so of course that means discussing AI. But there's a difference between talking about something and what it does in action. That's where Zapier comes in. Zapier is a way for you to break the hype cycle and put AI to work across your company, for real. Zapier helps you actually deliver on your AI strategy.

1017.722 - 1034.774 Unknown

With Zapier's AI orchestration platform, you can bring the power of AI to any workflow, so you can do more of what matters. You can connect top AI models like ChatGPT and Claude to the tools your team already uses. That way, you're only using it exactly where you need it.

1035.235 - 1056.412 Unknown

Whether that's AI-powered workflows, an autonomous agent, a personal customer chatbot, or something else, you can orchestrate it all with Zapier, whether you're a tech expert or not. According to their data, teams have already automated over 300 million AI tasks using Zapier. Join the millions of businesses transforming how they work with Zapier and AI.

1056.914 - 1106.442 Unknown

Get started for free by visiting zapier.com slash decoder. That's Z-A-P-I-E-R dot com slash decoder. Support for the show comes from MongoDB. If you're tired of database limitations and architectures that break when you scale, it's time to think outside of rows and columns.

1107.023 - 1132.535 Unknown

Because let's be honest, you didn't get into tech to babysit a broken database. You got into it to actually build something. MongoDB lets you do that. It's flexible, developer-first, ACID-compliant, enterprise-ready, and built for the AI era. Say goodbye to bottlenecks and legacy code. Start innovating with MongoDB. There's a reason it's trusted by so many of the Fortune 500.

1133.216 - 1144.868 Unknown

And that's because it's a platform built by developers for developers. MongoDB. It's a great freaking database. Start building at mongodb.com slash build.

Chapter 5: How do the recent trials challenge existing legal protections for tech companies?

1190.875 - 1207.039 Nilay Patel

They've been held liable for these product features. There's some conversation that we should have at the industry that the United States of America is going to have about the difference between free speech and product features. We'll come back to that. But in the meantime, they've got to do something, right?

1207.059 - 1222.695 Nilay Patel

They've got to change something about how their products work to avoid ongoing liability from anyone else who might look at these cases and say, we're going to see you too. Casey, this feels like a trust and safety problem, right? This is your audience. This is the people you talk to the most. What is their reaction to this?

1223.182 - 1245.665 Casey Newton

Their reaction is really negative. I mean, in particular, talking to people who still work there, what they'll say is, even if you buy the plaintiff's arguments here, fixing this is really tricky, right? Because again, even if you believe that this individual teenager had like a horrible time looking at these platforms for too long and it made all of her problems worse.

1246.186 - 1250.372 Casey Newton

Okay, which design feature of this platform are you going to remove?

Chapter 6: What are the implications of algorithmic recommendations on user behavior?

1250.452 - 1270.34 Casey Newton

And how is that going to fix her problem, right? Like if Instagram and YouTube did not have autoplay video, if it didn't have infinite scroll, if it didn't have push notifications, would that have improved her mental health to a point where she no longer would have sued the company saying this is a defective product? I don't know, right?

1270.38 - 1289.554 Casey Newton

I think that the problem that we just have as like a society right now is we don't know what safe social media is. We don't know what features are really the most dangerous. I think we have instincts. I think there are experiments that we should run, but it's not as simple as, well, just turn off the autoplay video and all the teenagers will go play outside again.

1289.77 - 1293.594 Nilay Patel

Is it as simple as none of the teenagers in Australia should use social media?

1293.895 - 1311.614 Casey Newton

Here's the thing. As somebody who writes more about social media than anything else, I have been shocked at the degree to which I am just throwing in my lot with Jonathan Haidt. Because I also don't know. I do not know which are the features that we should get rid of that are going to make all the teenagers safe.

1311.874 - 1325.429 Casey Newton

What I can tell you is nobody who works at the platforms cares enough about any of your teenagers for me to trust your teenagers with them. So I would rather say don't look at it until you turn 16 because I know that's going to be better for you than them looking at it.

1326.27 - 1342.608 Nilay Patel

So I think we can hear Casey, who talks to the people who work for the platform companies, fully crashing out about that experience. Lauren, you talk to policymakers all day long. Nominally, you are our policy reporter in D.C. You cover Capitol Hill. We don't send you to courtrooms all day and all night, although that's what you've been doing.

1343.469 - 1347.553 Nilay Patel

On that side of the house, what are the policymakers doing in reaction to these verdicts?

1347.888 - 1374.295 Lauren Feiner

So far, we've seen a big push from the lawmakers who are behind some of the biggest social media reform laws like Kids Online Safety Act saying, well, this just shows that we need these new laws or we need to repeal old laws like Section 230. in order to make kids safe. So I think that is the big push right now. I think, you know, it's still really early days though.

1374.415 - 1399.267 Lauren Feiner

And I am going to be really interested to see: is that where the momentum moves, or is there maybe even a counterbalance that says, let's slow down, because the sort of cases we thought wouldn't be able to get through the courthouse are actually moving forward. And they're doing so even with Section 230 in place, even without KOSA.
