
Kate Klonick

👤 Person
199 appearances

Podcast Appearances

Radiolab
Content Warning

I think that that's exactly right.

I mean, that's what we go to the movies for.

That's what we turn into like certain types of things for, right?

It's I'm not in the mood for, you know, a horror film.

So I don't go to a horror film.

This kind of approach is much easier to moderate.

People get much less upset.

And it's much cheaper.

Because there is not as much reactive content moderation to do.

You don't have to employ hundreds of people in call centers to review every report of something that's been flagged.

And so this has kind of become the new standard.

No, I mean, I've always liked the mall metaphor and it has a weird squirrely little place in First Amendment law in a bunch of cases.

But I kind of want to hear what yours is.

A hundred percent.

You know, you can shadow ban or take down or limit the reach, but it doesn't even have to be that subtle.

Like Elon Musk always showing up in my feed, even though I don't follow Elon Musk, is like having Rupert Murdoch in like the interstitial spaces before every commercial break at Fox News, you know, like directly telling me what I should think.

That isn't subtle.

Like that is the other thing about this that is maybe the scariest part of the last couple of months is that none of it even is super pretextual.

Like there isn't a lot of like excuses.

We're not even hiding behind algorithms anymore.

It is just the owner of the platform saying the thing out loud and forcing everyone to see it if they're on his platform.

You know, I think that if you're going to all of these different platform islands, the other thing is, like, how do we change those?

To use regulatory regimes to try to control how they speak is obviously a problematic thing by any type of measure.

We don't want governments controlling speech for the exact reason of all of the authoritarianism we've just discussed.

And so I think that there's... It's very hard... Sorry, if I can jump in there, though, but it does feel like...

No, I mean, like every Western state has some type of media regulator specifically to avoid maybe like two or three people controlling all of media.

But all of a sudden we're like on the internet and yes, there is an infinite amount of content on the internet, but is it so infinite?

Like if there are, if we're talking about like the same three main places that people are going to for their news, people are going to for like their, for their daily interactions, people are going to, to feel like they're part of a conversation, their water cooler, their public square, whatever it is.

If that is like three people and they're all friends of the president, like that's a problem.

And maybe even more importantly, journalists, they go to X, they go to Blue Sky, they go to YouTube, they go to TikTok.

And they report things that are happening in those places as if they're real places that things are happening.

But they're also controlled by these individuals.

And so they're not reflective necessarily of real world, yet they are being reported on as if they were reflective of real world.

I just think that what you see in the last five years is an industry understanding the power that it holds in content moderation, that it's so not a customer service issue, that it is actually like a huge, huge force for shaping public opinion, and that that has exponential value to political parties and governments.

It's like as valuable as oil and guns because how you push things, what you keep up, what you take down.

I mean, this is how you can basically create, you know, the rise and fall of presidencies, if you want to, or political parties.

And now it is a problem of, okay, how do we stop billionaires and authoritarian governments from twisting these platforms into censorship machines or political propaganda?

And they know how to market them to you, no matter how niche you are.

And that's scalable.

And so, like, it's a way to make a lot of money, and then it's a way to control a lot of minds.

You know, it wouldn't be the first time that someone has told me that in some way I'm a useful idiot to Facebook or in some type of capacity.

I feel as if a lot of people, and a lot of what we've said today, people will be like, of course this is what happened.

This is what we were saying would happen.

But it wasn't a fait accompli when we talked about it.

Every single one of these solutions has the same flaw at the end of the day to it, which is that these are for-profit companies that do what they want to do, and things change as things settle. So, I don't know.

Okay, well, so then, like, is content moderation sort of dead?

Yeah, this is like a very controversial thing.

It really depends on what you mean by that question.

There has been a lot of controversy around, like, are they going to invest in these huge cost centers of trust and safety?

Are they going to care about this type of issue?

If they can TikTokify everything and just send you down these rabbit holes of endlessly drooly, like, eye-glaze-over, like, WALL-E kind of scene where you're on the couch with your Slurpee, like, Barcalounger or whatever, like, watching things.

Is that what they're basically going to do, and are they going to have to keep moderating?

And I mean, I think that, like, the answer is that we're going to increasingly see an automated content moderation system.

It's going to increasingly not embody the edges of society and the range of voice that we had at the beginnings of the internet.

And that we are going to kind of see a productification of speech.

That's kind of how I feel, too.

Sounds like a Ted Chiang story.

But you should rate that.

Maybe you can ask AI to do it for you if you're really busy.

I think we're using this one.

So the main thing, the main thing from the last time we talked that has really, truly changed from like 2020 to 2025 is the rise of TikTok.

I mean, if you will remember, like in two short years, it had basically caught up with 12 years of Facebook's growth.

And I mean, TikTok has a different way that they run their content moderation.

Well, when we spoke in these past episodes, one of the assumptions of content moderation when it was getting off the ground, via Facebook or Instagram or YouTube, was that we don't want to censor people unnecessarily.

And so you would keep content up until it was reported as being harmful.

And then you would make rules that would limit and try to preserve voice as much as possible, as they put it.

That was like the industry term for free speech, voice.

There were limits to that, obviously.

But generally, like, it was a keep it up unless we have to take it down type of thing.

But that's not TikTok.

TikTok obviously comes from China, and it comes from a censorship-oriented, authoritarian CCP culture.

And I mean, I believe the Chinese kind of approach to speech is very reflected in the algorithm that TikTok uses.

It is not a default of everyone should see everything, this is a free world, and people have a right to say whatever they want, even if it's a private platform. It is a we get to determine what people see and say.

TikTok prescreens such a volume of content, keeping up only what it determines to not be outside of certain political parameters.

And so they're less likely to cause negative interaction effects, to put kind of an economic term on it.

That's a perfect way of thinking about it.

And they push things up that are very milquetoast, very like happy, make you feel good, very apolitical.

And so this is basically downranking or shadow banning.

The idea that you're going to manipulate the algorithm to not delete the content, but not promote it.

And in addition to that, the algorithm is constantly improving and iterating on all the behavioral signals that you give it.

And so it's able to provide a very addictive and expectation-meeting product.

I mean, there's no way I'm like almost an experience, but I'm like, yeah, it's kind of, but it's not, it's, it's, I don't know what it is.

I'm a professor at St. John's Law School.

I don't have TikTok.

You don't either.

Well, I have like rules for some of these things.

But, you know, I study online speech for a living, so it seems kind of crazy.

But I don't need to actually be on TikTok for TikTok to be all over my life.

I see TikTok videos constantly.

They're cross-posted.

I don't need to actually be on TikTok.

Oh, I mean, I think that it's actually fascinating.

You know, what they figured out is it is a format, a video that people are hooked by.

And so it does not really matter.

You will find yourself often watching things that you didn't know you were interested in, but like you're just compelled by certain types of couples that like look very different from each other doing any type of like interaction.

Yes, that's like one way of thinking about it.

I mean, you know, but this is not new.

I mean, like advertisers have been doing this forever.

Like this is, you know, it's just a very different business model.

It is a very different product model.

Yeah, it's controlled.

But it's also in like a certain way is even more dangerous because like the ultimate in censorship in American First Amendment law is really prior restraint.

Don't get me saying the F word again because last time my parents yelled at me.

Prior restraint is censorship before something goes up or is ever published.

That is the exact distinction.

And it's important because the existence of this redaction, the proof that it was removed from Facebook, is actually evidence that censorship has happened, right?

Whereas with TikTok, you never even know what you missed.

You never even know what you were kept from seeing.

And that is really, unfortunately, what we're staring down at this moment, because in the last five years, American social media has moved towards TikTok's approach to content moderation.

Yeah, they're like, Kate, you're an adult now.

You're a serious person.

That is not as clear.

But the biggest sea change is the one that you're thinking of, which is the one that happened on January 7th of this year, 2025, when Mark Zuckerberg announced the end of the fact-checking program.

And that he was going to try to move towards a community notes-based system of content moderation.

And I mean, I think that, like, it was and it wasn't a sea change.

Okay, so not much, which is why this was such a frustrating announcement, and it was frustrating that the media focused on it so much.

The fact-checking program was like a commitment to fact-checking because there had been so much clamor about mis- or disinformation.

But they were removing posts days after they were flagged, and, like, it was very small.

And so to watch it go on the chopping block was really more of a signal to a very particular person and to a very particular party that felt like big tech censorship was coming for them.

And like, you know, we can get into a whole kind of conversation about whether or not that was reality based, but that was kind of the complaint.

And you can even go before the pandemic.

There's a few things.

The Hunter Biden laptop scandal.

Reporting lays out purported emails between Hunter Biden and a Ukrainian businessman.

New York Post, they broke the story and links to that were taken off Facebook and Twitter.

That was absolutely censored.

Well, that was happening a couple weeks before the 2020 election.

And so what had been the huge concern for Facebook and all these other companies was how social media impacted the 2016 election.

And so they made a lot of big changes.

And one of them was just kind of like, we're not going to allow things that could possibly be foreign influence to stay up, because this is exactly what we got yelled at for in 2016.

And so they kind of overcorrected.

And I think in hindsight, it was a really hard call and maybe probably the wrong one.

And then you extend that to the Wuhan lab leak.

Now, those were just insane, insane issues.

We're still talking about them today.

It's not like they were that censored, unlike going to, say, China, where it's like you're like, oh, you know, Tank Man.

And they're like, who?

Because there are no photos of Tank Man.

They are not published.

And so it's not like I just also.

My honest belief, I can't predict the future, but my honest belief is this administration would very quickly put the platforms in line.

Yeah, I think that there would be no hesitation to do this because I don't think that this was ever about free speech.

It was about their speech.

And that is really what you're unfortunately seeing right now.

There are no recognizable free speech notions coming out of this current administration.

And with the TikTokification of social media, people have seen the vector for power that is in content moderation.

Yeah, I think that basically what you're seeing is the power over what appears in your feed or doesn't appear in your feed or the types of new content that you're recommended or the first commenters that you see on a video that you just watched.

That type of control is an ability that we've never seen before.

I remember when I was first writing about this in like 2017, 2018, presenting my research, one of the things that people were so concerned with was filter bubbles.

Well, we're going to be in these filter bubbles fed to us by the algorithm.

And as it turns out, one, that was very true, that that would happen.

But also, even maybe more disturbingly, we don't even need filter bubbles anymore.

People are just choosing platforms based on the types of content that they expect to find there.