The child's mother sued TikTok, arguing that its algorithm served blackout challenge videos to her child, thus making the company responsible. In the past, algorithmic decisions, as we've talked about here, were protected under Section 230 of the Communications Decency Act.
Just to break this down very simply: Section 230 grants internet platforms featuring user-generated content immunity from being sued over content published by those users on their platform. So YouTube, TikTok, Twitter, you know, a blogging platform, etc.
So because of 230, you're technically not supposed to be able to go after something like TikTok because a random user posted crazy videos like this. But an appeals court, Chamath, has reversed that ruling with the judge arguing that TikTok's algorithm represents a unique, quote unquote, expressive product, which communicates to users through a curated stream of videos.
The judge claims TikTok's algorithms reflect editorial judgment. So here's the interesting legal detail. The new ruling specifically cites the Supreme Court's recent decision, Moody v. NetChoice, in which the court ruled unanimously to vacate the lower-court rulings on that Florida law we talked about here, the one that banned tech companies from deplatforming political officials.
So that was viewed as a big win for speech protections, Sacks, and moderation rights in big tech. But this is all super ironic, because the same ruling that affirmed big tech's First Amendment protections in content moderation may also have nullified the Section 230 immunity. So let's play this quick clip here.
Chamath, this is a discussion you and I had back in 2022 about whether algorithms should be part of Section 230.
You're talking about the front page editor of the New York Times.
There's such an easy way to do this. If you're TikTok, if you're YouTube, if you want Section 230, if you want to have common carrier status and not be responsible for what's there, when a user signs up, it should give them the option. Would you like to turn on an algorithm? Here are a series of algorithms which you could turn on. You could bring your own algorithm.
You could write your own algorithm with a bunch of sliders. Or here are ones that other users and services provide, like an app store. So, Chamath, that was your take on it. And here we are. What are your thoughts on Section 230 and algorithms today? If you use an algorithm, should that nullify, void, your Section 230 protection?
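The "bring your own algorithm" idea described above could be sketched as a pluggable ranking interface, where the platform exposes a registry of ranking functions and the user picks one at signup. This is a hypothetical illustration of the concept only; the `Post` fields, the slider weighting, and the registry names are all invented for the sketch, not anything TikTok or YouTube actually exposes:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Post:
    id: str
    age_hours: float
    engagement: float  # e.g. likes + shares, normalized to [0, 1]

# A feed algorithm is just a function from a list of posts to a ranked list.
FeedAlgorithm = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    # "Common carrier" mode: no editorial judgment, newest first.
    return sorted(posts, key=lambda p: p.age_hours)

def make_slider_algorithm(engagement_weight: float) -> FeedAlgorithm:
    # User-tuned ranking: a slider trades recency against engagement.
    def rank(posts: List[Post]) -> List[Post]:
        def score(p: Post) -> float:
            return engagement_weight * p.engagement - (1 - engagement_weight) * p.age_hours
        return sorted(posts, key=score, reverse=True)
    return rank

# Registry a platform could expose at signup, app-store style.
ALGORITHMS: Dict[str, FeedAlgorithm] = {
    "chronological": chronological,
    "balanced": make_slider_algorithm(0.5),
    "max_engagement": make_slider_algorithm(1.0),
}

feed = [
    Post("a", age_hours=1, engagement=0.2),
    Post("b", age_hours=10, engagement=0.9),
]
print([p.id for p in ALGORITHMS["chronological"](feed)])   # newest first
print([p.id for p in ALGORITHMS["max_engagement"](feed)])  # most engaging first
```

On this model, the legal question in the segment maps to a design choice: a user who explicitly selects `chronological` is arguably getting a pure conduit, while a platform-chosen engagement ranker looks more like the "editorial judgment" the appeals court described.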
Yes, correct. Sacks, your thoughts here on balancing 230 with the fact that, I think we all agree, these algorithms are the new modern-day editors?