02 September 2024

This is a Big F%$#ing Deal

The 3rd Circuit Court of Appeals has ruled that TikTok is not protected from liability by Section 230 of the Communications Decency Act for serving dangerous content to its users.

The legal chain here is fairly interesting.

Section 230 indemnifies the providers of digital communities from liability for user-created content. So, for example, a newspaper could be subject to a libel suit over one of its articles, but it would be protected from liability for comments in the discussion sections of those articles. The same sort of indemnification would be applied to bulletin board systems and social media.

It turns out that the US Supreme Court, in Moody v NetChoice, ruled that state laws forbidding moderation based on political views had to be evaluated with consideration of their 1st Amendment implications, because the moderation, and the algorithms that perform that task, are protected editorial speech.

The details behind the lawsuit that was reinstated are pretty horrible: a 10-year-old girl was served a large number of "Blackout Challenge" videos, and when she attempted to make a video of her own, she accidentally hanged herself.

The court ruled that the serving of these videos, though not the videos themselves, was a deliberate editorial choice, and so is not covered by the Section 230 safe harbor:

A US appeals court has issued an opinion that could have wide-ranging implications for social media platforms, finding that content selected for users by TikTok's algorithms doesn't qualify for Section 230 protection.

In an opinion [PDF] published today, a three-judge panel from the Third Circuit Court of Appeals in Pennsylvania decided that, because TikTok presented "blackout challenge" posts to 10-year-old Nylah Anderson on her For You Page of recommended content, the platform deserves to be taken to court for her death that followed.

The "blackout challenge" refers to a dangerous self-asphyxiation "trend" that went around on TikTok several years ago. Anderson attempted to participate in the challenge, leading to her death, but a lower-court judge decided in 2022 that TikTok was protected by Section 230 of the Communications Decency Act (CDA), which protects social media platforms from liability for content posted by their users.

The Third Circuit court sharply disagreed.

"TikTok knew that Nylah would watch [the blackout challenge video] because the company's customized algorithm placed the videos on her 'For You Page' after it 'determined that the Blackout Challenge was 'tailored' and 'likely to be of interest' to Nylah,'" Judge Paul Matey wrote in a partial concurrence included in the decision.

Matey argued that Section 230's application has evolved far beyond the original intent when Congress passed the CDA in 1996. It was not meant to "create a lawless no-man's land" of legal liability.

"The result is a Section 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm," Matey said.

Judge Patty Shwartz wrote in the main body of the opinion that the Third Circuit's reading of Section 230 is reinforced by the recent Moody v NetChoice decision from the US Supreme Court. In that case, related to content moderation laws passed in Florida and Texas, SCOTUS held that algorithms reflect editorial judgments. Shwartz wrote that it's a compilation of third-party speech made in the manner a platform chooses, and thus merits First Amendment protection.

If I were a senior executive at Facebook, or Ecch (Twitter), or Reddit, I would be speed-dialing my company attorney right now, because if this decision sticks, it makes many, if not most, of the business models for these companies untenable.

I rather imagine that there will be a circuit split, and that this will be decided at the US Supreme Court, and I have no clue as to how that might go.
