Stunning court ruling lifts Section 230 protection for web platform over girl's death

 August 30, 2024

This story was originally published by the WND News Center.

Section 230 of the Communications Decency Act provides vast protections, effectively immunity, for web platforms over the content their users post.

But the 3rd U.S. Circuit Court of Appeals has now ruled that when those platforms PROMOTE users' speech, it becomes their own speech. And their responsibility could be vast.

The decision came in the case of Tawainna Anderson, as administrator for the estate of N.A., a deceased minor, against TikTok and its parent, ByteDance.

The court explained the dispute: "TikTok, Inc., via its algorithm, recommended and promoted videos posted by third parties to ten-year-old Nylah Anderson on her uniquely curated 'For You Page.' One video depicted the 'Blackout Challenge,' which encourages viewers to record themselves engaging in acts of self-asphyxiation. After watching the video, Nylah attempted the conduct depicted in the challenge and unintentionally hanged herself."

Her mother, Tawainna Anderson, sued, and the district court dismissed her case based on Section 230's protections for web corporations over what their users post.

The situation would have been different had the girl searched for and found the offending video on her own. But in this case, the company's algorithms found the video, identified it, and then promoted it to her, the ruling pointed out.

That makes it the company's speech, the ruling found.

"TikTok's recommendations via its FYP algorithm – is TikTok's expressive activity," the ruling said, so the Section 230 protection "does not bar Anderon's claims."

"We reach this conclusion specifically because TikTok's promotion of a Blackout Challenge video on Nylah's FYP was not contingent upon any specific user input. Had Nylah viewed a Blackout Challenge video through TikTok's search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content. Given the type of algorithm alleged here, we need not address whether § 230 immunizes any information that may be communicated by the results of a user's search of a platform's content," the ruling said.

The case was returned to the district court for determinations on that claim, as well as "any claims not premised upon TikTok's algorithm."

A concurrence in the majority opinion turned blunt: "TikTok reads § 230 of the Communications Decency Act, 47 U.S.C. § 230, to permit casual indifference to the death of a ten-year-old girl. It is a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation, one that smuggles constitutional conceptions of a 'free trade in ideas' into a digital 'cauldron of illicit loves' that leap and boil with no oversight, no accountability, no remedy. And a view that has found support in a surprising number of judicial opinions dating from the early days of dial-up to the modern era of algorithms, advertising, and apps. But it is not found in the words Congress wrote in § 230, in the context Congress acted, in the history of common carriage regulations, or the centuries of tradition informing the limited immunity from liability enjoyed by publishers and distributors of 'content.' As best understood, the ordinary meaning of § 230 provides TikTok immunity from suit for hosting videos created and uploaded by third parties. But it does not shield more, and Anderson's estate may seek relief for TikTok's knowing distribution and targeted recommendation of videos it knew could be harmful."

Those comments were from Circuit Judge Paul Matey.

The majority opinion said, "TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its first-party speech."

A report at Just the News noted, "In a footnote likely prompting meetings between engineers and in-house counsel across Silicon Valley and other tech-heavy metros, the three-judge panel emphasized that TikTok's recommendation to Nylah was 'not contingent upon any specific user input.'"

The report noted that American Economic Liberties Project Research Director Matt Stoller opined, "It'll take a bit of time, but the business model of big tech is over."

Judge Patty Shwartz, an Obama nominee, wrote for the panel. Just the News explained that she cited the Supreme Court, which ruled "algorithms that reflect 'editorial judgments' about compiling third-party content are First Amendment-protected 'expressive product[s].'"
