Reconsidering Online Platforms’ Liability

To protect users’ free speech and privacy rights, a scholar calls for limiting online platforms’ legal shield.

After repeatedly requesting that Grindr—an online dating app for gay men—delete nude photos posted without his consent by his ex-boyfriend, Matthew Herrick reportedly received only Grindr’s auto-generated response: “Thank you for your report.”

In a subsequent lawsuit, a federal district court held that Grindr was not liable for user-generated content, despite Grindr being best situated to minimize harm and prevent abuse. Herrick reportedly continues to suffer from harassment, while Grindr continues to profit.

Online platforms should not be able to capitalize on users’ suffering, argues Danielle Keats Citron in a recent article. Citron, a professor at the University of Virginia School of Law, identifies Section 230(c) of the Communications Decency Act as the root of the problem and calls for legislative reforms to hold online platforms accountable for how they manage their content.

Section 230(c) provides a legal shield for online platforms hosting user-generated content and consists of two parts.

Section 230(c)(1) applies when online platforms decide to carry users’ content, such as when Twitter decides not to remove posts containing hate speech. It provides that platforms are immune from liability for the user-generated content they disseminate.

Conversely, Section 230(c)(2) applies when online platforms decide to remove users’ content, such as when Instagram removes any post with sexual content involving minors. Section 230(c)(2) stipulates that platforms are not liable for their decisions to take down user-generated content involving offensive or harmful materials—such as obscenity, violence, and harassment—so long as platforms do so in good faith.

Citron observes that the purpose of Section 230(c) is to encourage platforms to take action against harmful content by allowing them to filter and block such content without being held responsible. As the title of Section 230(c) indicates, it should function as a “protection for ‘Good Samaritan’ blocking and screening of offensive material,” Citron explains.

Courts’ expansive reading of Section 230(c), however, has resulted in the exact opposite, argues Citron. She observes that courts insist on interpreting Section 230(c) broadly to ensure the free exchange of ideas online, which removes incentives for online platforms to deter or address harmful content. In Doe No. 1 v. Backpage.com, for example, a federal appellate court ruled that under Section 230(c), online platforms are immune from liability even when they deliberately enhance the visibility of illegal content and protect perpetrators from identification.

Realizing that they can generate more profits from harmful content without legal consequences, online platforms capitalize on intimate privacy violations and cyber harassment, argues Citron.

Importantly, Citron underscores that marginalized groups disproportionately fall victim to intimate privacy violations and cyber harassment, which inhibit them from enjoying life as equal citizens. One report revealed that young women are most likely to self-censor to avoid online harassment. Another report found that minor victims of nonconsensual pornography suffer significant emotional distress and are particularly vulnerable to suicide.

Although some commentators contend that policymakers need not revise Section 230(c) because victims can sue their perpetrators, Citron refutes this claim by pointing out the practical barriers that prevent victims from doing so. Citron notes that, unlike corporations with deep pockets, perpetrators often lack the financial resources to pay victims’ damages. Moreover, Citron highlights that victims often find it difficult to get help from law enforcement officers due to the officers’ lack of resources and proper training.

Citron proposes to reform, but not repeal, Section 230(c).

For Section 230(c)(1), Citron argues that since policymakers never meant to provide a free pass to online platforms who benefit from the spread of harmful content, policymakers should exclude platforms that “purposefully or deliberately solicit, encourage, or keep up” materials that constitute “stalking, harassment, or intimate privacy violations” from enjoying immunity.

In addition to limiting the scope of immunity in Section 230(c)(1), Citron proposes that policymakers should require platforms to take reasonable steps to investigate and manage their content as a prerequisite for asserting immunity.

Citron lays out several obligations that policymakers should specify as “reasonable steps,” such as requiring platforms to establish a system for reporting and reviewing intimate privacy violations and cyber harassment, and requiring platforms to retain certain data so that victims of online abuse can access the information necessary to identify their abusers and prove their cases in court.

As for Section 230(c)(2), Citron suggests that policymakers should preserve it. Citron argues that it protects users’ free speech and privacy rights by encouraging platforms, which are best situated to curtail and minimize online abuse, to take measures against harmful content. If platforms cannot take down harmful content, then hate speech, cyber harassment, and intimate privacy violations may become rampant, making platforms uninhabitable for most users, explains Citron.

Citron notes that some state lawmakers seek to eliminate Section 230(c)(2) entirely. Texas, for example, has enacted a law restricting online platforms’ ability to moderate content. Such laws, however, are likely to violate the compelled speech doctrine under the First Amendment, which prohibits the government from forcing private entities to express or support certain content. Policymakers should preserve Section 230(c)(2) because it protects online platforms’ First Amendment right to control what content they decide to carry, argues Citron.

Furthermore, Citron argues that although the proposed reform might result in platforms moderating and removing content more frequently, it would actually encourage more online speech and engagement, particularly from women. Citron and Jonathon Penney, a professor at York University’s Osgoode Hall Law School, found that people are more likely to express themselves online and offline when they know that their privacy is protected.

Citron concludes by underscoring the profound costs that online abuse exacts on victims: it undermines victims’ ability to speak and work. Although many commentators applaud Section 230(c) for facilitating an uninhibited and robust online speech environment, Citron insists that it is only when everyone is free from online abuse that everyone can be truly free to express themselves.