Expert urges regulators to change incentives for social media to create a healthier digital sphere.
Propaganda, misinformation, incitements to violence: content within these categories can spread rapidly on social media and hurt individuals, groups, and even democracy. To give social media companies incentives to remove harmful content, some might suggest narrowing the kinds of speech that the First Amendment protects. But Yale Law professor Jack Balkin urges a different solution to the social ills on social media: combating surveillance capitalism.
Surveillance capitalism refers to companies tracking and gathering data about their users to target consumers, keep them online for advertisers, and extract profit. According to Balkin, the digital business models underlying surveillance capitalism harm the health of the digital public sphere. He argues that legal reforms must change the incentives of social media companies to foster a well-functioning public sphere.
Balkin explains that private institutions have long played a role in moderating discourse in the public square. Movie studio executives, book publishers, and newspaper editors once decided which content to distribute and which to restrict. They curated content, evaluating its quality and civility before publishing it.
But now the public sphere has moved online, and Balkin contends that social media companies fail to take responsibility for curating content in a way that promotes civil discourse and protects democracy.
Instead, surveillance capitalism has led social media companies to pursue private profit at the expense of their potential responsibilities to the public. Their economic incentive to maximize ad revenue motivates them to prioritize end-user engagement. To keep users absorbed and attentive, companies promote engaging material even when it is false or misleading, incites violence, or destabilizes democracies. Altogether, these digital practices can lead to harms such as political unrest and ethnic violence.
If social media companies are not forced to internalize these social harms, Balkin argues, they will prioritize profit-maximizing activities and underinvest in content moderation that does not directly generate profit. Companies also lack any real incentive to educate end users about their data collection and use practices.
Balkin urges regulators to change this calculus and improve social media by reforming competition law, privacy law, and consumer protection law. He argues that the law should push companies to adhere to new professional norms that account for their role in influencing public discourse and behavior.
Changes in competition law can encourage more institutions and actors to moderate content, or to moderate it in new ways. Current antitrust and other competition laws enable a handful of social media companies to dominate the digital public sphere. Such companies can retain their users without changing their content moderation policies, even when they allow harmful content to spread. Balkin suggests that reforming these laws to promote competition in the social media marketplace can help a range of smaller companies survive, leaving the public less vulnerable to the decisions of a few large, powerful companies.
Balkin acknowledges that this approach might have negative effects, such as fragmenting the public sphere. Still, he argues that competition law can promote plurality, which could push companies to innovate and implement new types of content moderation to make their services more attractive to users.
In a world with many social media companies, Balkin explains, the government would need to require interoperability among networks: users of one service should be able to connect with users of another.
Balkin also recommends changing intellectual property law and the Computer Fraud and Abuse Act to allow developers to aggregate information and features from different social media networks in one place. This change would let people who join different services communicate across platforms.
Regulators should also change privacy and consumer protection law, Balkin advises, to clarify how companies may collect and use data and how they may use algorithms to capture user attention for advertising profit. Currently, companies can externalize the social costs of their activities onto their users and the public. Balkin argues that the law should push social media companies to internalize those costs and make it more expensive for them to engage in surveillance capitalism.
According to Balkin, updating privacy and consumer protection law can undercut companies' reliance on surveillance capitalism for profit and push them to revise their business models to comply with the new rules. Such an update would limit companies' ability to commodify their users as a source of data to sell to advertisers. It would also push companies to internalize professional norms and adopt public-regarding behavior.
In the end, Balkin ties some of his suggestions back to content moderation. He characterizes Section 230 of the Telecommunications Act of 1996 as a regulatory subsidy that spares platforms from liability for the speech they host and lowers the costs of repeated litigation over content moderation policies. Balkin recommends limiting that immunity for social media companies by conditioning it on their acceptance of a new set of public interest obligations for the digital age.
Fixing public discourse online, Balkin concludes, requires fixing the digital public sphere as a whole by addressing the surveillance capitalism that underlies it.