Making Social Media Self-Regulation More User-Centered

Scholar argues that social media companies should regulate political advertisements more democratically.

In recent years, the cry of “fake news” has been ever-present, especially during election season. As more Americans turn to social media, the providers of these platforms must decide how to crack down on false and misleading advertisements masquerading as truth.

In a recent paper, Robert Yablon of the University of Wisconsin Law School argues that social media companies should take a more democratic approach to regulating political speech on their platforms. Contrasting the approaches of Twitter, Facebook, and Google, Yablon observes that although all three claim to defend democracy, they do so in an undemocratic fashion, allowing no public input into their self-regulatory decisions.

To give users greater voice in regulating political ads, Yablon proposes three possible solutions.

First, social media companies could solicit user feedback on updates to their filtering policies. Yablon envisions a process similar to notice-and-comment rulemaking, in which a company would publicize proposed changes, incorporate feedback, and explain the reasoning behind its final decisions. Companies could also establish independent oversight boards to “issue binding rulings on content-related disputes.” Both the company and its users would be able to request rulings from such a board.

Second, Yablon suggests giving users greater control over what sorts of political ads they see. He lauds, in principle, Facebook’s planned ad preference tool, which would let users see fewer political ads. But Yablon worries that Facebook might bury the tool deep in users’ settings, where few would find it, or that the tool might deliver only a nominal decrease in political ads.

Finally, Yablon argues that government regulation may be needed if private regulation remains undemocratic. A “public regulatory vacuum” exists in this area, which he attributes to constitutional concerns and government inaction. If elected officials feel pressure from their constituents, however, regulatory change may become possible. Not only would such change set ground rules for all social media companies to follow, but it would also imbue those rules with “democratic legitimacy that private regulation cannot match.”

Issues with political advertising on social media came to the forefront during the 2016 presidential election. Facebook CEO Mark Zuckerberg has acknowledged that his company was prepared at the time for traditional hacking and phishing but did not foresee “coordinated information operations with networks of fake accounts” by foreign actors. With another presidential election around the corner, social media companies must now decide how best to address false and misleading political ads.

The responses of major social media networks have varied. Twitter, for example, banned political advertisements altogether in 2019, stating that online ads posed “significant risks to politics.” Facebook, in contrast, permits some users to turn off political advertisements but has emphasized how important it is “to give people a voice.” In line with this belief, it has continued to allow political advertising and even exempts politicians’ ads from its usual fact-checking process. Twitter and Google have also restricted how political advertisers can target certain groups of people, while Facebook has imposed only minimal targeting restrictions.

Yablon finds this activity “encouraging,” but he ultimately sees these policies as “band-aids, not silver bullets.” Although he expresses uncertainty about their scope and implementation, his main concern is that the process that produced them was undemocratic and opaque.

Yablon points out that none of these policies resulted from user input. Instead, the networks “simply decreed them,” although Yablon concedes that substantial deliberation and debate likely occurred within each company. Unfortunately, ordinary users have no way of knowing what went on during those internal deliberations, and they had no opportunity to contribute.

Furthermore, although companies portray these new policies as user-friendly, the opposite may be true. Yablon notes that the policies keep decision-making in the hands of company officials, who retain the power to determine what users see. The policies generally restrict users’ ability to personalize their experiences and, Yablon suggests, undercut the companies’ claims that social media lets users express their own voices. “Control,” he writes, “ultimately rests with the owners, not the users.”

Yablon also sees an inherent conflict of interest between owners and users when it comes to advertisements. Most users would probably prefer a social media experience with as few ads as possible, but eliminating ads would cut off a significant source of revenue for social media companies.

Political advertisers represent a business opportunity for these companies, and some have seized the chance to build relationships with political campaigns. Facebook and Google, according to Yablon, have even placed employees inside major campaigns to assist with ad placement. Given this behavior, companies’ political ad policies may ultimately reflect the preferences of campaigns more than those of social media users.

These activities do not live up to the rhetoric social media companies are fond of espousing, says Yablon, who favors a more user-driven process to dispel fears of collusion.