Op-Ed: Bring Social Media Enforcement Into the Light

Published in Barron’s on May 5, 2022

Elon Musk’s sudden deal to buy Twitter has his fans elated, and his detractors terrified, by signs that he will throw the platform’s rules out the window.

No wonder. Bold, ruthless, and unpredictable, he tweets to his nearly 100 million followers as if there weren’t any rules. Twitter hasn’t called Musk to account, though the Securities and Exchange Commission and the National Labor Relations Board have. After he tweeted in August 2018 that he was ready to take Tesla private and had lined up the necessary financing, the SEC sued him, saying the claim was false. (He settled.) And in March 2021 the NLRB demanded that Musk delete a 2018 tweet it found to be union-busting. (He did.)

Right after his deal to buy Twitter for $44 billion was announced on April 25, Musk retweeted far-right criticism of two of the company’s executives, and many of his followers piled on. Vijaya Gadde, Twitter’s general counsel and chief of content moderation, received a stream of threats and abuse, much of it related to her Indian background.

Musk, it seems, had broken informal norms of C-suite behavior. Dick Costolo, Twitter’s CEO from 2010 to 2015, tweeted at him, “What’s going on? You’re making an executive at the company you just bought the target of harassment and threats.” But Musk still had not violated any of Twitter’s rules, the platform Costolo proudly called “the free speech wing of the free speech party” a decade ago, during his tenure.

In subsequent years, Twitter was criticized for allowing vicious harassment and other harmful content to flourish on its platform, and the company began removing more tweets and accounts. Its staff wrote and repeatedly revised rules that prohibit forms of speech including abuse, harassment, and what it calls hateful conduct, though Twitter is still less restrictive than other social media platforms such as Facebook.

Now Musk has suggested that he wants to make Twitter even more unfettered. “Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square where matters vital to the future of humanity are debated,” he said in a statement when the deal was announced.

There’s nothing to stop him, if he buys the company as planned. But the only thing novel about that is Musk himself. One person can already dictate the rules at the world’s biggest social media company, Meta. Since Mark Zuckerberg controls a majority of the voting shares, he can make and change the rules for Facebook (with nearly three billion users), Instagram, and WhatsApp, at whim.

It’s a tremendous amount of power over public discourse in a few unelected hands, and though Musk might tell himself that he would cede power by abolishing the rules, that’s wrong. Some people—Musk, for instance—would have much greater power than others in an online free-for-all. That’s already the case, and the differential would surely grow.

In fact, private companies now effectively govern more human communication than any government does, or ever has. (This is true of Facebook alone.) But governments should not make the rules for digital discourse either. Think of how unfairly many governments would do it. Countries like Turkey, Saudi Arabia, and India already leverage domestic laws and takedown requests to censor opposition and minority opinions on social media.

Instead, now that this industry has so much capacity to control and influence the fundamental human activity that is communication, it should be subject to audits of how it enforces its rules. Enforcement, or content moderation as it’s called in Silicon Valley, is the sausage-making of social media platforms, and it is now done almost entirely in the dark.

Users (and other members of the public) can’t tell where Twitter and other platforms actually draw the line between what they regard as prohibited and permitted speech in categories like abuse, harassment, or hate speech, or whether they enforce their rules equitably.

Outsiders learn the details of enforcement in only a minuscule proportion of the millions of takedown decisions that companies’ software and moderators make every day, usually when there is a public controversy over a specific post. In 2016, for example, when a Norwegian writer posted the famous image of a Vietnamese girl running with napalm burning into her skin, Facebook removed the post under its rule against nudity. After protests from influential people, including Norwegian Prime Minister Erna Solberg, Facebook reversed its decision on that photograph. But the company didn’t explain how it would draw the line in similar cases. “While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others,” a Facebook statement said at the time.

There is one small but promising experiment in letting outsiders peek into the black box of enforcement, and even giving them some authority over it: the Oversight Board that Facebook created in 2018 to review and, if it chooses, overrule some of its content moderation decisions. The board has 20 members from many countries where Facebook operates, who together review a few dozen cases per year. It is not empowered to rewrite the rules or to audit enforcement at scale, though.

Enforcement auditors would be different. Vitally, they would examine whether the system is fair. For example, are the same sorts of posts taken down at the same rate when put up by members of different groups, say women and men, or Indians and Pakistanis? We don’t know the answers now for Facebook, Twitter, or any other social media platform, because outside researchers don’t have access to the information needed to answer such questions. Professional auditors with the necessary technical skills should be given access to the relevant data under extremely secure, privacy-protecting conditions. They would publish regular reports, again strictly protecting user privacy.
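To make that concrete, here is a minimal sketch, in Python, of the kind of check an auditor might run. The groups, the counts, and the simple two-proportion test are all hypothetical illustrations, not any platform’s actual data or methods; a real audit would use richer statistics and the secure data access described above.

```python
# Hypothetical sketch of one enforcement-audit check: are comparable
# rule-breaking posts removed at the same rate for two groups of users?
# All names and numbers below are invented for illustration.
from math import erf, sqrt

def two_proportion_z(removed_1, posted_1, removed_2, posted_2):
    """Two-sided two-proportion z-test: is the gap between two
    takedown rates larger than chance alone would explain?"""
    p1 = removed_1 / posted_1
    p2 = removed_2 / posted_2
    pooled = (removed_1 + removed_2) / (posted_1 + posted_2)
    se = sqrt(pooled * (1 - pooled) * (1 / posted_1 + 1 / posted_2))
    z = (p1 - p2) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1, p2, z, p_value

# Hypothetical counts: similar abusive posts reported against members
# of two groups, and how many of those posts were actually removed.
removed_a, posted_a = 4_100, 10_000   # group A: 41% takedown rate
removed_b, posted_b = 3_400, 10_000   # group B: 34% takedown rate

rate_a, rate_b, z, p = two_proportion_z(removed_a, posted_a,
                                        removed_b, posted_b)
print(f"takedown rates: {rate_a:.0%} vs {rate_b:.0%}, "
      f"z = {z:.1f}, p = {p:.3g}")
```

A persistent, unexplained gap between two such rates is exactly the sort of finding an auditor’s public report would flag.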

Musk has caused a burst of public and policymaking attention to social media platform rules. Instead of speculating fruitlessly about his next move, we should take the opportunity to require audits of platform rule enforcement. Policing of town squares shouldn’t take place in the dark, after all.