In Short

Meta’s Missteps

A New Database Offers an Unprecedented View into the Company's Reactive, Under-Resourced Moderation Practices

The PR headaches keep coming for Meta, the company formerly known as Facebook. Last week, a Kenyan court ordered the company to offer mental health care to a group of Facebook moderators who sued the company for union busting. As part of their suit, the moderators cited poor working conditions, low pay, and routine exposure to disturbing content. The case came on the heels of another lawsuit filed by a Kenyan moderator in 2022, who accused Meta, via its staffing partner, of human trafficking. The suit claimed that Meta's subcontractor lured employees to Kenya under false pretenses by posting job ads that did not mention they would be working as Facebook moderators and handling graphic material.

Meta's Kenyan troubles reflect the company's larger struggle to govern platforms like Facebook, Instagram, Messenger, and WhatsApp that, in aggregate, are used by billions of people worldwide. Tools for combating disinformation, hate speech, illegal content, and targeted information operations may work in one country but fail in another due to cultural nuance or governmental interference, while effective moderation in languages other than English has been a chronic challenge for the company. What's more, Meta insiders know about these shortcomings, but have struggled to reconcile them with the business imperative to expand and monetize content.

We know this thanks to a 2021 leak of internal documents by Facebook whistleblower Frances Haugen. While working as a product manager, Haugen painstakingly photographed documents on her phone, then submitted the files to the Securities and Exchange Commission in the summer of 2021. The trove of some 20,000 screenshots gave the world an inside look at how the social media sausage is made and moderated at Facebook. However, the format of the files made them difficult to search, and they also contained personally identifying information about Facebook employees. For these reasons, Haugen entrusted a select group of reporters with the task of sifting through the materials while safeguarding against the harms of a mass, unredacted release. The result was an explosion of coverage of Facebook's role in events ranging from political unrest in the United States to ethnic conflicts in countries like Ethiopia and Myanmar. Days after the first wave of articles came out in October 2021, Facebook announced its rebranding as Meta.

However, not everything in the documents was covered by the media, and many details about how Facebook deals with hate speech, illegal content, and mis/disinformation on its platforms remain unreported. To address this gap, Harvard's Public Interest Tech Lab is launching a platform called FBarchive later this summer that will allow the public to explore Haugen's documents for themselves. Once open to the public, FBarchive will offer researchers, policymakers, and the curious an unprecedented look behind the curtain of one of the world's largest and most consequential companies.

Shop Talk

Many of the Haugen documents consist of conversations among Meta employees on the company's internal Workplace platform, a discussion board that closely resembles its public Facebook product. Organized and anonymized within FBarchive, the screenshots show employees discussing company culture and policies with a striking degree of frankness. For example, in reply to a post by then-CTO Mike Schroepfer asking "what's slowing you down?", one worker wrote "We perpetually need something to fail – often fucking spectacularly – to drive interest in fixing it, because we reward heroes more than we reward the people who prevent a need for heroism." The post received nearly 900 "likes" and sparked pages of discussion.

The documents reveal shortcomings in Meta's detection and moderation tools, especially outside the United States. Some countries are identified as priority risks. One 2021 slideshow points out that "limited localization," a lack of classifiers for languages other than English, and insufficient "human support" (such as the Kenyan moderators who sued the company) make "most of our integrity systems" much less effective outside the US. The documents highlight countries like Ethiopia and Myanmar as high-risk environments where Meta struggles to detect and counter harmful speech that has fueled violence and harassment in recent years.

The leaks show that, while Meta employs a range of detection tools for sniffing out harmful and coordinated activity on its platforms, these are far from foolproof. For example, a 2018 discussion hypothesizes a potential link between certain actors and Russian information operations, with one participant remarking that "If these actors don't collaborate directly via any of our services there is almost nothing we can do to prove these relationships" [sic]. Several documents from 2020 note that the company's ability to detect misinformation and foreign intelligence operations on Instagram is "still nascent."

Meta's Internal Critics

Why does Meta keep fumbling the ball when it comes to protecting its users? The leaks offer several reasons. First, as already noted, Meta employees often blame the company's culture of reacting to crises rather than focusing on prevention. As one engineer writes, "'Better engineering' at facebook is making something poorly, then coming back to fix it later" [sic]. The company acts swiftly in response to technological failures or bad press, but does not reward the unglamorous work of anticipating problems before they escalate.

Second, Meta's short-term business goals clash with what its employees think would best serve the company in the long run, as well as reduce social harm. One of the documents is from Sophie Zhang, a Facebook data scientist who left the company in 2020. Zhang claims that she was the main employee in charge of finding and fighting government-backed information operations and that she personally made "decisions that affected national presidents" and "so many prominent politicians globally that I've lost count." Zhang did this work on top of her core duties. When she asked for more support, she was told that the company could not spare the resources. In short, Meta does not invest enough in solving problems until it's too late.

A third theme, which overlaps with both of the previous issues, is Meta's tendency to prioritize problems that can be quantified over those that cannot. Several employees note that this stems from a Silicon Valley-wide obsession with data and measurable change, but in practice it means that workers are incentivized to squash bugs rather than prevent them in the first place.

No Quick Fix

Meta's highly public missteps have driven a decline in Facebook's brand reputation among US adults, yet to some extent the hurdles the company has faced are inherent to operating the world's largest social platforms. No matter how many resources Meta throws at moderation problems, some abuse of its products is probably inevitable. To Meta's credit, it has attempted to address past mistakes, creating among other measures an independent Oversight Board to advise on moderation policies. The documents in Harvard's FBarchive show that many company insiders are well-intentioned people puzzling through some very thorny problems. A cartoonish villain the company is not.

While we can't expect perfection, we can expect Meta to work proactively to mitigate harms on its platforms, and that requires anticipating problems before they escalate. This, in turn, demands a robust dialogue between Meta and its users, regulators, researchers, civil society groups, and other stakeholders who have an interest in the health and safety of our shared digital spaces. Solutions will be as nuanced and international as the problems, and to that end resources like FBarchive that shed light on Meta's inner workings will be invaluable.
