Inside Facebook’s Leaked Documents: What Do They Reveal?
By E. Jen Liu
Last month, former product manager Frances Haugen leaked the Facebook Papers, an array of internal employee discussions, presentation slides, and memos, to the Securities and Exchange Commission and the media. The documents reveal internal research condemning Facebook’s impacts on mental health, failures to contain hate speech and misinformation, and the existence of a cross-check system that privileges high-profile accounts. Haugen’s complaint exposed that the social media giant repeatedly prioritized profit over public safety, severely undermining its mission to bring the world closer together.

In an increasingly competitive attention economy, where time and mental focus are scarce commodities, Facebook is being crowded out by competitors like TikTok and Twitter. Generation Z consumers, who are more politically active than their predecessors, harbor valid concerns about privacy and demand greater corporate responsibility. Yet instead of recognizing its decline as a reckoning, Facebook continues to downplay its social impacts.

The leaked files unveiled extensive internal research, previously withheld from the public and lawmakers, demonstrating the negative mental health impacts of Facebook on its users. Slides from internal presentations concluded that Facebook makes body image issues worse for one in three teen girls, and many surveyed teens blamed
Instagram for increased rates of anxiety and depression. Legislators will likely use these documents to support their campaign against Facebook’s plan to launch Instagram Youth for users under 13.

Facebook lacks the human resources and technology to contain the spread of hate speech and misinformation on its platforms. In Ethiopia, an ethnic Amhara militia group used Facebook to fundraise and recruit new members; its efforts went unchecked because the platform cannot detect hate speech in Amharic or Oromo, contributing to the ethnic cleansing of Tigrayans last year. In India, a dummy account created by Facebook employees to understand the user experience, active during the recent surge in violence in Kashmir over territorial disputes between India and Pakistan, was flooded with anti-Muslim propaganda and photos of dead bodies. In the Philippines, Facebook products are used in labor trafficking, as described in an