
The Facebook Files


Sept. 15, 2021 9:37 a.m. ET

Facebook Inc. knows full well that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands. That is the central finding of a Wall Street Journal series based on a review of internal Facebook documents, including research reports, online employee discussions and draft presentations to senior management.

Time and again, the documents show, Facebook's own researchers have identified the platform's harmful effects. Yet despite congressional hearings, its own commitments and numerous media exposés, the company has repeatedly failed to fix them. The documents offer perhaps the clearest picture yet of how widely Facebook's problems are known within the company, up to the CEO himself.

_01 | Facebook says its rules apply to everyone. Company documents reveal a secret elite that is exempt

By Jeff Horwitz

Mark Zuckerberg has said Facebook allows its users to speak on equal footing with the political, cultural and journalistic elite, and that its standards apply to everyone. Privately, the company has built a system that exempts high-profile users from some or all of its rules. The program, known as "Cross Check" or "XCheck," was intended as a quality-control measure for high-profile accounts. Today it shields millions of VIPs from the company's normal enforcement process, the documents show. Many abuse the privilege, posting material, including harassment and incitement to violence, that would ordinarily result in sanctions. Facebook says the criticism of the program is fair, that it was built for a good purpose and that the company is working to fix it. (Listen to a related podcast.)


_02 | Facebook knows Instagram is toxic to many teenage girls, company documents show

By Georgia Wells, Jeff Horwitz, and Deepa Seetharaman

Researchers at Instagram, which is owned by Facebook, have spent years studying how the photo-sharing app affects its millions of young users. Repeatedly, the company found that Instagram is harmful to a sizable percentage of them, most notably teenage girls, and more so than other social-media platforms. In public, Facebook has consistently played down the app's negative effects, including in comments to Congress, and it has neither published its research nor made it available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects are not widespread, that the mental-health research is valuable and that some of the harmful aspects are not easy to address. (Listen to a related podcast.)


_03 | Facebook tried to make its platform a healthier place. Instead, it got angrier.

By Keach Hagey and Jeff Horwitz

Facebook made a publicly announced change to its algorithm in 2018 to improve its platform and to stem signs of declining user engagement. Mr. Zuckerberg said his goal was to strengthen bonds between users and improve their well-being by encouraging interactions between friends and family. Within the company, the documents show, employees warned that the change was having the opposite effect: it was making Facebook, and those who used it, angrier. Mr. Zuckerberg resisted some of the fixes proposed by his team, the documents show, because he worried they would lead people to interact with Facebook less. In response, Facebook says any algorithm can promote objectionable or harmful content and that the company is doing its best to mitigate the problem.


Copyright © 2021 Dow Jones & Company, Inc. All rights reserved.
