The Facebook Papers: What You Need To Know : NPR
Mark Zuckerberg, CEO of Facebook, testified on Capitol Hill in April 2018. A trove of internal documents known as the Facebook Papers has brought the company backlash over its impact on society and politics. Chip Somodevilla / Getty Images
Facebook’s rank-and-file employees warned executives about the company’s impact on society and politics in the United States – and said that its inability to effectively moderate content in other countries compounded those dangers. These are two of the key takeaways from the thousands of internal Facebook documents that NPR and other news outlets have reviewed.
The documents, collectively known as the Facebook Papers, were made available to Congress in redacted form after whistleblower Frances Haugen, a former Facebook product manager, disclosed them to the Securities and Exchange Commission.
Haugen claims that the body of evidence and data shows that Facebook leaders have repeatedly and knowingly put the company’s image and profitability above the public good – even at the risk of violence and other harm.
Some of the internal documents first appeared in The Wall Street Journal last month. They include internal research findings and audits that the company conducted of its own practices.
Here are four key takeaways from the news outlets’ review of the documents:
Facebook employees hotly debated the company’s policies, especially after January 6th
When supporters of then-President Donald Trump stormed the U.S. Capitol on January 6, Facebook scrambled to take technical measures aimed at containing misinformation and content that could incite further violence. The next day, it banned Trump from the platform – at first temporarily, then permanently.
In the weeks leading up to the violence, Facebook worked to defuse vitriol and conspiracy theories from Trump voters who refused to accept his defeat. As NPR’s Shannon Bond and Bobby Allyn reported, the company repeatedly shut down groups associated with the Stop the Steal movement. But those groups attracted hundreds of thousands of users, and Facebook couldn’t keep up as the conspiracy theorists regrouped.
The post-election turmoil quickly put an end to the relief many at Facebook felt on November 3, when the U.S. election went off largely peacefully and without foreign interference.
But then came January 6th – and as the attack on the U.S. Capitol riveted and horrified audiences in the U.S. and elsewhere, Facebook employees vented their frustration and anger.
“We have stoked this fire for a long time and shouldn’t be surprised that it is now getting out of control,” wrote an employee on an internal message board, as the documents show.
“Hang in there everyone,” wrote Mike Schroepfer, Facebook’s chief technology officer, on a message board, calling for calm as he laid out the company’s response to the riot.
In response to Schroepfer’s message, Facebook employees replied that it was too little, too late.
“I came here to make change and improve society, but all I’ve seen is stunting and denial,” said one commenter, according to the documents.
In a statement to NPR, Facebook spokesman Andy Stone said Facebook was not responsible for the siege of the Capitol.
“Responsibility for the January 6th violence rests with those who attacked our Capitol and those who encouraged them,” Stone said.
Content standards have been skewed, often for fear of upsetting high-profile accounts
One of the earliest revelations from the internal documents is the detail they provide about Facebook’s separate content standards for high-profile accounts, such as those of Trump or celebrities.
During his presidency, Trump regularly made false and inflammatory statements on a variety of topics. But Facebook removed only a small handful of his posts, even when the then-president made dangerous claims such as that COVID-19 was less dangerous than the flu or that children were “almost immune to the disease”.
Facebook previously defended its approach to such controversial and misleading statements, saying that politicians like Trump should be allowed to say what they believe so the public knows what they think. Facebook CEO Mark Zuckerberg has also repeatedly insisted that Facebook is only a platform, not the “arbiter of the truth”.
But the documents suggest that Facebook’s policy of treating influential people differently – codified in a VIP system called XCheck – was in large part created to head off public relations backlash from celebrities and other high-profile users.
The overall premise of the XCheck system, the Journal’s Jeff Horwitz told NPR in September, is to “not publicly tangle with anyone who is influential enough to harm you.”
Facebook’s own Oversight Board harshly criticized the program last week, saying the company had not been sufficiently transparent about its different standards for moderating content.
A Facebook spokesperson told NPR in a statement that the company asked the board to review the program as it aims to “be clearer in our statements for the future.”
Young people perceive Facebook content as “boring, misleading and negative”
For much of the past decade, seniors have been the fastest-growing U.S. demographic on Facebook – a dramatic turnaround for a company whose founding mythos rests on the image of a hoodie-wearing coder creating a space for college kids to connect.
Over the same period, Facebook found that younger people were increasingly unlikely to join the site. It’s a worrying trend for the company – one that Facebook insiders explored in an internal presentation this year, which is reflected in the documents.
“Most young adults perceive Facebook as a place for people in their 40s and 50s,” the company’s researchers said, according to The Verge. “Young adults perceive content as boring, misleading and negative. They often have to get past irrelevant content to get to what matters.”
Along with this stumbling block, the research found that young users held negative views of Facebook because of privacy concerns and its potential “impact on their well-being,” The Verge reports.
Haugen previously disclosed a Facebook study which found that 13.5% of teenage girls surveyed in the U.K. said their suicidal thoughts became more frequent after joining Instagram.
In addition to its namesake platform, Facebook owns Instagram and WhatsApp.
“It is clear that Facebook puts profit above the well-being of children and all users,” Sen. Marsha Blackburn, R-Tenn., said during a Senate hearing earlier this month at which Haugen testified.
Facebook’s global reach exceeds its grasp
While much of the focus on Facebook in the U.S. has centered on its role in facilitating and intensifying political divisions, the documents also fault the company’s handling of its operations in numerous other countries.
The documents show that Facebook struggles to cope with the social and linguistic complexities that come with having more than 2.8 billion users worldwide. The results have been particularly dangerous and damaging in countries where civil unrest or human rights violations are widespread, the documents say.
“Two years ago, Apple threatened to remove Facebook and Instagram from its app store because of concerns about the platform being used as a tool for trading and selling maids in the Middle East,” reports The Associated Press.
The company routinely struggles with posts and comments in Arabic, both on its main platform and on Instagram, according to the documents. Arabic is one of the most widely spoken languages in the world, but its many dialects are very different from one another.
Facebook “has no one who speaks most of them or understands most of them in everyday language,” Horwitz told NPR. “And it doesn’t have a system to get content in these dialects to the right people.”
The problem extends beyond Arabic, and it cuts in more than one direction.
“In countries like Afghanistan and Myanmar, these loopholes have led to inflammatory language flourishing on the platform,” reports the AP, “while Facebook suppresses normal language and bans general words across the board in Syria and the Palestinian territories.”
When similar stories surfaced over the weekend about India and Ethiopia, Facebook responded that it has more than 40,000 people “working on safety, including global content review teams at over 20 sites around the world reviewing content in over 70 languages”.
Editor’s note: Facebook is one of NPR’s recent financial supporters.