Facebook is trying to dampen politics on its platform
Facebook Inc. said it is starting to reduce the amount of political content users see in their news feeds, potentially diminishing the role of the world’s largest social network in elections and civil discourse generally.
The announcement, made in a blog post on Wednesday, follows Facebook CEO Mark Zuckerberg’s statement on the company’s earnings call last month that most users wanted to see less political content. He said at the time that reducing political content would allow Facebook to “do a better job, bring people together and promote healthier communities”.
Facebook says political content is currently only 6% of what people see on the platform. It will begin experimenting immediately to reduce that amount for a small percentage of people in Canada, Brazil, and Indonesia, with testing in the US in the coming weeks.
The company said it is not deleting political content but is looking for ways to reduce exposure for users who prefer not to see it. In practice, this means that Facebook will still allow users to post about politics and argue among friends, but its algorithms will prioritize these conversations less and spread them less widely on the network, especially for people who have indicated they aren’t interested in these topics.
The company has not indicated how it will define political content.
Facebook said Wednesday that its new efforts will be gradual and will complement tools the platform already offers users, such as the ability to opt out of political ads or to ensure that content from select sources is featured high in the news feed.
The effort marks a turnabout for Facebook, which has historically embraced its role as a central actor in elections and social movements around the world. In an October 2019 speech, Mr. Zuckerberg declared social media to be a “fifth estate”, a center of civic power on par with the press as well as the executive, legislative and judicial branches.
Mr. Zuckerberg said on the company’s conference call that he was reconsidering the place of politics on the platform as part of his ongoing efforts to “bring the temperature down and discourage divisive conversations and communities.” The shift comes after a painful U.S. presidential election, which twice prompted Facebook to take so-called “break the glass” emergency measures to calm civic discourse. The first came shortly after the November election; the second, when Trump supporters stormed the U.S. Capitol on Jan. 6.
These contingency measures were designed to be temporary, but some, such as the restrictions on how quickly certain Facebook groups can grow, are now permanent.
The Wall Street Journal previously reported that Facebook’s internal research before the election concluded that the platform’s most active political groups produced a toxic brew of hate speech, conspiracy theories and calls to violence.
Depending on how far Facebook’s overhaul goes, reducing the visibility of political content could disrupt the ecosystem of online activism, publishing, and advertising that has grown around the social network, with its reported 2.8 billion monthly users. It could also revive complaints, mostly from conservatives, that the company is stifling political speech.
The ultimate definition of political content will depend on testing and user feedback, the company said.
“Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their news feed,” wrote Facebook product manager Aastha Gupta in the blog post.
Facebook said it would exempt from the tests health information from organizations it considers authoritative, as well as content from official government agencies.
This move isn’t the first time Facebook has adjusted the prominence of certain categories of content on its platforms. After years of tuning its content-recommendation algorithms to maximize the time users spent on the platform or their engagement with content, Facebook in the second half of the last decade shifted to maximizing a new metric: “meaningful social interaction”.
The result has been reduced visibility for passively consumed content – posts from businesses, brands, and the media, according to Mr. Zuckerberg – and a greater focus on material shared by a user’s family, friends, and acquaintances. Outside researchers and the company’s own researchers found that for many news publishers, the changes meant less traffic but more prominence for stories that generated strong reactions from users.
The Journal reported last year that internal research showed that Facebook’s algorithms reward purveyors of polarizing content that evokes an emotional response in users.
Ms. Gupta also said the company will publicly discuss the changes once they are made.
“As we begin this work, we will share what we are learning and which approaches are most promising,” she wrote.
Facebook faces other contentious political issues in the near future. Its new independent content oversight board is expected to rule later this year on whether Facebook erred in suspending former President Donald Trump from its platform.
Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
Copyright © 2021 Dow Jones & Company, Inc. All rights reserved.