
Facebook whistleblower says transparency is needed to fix social media issues


Former Facebook employee and critic Frances Haugen answers questions during a hearing of the Communications and Technology Subcommittee of the U.S. House Energy and Commerce Committee on Capitol Hill in Washington, U.S., December 1, 2021. REUTERS/Elizabeth Frantz


Dec 3 (Reuters) – A deeper investigation into Facebook's lack of controls to prevent misinformation and abuse in languages other than English is likely to make people "even more shocked" at the possible harm the social media company is causing, whistleblower Frances Haugen told Reuters.

Haugen, a former product manager at Meta Platforms Inc's (FB.O) Facebook, spoke at the Reuters Next conference on Friday.

She left the company in May, taking thousands of internal documents that were leaked to the Wall Street Journal. That resulted in a series of articles in September detailing how the company knew its apps helped spread divisive content and harm the mental health of some young users.


According to internal documents and Reuters interviews with former employees, Facebook also knew it had too few employees with the language skills needed to identify objectionable posts from users in a number of developing countries.

People who use the platform in languages other than English are using a "raw, dangerous version of Facebook," Haugen said.

Facebook has repeatedly said it disagrees with Haugen's characterization of the internal research and is proud of the work it has done to stop abuse on the platform.

Haugen said the company should be required to disclose which languages are supported by its technical safety systems; otherwise, "Facebook will … do what is necessary to minimize the PR risk."

The internal Facebook documents published by Haugen have also raised new concerns that it may have failed to take action to prevent the spread of misleading information.

Haugen said the social media company knew it could introduce "strategic points of friction" to slow users down before they re-share posts, for example by requiring users to click a link before they can share the content. However, she said the company avoided taking such measures in order to protect its profits.

Such measures, which prompt users to reconsider sharing certain content, could be helpful because there are many risks in allowing technology platforms or governments to determine which information is true, internet and legal experts said during a separate panel at the Reuters Next conference on Friday.

"When you regulate speech, you are giving states the power to manipulate speech for their own ends," said David Greene, civil liberties director at the Electronic Frontier Foundation.

The documents published by Haugen have led to a number of hearings in the US Congress. Adam Mosseri, head of Meta Platforms’ Instagram app, will testify next week about the app’s effect on young people.

When asked what she would say to Mosseri at the hearing, Haugen said she would ask why the company had not published more of its internal research.

“We now have evidence that Facebook has known it harmed children for years,” she said. “How should we trust you in the future?”

To watch the Reuters Next conference, please register here https://reutersevents.com/events/next/


Reporting by Sheila Dang in Dallas; Editing by Matthew Lewis

Our Standards: The Thomson Reuters Trust Principles.
