Lawmakers promise stricter regulations for social media platforms to combat misinformation
House lawmakers promised that tighter regulation of social media platforms is now inevitable as the CEOs of Facebook, Twitter and Google faced intense scrutiny from both Democrats and Republicans at a hearing on Thursday. The hearing aimed to address misinformation spreading on social media platforms that contributed to the January 6 attack on the U.S. Capitol.
Democratic Congressman Frank Pallone Jr., chairman of the House Energy and Commerce Committee, said Facebook, Twitter and Google "played a role in instigating riots," accusing the platforms of handing a "megaphone" to extremists who spread misinformation.
"Your business model itself has become a problem and the time for self-regulation is over," said Pallone Jr. "It is time we passed laws and held you accountable, and we will."
Pennsylvania Democratic Congressman Mike Doyle, chair of the Communications and Technology subcommittee, told Facebook's Mark Zuckerberg, Twitter's Jack Dorsey and Google's Sundar Pichai that their companies "don't protect users from the effects of their creations," and that their platforms bear responsibility for the deadly unrest at the U.S. Capitol in January.
"This attack and the movement that motivated it started and was nurtured on your platforms," said Doyle. "Your platforms have suggested groups for people to join, videos they should watch, and posts they should like, helping to drive this movement with terrifying speed and efficiency."
Doyle said lawmakers want to hold the companies accountable and called for "audit authority" over their technologies. "We're going to legislate to stop this. The stakes are just too high," said Doyle.
Florida Republican Congressman Gus Bilirakis echoed similar views, saying the committee knows "how to get things done when we get together."
"We can do this with you or without you, and we will," Bilirakis said of legislation to regulate the companies.
Lawmakers on both sides asked the three CEOs whether they should be held responsible for their roles in the January 6 attack on the U.S. Capitol, but all three avoided the question.
Facebook's Mark Zuckerberg said the people who took part in the uprising should be held responsible.
“President Trump gave a speech in which he rejected the results and called on people to fight,” said Zuckerberg. “I believe that the former president should be responsible for what he said and that the people who broke the law should be responsible for their actions.”
Zuckerberg conceded that not every piece of misinformation leading up to the attack on the Capitol was caught, but argued that the company made its services "inhospitable to those who could do harm."
Pallone Jr. and Republican Congresswoman Cathy McMorris Rodgers both criticized social media companies' use of algorithms to curate content and suggest posts for users to interact with.
"The dirty truth is that they rely on algorithms to purposely promote conspiratorial, divisive, or extremist content so they can take in more advertising dollars," said Pallone Jr. "That's because the more outrageous and extremist the content, the more engagement and views these companies get from their users, and more views mean more money."
McMorris Rodgers said the algorithms used by social platforms are detrimental to children’s mental health, adding that she doesn’t want artificial intelligence to manipulate children.
Zuckerberg fired back, saying claims that algorithms feed users content to make them angry are not true. "The division we are seeing today is primarily the result of a political and media environment that is driving Americans apart, and we must reckon with that," said Zuckerberg.
The head of Facebook also suggested some changes he would like to see to Section 230 of the Communications Decency Act of 1996. The federal law provides online platforms immunity from liability for content that others post on their websites.
Zuckerberg said it was important that the law not be repealed entirely, but urged that major platforms be required to publish regular reports on each category of harmful content and how effectively they remove it. He cautioned that changes to Section 230 could affect smaller platforms differently, since they do not have the same resources as Facebook for moderating content.
"It would be sensible to tie immunity for the larger platforms to having a generally effective system for moderating clearly illegal content," said Zuckerberg.
Ahead of the hearing, all three companies sought to highlight the work they have done over the past few months to curb the spread of misinformation and harmful content on their platforms.
Google said it removed 850,000 videos from YouTube containing dangerous or misleading medical information and blocked nearly 100 million COVID-related ads in 2020.
Facebook noted that it has directed billions of users to authoritative public health and election security sources. A Facebook spokesperson told CBS News the company removed 2 million posts containing misinformation about COVID-19 in February alone.
And Twitter stressed that it removed more than 22,000 tweets and challenged nearly 12 million accounts worldwide over misinformation related to COVID-19.