
Lawmakers see 2022 as the year social media gets reined in. Others fear politics will get in the way.


Momentum for government regulation of social media appears to be building heading into 2022, with the public supporting the effort and lawmakers optimistic about progress.

In a new Morning Consult poll, 56 percent of U.S. adults said they support government regulation of social media companies, up 4 percentage points from an October poll.

Public opinion is also negative about social media companies’ efforts to improve user safety: 61 percent said the platforms aren’t doing enough to keep users safe, compared with just 21 percent who said they are doing enough.

However, industry observers warned of a bumpy road ahead, as partisan squabbles could derail regulatory efforts. Even if lawmakers agree that something needs to be done at the national level, action could stall because the two parties disagree on the fundamental issues and how to resolve them.

“I think if there is legislation, it will be very broad to try to compromise and bridge this really big gap between the two parties,” said Ashley Johnson, senior policy analyst at the Information Technology and Innovation Foundation. “And I think that if there is very broad legislation, the internet is likely to suffer.”

Content moderation: too much or too little?

Before regulation can move forward in Congress, an important sticking point remains between the two parties on social media: how to approach content moderation.

Democrats, for the most part, believe that social media companies should do more to remove dangerous content, including misinformation and posts promoting violence and harassment. Through legislation, Democrats believe they can get social media platforms to clean up their content.

“A few years ago, the debate focused almost entirely on content moderation,” Rep. Tom Malinowski (D-N.J.), sponsor of legislation amending Section 230 of the Communications Decency Act, which protects websites from liability for user-posted content, said in an interview. “‘Should content be deleted? Is that censorship? Where do you draw the line?’ And of course there are still people who worry about that, especially on the Republican side. But when we look at regulation, we are focusing more and more on the underlying design of social networks, on how extremism and misinformation spread.”

Republicans, meanwhile, claim that conservative voices and viewpoints are being censored on social media, and say cooperation will not be possible unless Democrats come around to that view.

“Republicans are fighting for free speech while Democrats continue to push for more censorship and control,” Rep. Cathy McMorris Rodgers (R-Wash.), ranking member of the House Energy and Commerce Committee, said in an email. “Bipartisanship will not be possible until Democrats agree that we need less censorship, not more. Our hope is that Democrats will choose to defend this fundamental principle and give up their censorship demands. Only then can we come together to hold Big Tech accountable.”

Despite the apparent disagreement over the issues at stake, several lawmakers welcomed the bipartisan desire to reach agreement on regulation, particularly after two subcommittee hearings in the House Energy and Commerce Committee. A hearing before the Communications and Technology Subcommittee discussed removing social media companies’ protection from liability for user-posted content, while the Consumer Protection and Commerce Subcommittee discussed other reforms that would help build a “safer internet.”

“There is some overlap; both parties want these platforms to take some responsibility for their own actions,” Communications and Technology Subcommittee Chairman Mike Doyle (D-Pa.) said in an email.

And lawmakers agreed that the testimony of former Facebook Inc. employee Frances Haugen and the revelations in the Facebook Files and the Facebook Papers demand urgent action.

“Ms. Haugen blew the whistle for a reason: Our neighbors are in danger, and we cannot wait to take meaningful action,” Rep. Kathy Castor (D-Fla.) said in an email.

Rep. Anna Eshoo (D-Calif.) added in an email that Haugen’s “revelations and testimony make it clear we must act.” Both Castor and Eshoo have co-sponsored several bills to further regulate social media.

Meta Platforms Inc. spokespersons did not respond to requests for comment.

Section 230 changes could be a way forward

The House is debating various reforms to Section 230, including lifting social media companies’ liability shield entirely and a narrower proposal by Malinowski and Eshoo that would amend the law to hold platforms accountable when their algorithms promote content tied to offline violence. No Republicans currently sponsor or support the active legislation.

Malinowski said changes are needed to curb the deluge of content served to users that only inflames their passions, all of it recommended by opaque algorithms.

“Human nature is what it is. What keeps us glued to our screens is usually content that reinforces our pre-existing passions and beliefs and, in particular, arouses our fears and anxieties,” he said. “What I would like to see is a new approach to choosing and recommending content that is based more on what users knowingly think is good for them and for the world.”

But while amending Section 230 might offer lawmakers an opening, Adam Kovacevich, founder and CEO of the Chamber of Progress and a former Democratic adviser and policy director at Google, said the law’s provisions have encouraged companies to moderate content aggressively, since they know they cannot be held liable for doing so.

He also noted that the last changes to Section 230 came in 2018 with the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act, which carved the enforcement of federal and state sex trafficking laws out of Section 230 immunity. Even so, Kovacevich said, those laws have done little to combat sex trafficking.

The survey suggests the public would support changes to Section 230: 65 percent said they would back measures by Congress to hold social media companies at least partially liable in court for the actions of their users. Respondents showed even stronger support for protecting children on the platforms (78 percent) and similarly backed congressional measures demanding more transparency about the sites’ use of algorithms (67 percent).

Meanwhile, the public also showed strong support for actions by the social media companies themselves: 73 percent said they support stricter moderation policies for harmful content, while 69 percent wanted platforms to expand their capacity to censor and remove content that promotes misinformation, hate speech, illegal activity or violence.

Lauren Culbertson, head of U.S. public policy at Twitter Inc., signaled openness to regulating algorithms but cautioned that Congress should ensure regulation does not harm competition and should distinguish between illegal and harmful content.

“Regulation needs to reflect the reality of how different services work and how content is classified and amplified, while maximizing competition and balancing safety and freedom of expression,” Culbertson said in an email. “By and large, we agree that people need more choice and control over the algorithms that shape their online experience.”

If social media companies were to be held liable for user-posted content through changes to Section 230, 65 percent of U.S. adults said that should apply to content that is violent or glorifies violence, while 61 percent said the same of content involving harassment or bullying and of content that infringes copyrights, trademarks or other intellectual property.

And if companies were held accountable for some user-posted content, roughly half of the public believed that violent content, harassment and bullying would decline.

Actions in Florida and Texas could be followed by more “red meat” in the states

In the absence of federal action, Kovacevich said, state governments could step in, though early efforts have already run afoul of the courts.

Both Florida and Texas passed laws in 2021 that they said would curb alleged censorship of conservatives on the platforms. Despite some national support for such proposals, both states saw their efforts blocked in court before they could take effect.

In his ruling blocking the Texas law, U.S. District Judge Robert Pitman wrote that social media platforms’ content moderation is protected by the First Amendment and that, as such, platforms have “editorial discretion” over such content. Despite the lack of success so far, Kovacevich said other Republican-dominated states could try to follow suit.

“Both of those bills have now been blocked because they are clearly unconstitutional,” he said. “But Republican lawmakers may still think that passing these laws is good politics for them, that they are a good way to throw red meat to their ‘MAGA’ base, even knowing that they will be struck down.”

Another option at the state level could be regulations that encourage greater transparency in content moderation policies and require platforms to disclose their practices, statistics, and other data. Kovacevich warned that this, too, could be problematic.

“There’s an aspect of content moderation that is a bit of a whack-a-mole exercise,” he said. “Bad actors on the internet use certain terms, they use jargon, or they adapt when the rules are updated. And sometimes platforms’ content moderation practices really are best kept private, to stay one step ahead of the bad guys.”
