
America’s growing social media fake news problem, in one chart


America’s fake news problem is getting worse, not better.

According to an analysis published by NewsGuard and first reported on Tuesday by Sara Fischer of Axios, websites that peddle “unreliable news” have increased their share of social media interactions this year. In 2019, unreliable sources accounted for 8 percent of engagement with the 100 top-performing social media news sources. In 2020, that number more than doubled, to 17 percent.

NewsGuard, which rates news websites for reliability, found that people have been much more engaged with the news this year than they were last year. Interactions with the top 100 US news sources (likes, shares, and comments on Facebook and Twitter) increased from 8.6 billion to 16.3 billion between 2019 and 2020. That makes sense: 2020 produced an enormous amount of news, and pandemic-related factors like unemployment and lockdowns gave people plenty of time to read it online.

But more and more of what people engage with is problematic, inaccurate, or dubious. And that’s something to worry about. The analysis found that the Daily Wire, the outlet founded by right-wing commentator Ben Shapiro, had 2.5 times more interactions this year than it did last year.

The proliferation of false and unreliable news online is a cultural, political, and technological phenomenon that is difficult to understand, let alone manage. Conspiracy theories, misinformation, and disinformation are rife on the internet, and it is often difficult for people to tell what is true and what is not. Social media companies aren’t exactly doing a bang-up job of addressing the problem, either.

Right-wing content in particular thrives on platforms like Facebook. But just because someone sees certain content doesn’t necessarily mean they’re influenced by it, and figuring out how powerful a given piece of content is can be complicated. Over the summer, the New York Times’s Kevin Roose covered what he called the “parallel media universe” of hyper-conservative content on Facebook, which exists alongside liberal and mainstream outlets. (And just because someone likes a news item doesn’t mean they actually read it.)

As Rebecca Heilweil of Recode emphasized at the time, it is difficult to know what is happening on Facebook from engagement data alone:

There is now an ongoing debate among academics, analysts, and observers like Roose about what we can know about Facebook and why. Dartmouth political scientist Brendan Nyhan recently argued that likes, comments, and shares are only a small fraction of what people actually see on Facebook, and that it is difficult to draw conclusions from these interactions alone about what they might mean for something like an election.

Still, the trend is worrying. Social media exacerbates political polarization in America, and people often disagree on even basic facts. What people consume also shapes what they see next: someone clicks on a particular article, and recommendation algorithms start predicting what else they might like based on that click. The further down the rabbit hole they go, the more of that kind of media they are served and begin to seek out, which often ends in an information bubble.
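To make that feedback loop concrete, here is a minimal sketch of engagement-weighted ranking in Python. It is purely illustrative: the articles, scoring function, and affinity-boost rule are invented for this example and are not any platform’s actual algorithm.

# Toy sketch of an engagement-driven feed. Purely illustrative: the data,
# scores, and affinity rule are invented, not any platform's real algorithm.
from collections import defaultdict

articles = [
    {"id": 1, "topic": "politics", "engagement": 950},
    {"id": 2, "topic": "science", "engagement": 120},
    {"id": 3, "topic": "politics", "engagement": 800},
    {"id": 4, "topic": "sports", "engagement": 300},
]

affinity = defaultdict(lambda: 1.0)  # start with no topic preference

def rank_feed():
    # Rank by raw engagement, boosted by the reader's inferred topic affinity.
    return sorted(articles,
                  key=lambda a: a["engagement"] * affinity[a["topic"]],
                  reverse=True)

def click(article):
    # Each click strengthens that topic's weight, so similar content ranks
    # higher next time: the "rabbit hole" dynamic described above.
    affinity[article["topic"]] *= 1.5

for _ in range(3):  # a few clicks on whatever tops the feed...
    click(rank_feed()[0])

print([a["topic"] for a in rank_feed()])
# ['politics', 'politics', 'sports', 'science'] -- one topic now dominates

Even this crude model shows the dynamic: a handful of clicks is enough to push one kind of content to the top of the feed and keep it there.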

For all their complaints about alleged social media censorship, conservatives aren’t really being censored

Republicans have complained for years that social media companies are biased against them and that their content is being censored and removed. President Donald Trump has often made unsubstantiated claims of bias against tech companies. He and his administration have also sought to undermine and abolish Section 230, a law that essentially says social media companies are free to moderate their platforms as they see fit and are not liable for the content that third parties post on them. (Sara Morrison of Recode has a full explainer on Section 230.)

Rather than being biased toward a particular political viewpoint, social media algorithms are often biased toward outrage: they promote content that provokes an emotional reaction and is likely to drive engagement. NewsGuard’s data and other research show that people are increasingly drawn to unreliable content, and often to unreliable content with a conservative bent. And this content can shape all sorts of attitudes and sow confusion about even basic facts.

The New York Times recently looked at Georgia and the role misinformation and unreliable news have played in the US Senate runoff elections there. A conservative local news network called Star News Group announced it would launch the Georgia Star in November, and NewsGuard’s analysis found that the website was posting misleading information about the presidential election and the Senate races. One story pushing false claims about Georgia’s presidential election results reached as many as 650,000 people on Facebook.

Fighting fake and misleading news will require effort from multiple stakeholders. Yet Facebook recently rolled back changes to its algorithm that had promoted news from reliable sources. Given the rate at which the problem is growing, things are likely to get worse without intervention.
