Strengthening media literacy to win the battle against misinformation
(Illustration from iStock / axel2001)
The spread of misinformation, whether deliberate or unintentional, remains pervasive despite intense public attention, manifesting most recently in false claims about COVID-19 vaccines, the Capitol riot, and many other topics. This “infodemic” polarizes politics, endangers communities, weakens institutions, and leaves people unsure of what to believe or whom to trust. It threatens the very foundations of democratic governance, social cohesion, national security, and public health.
Misinformation is a long-term problem that requires long-term, sustainable solutions as well as short-term interventions. We have seen a number of faster, technological fixes aimed at improving the social media platforms that serve up information. Companies like Facebook and Twitter, for example, have adjusted their algorithms or flagged problematic content. We have also seen slower, human-centered approaches that make people smarter about the media they encounter online. Evidence-based education programs, for example, have helped people better assess the reliability of information sources, distinguish fact from opinion, resist emotional manipulation, and act as good digital citizens.
It hasn’t been enough. If we are to stop misinformation and its insidious effects, we must radically expand and accelerate our counterattacks. That will take all sectors of society: corporations, nonprofits, advocacy groups, philanthropists, researchers, governments, and more. We also need to rebalance our efforts. For too long, too many resources and debates have focused on changing technology rather than educating people. Emphasizing the supply side of the problem without comparable investment on the demand side is an inefficient use of time and energy.
Although technology-centric, self-regulatory solutions – filtering software, artificial intelligence, modified algorithms, and content labeling – can deliver changes quickly and at scale, they face significant ethical, financial, logistical, and legal constraints.
For a start, social media business models thrive on engagement, which rewards emotionally charged, free-flowing content. Tech leaders such as Facebook founder Mark Zuckerberg are reluctant to act, citing free-speech concerns, and have tried to avoid political debates until pressured. When they do act, they are criticized for inconsistency. In addition, research shows that some of the most widely used methods of combating misinformation on social media – such as fact-checking banners – have little effect on whether people believe deliberately misleading news, and some even backfire. And because people often have a deep-seated desire to share what they know with others – especially information that seems threatening or exciting – tech companies can only go so far in regulating content. There is also the challenge of volume: tech platforms struggle to keep up with the many forms and producers of disinformation. Stopping them is akin to an endless, high-stakes game of Whac-A-Mole.
Faced with these challenges, we need to invest more in people-centered solutions that improve people’s media and information literacy. Not only do such solutions have deeper and longer-lasting effects; they may also be easier and cheaper to implement than is generally believed.
Research by the RAND Corporation and others shows that media and information literacy improves critical thinking, awareness of media bias, and the desire to consume quality news – all of which help reduce misinformation. Even brief training can improve media literacy skills, including a better grasp of news credibility and a more robust ability to assess bias. Media literacy has a stronger influence than political knowledge on people’s ability to judge the accuracy of political messages, regardless of their political views. Digital media literacy training has reduced the perceived accuracy of false stories, and it remains effective when delivered in different formats and by different groups.
Media literacy training also has lasting effects. A year and a half after completing a program run by IREX (the nonprofit where the authors work), adults were still 25 percent more likely to distinguish disinformation from objective reporting and 13 percent more likely to check multiple news sources. In Jordan and Serbia, too, participants in IREX training courses improved their media literacy skills by up to 97 percent.
Media literacy programs can also be delivered inexpensively and at scale through schools. Finland and Sweden introduced media literacy into their education systems decades ago with positive results, and Ukraine is starting to do the same. In the UK, young people who received training in schools showed measurable improvements in media literacy.
Critics may argue that improving people’s media literacy and other people-centered solutions are resource-intensive and will not address the problem quickly enough or at sufficient scale. These are real challenges, but the long-term effectiveness of such programs is exactly what the never-ending battle against misinformation demands. We need to invest more in them even as we continue to pursue technological solutions, or we may never create and sustain the well-informed citizenry that healthy democracies require.
These efforts require cooperation across all sectors of society, around the world, to fully understand and address the problem. We need nonprofits and advocacy groups to sound the alarm for the people they serve. We need philanthropists to increase funding and scale up solutions. We need more researchers to provide evidence-based answers about the full extent of the problem and the effectiveness of countermeasures. We need governments to integrate media literacy standards into schools and create incentives for training. And we need technology companies to do more than optimize their platforms – they must also invest in training their users.
The tools to blunt the power of misinformation are in our hands, but we must work smarter and faster or risk losing an increasingly intense battle. There is still much learning, coalition-building, scaling, and communicating to do to pull ourselves out of “information bankruptcy.” The solutions are complex, but within our reach. And the consequences of inaction are catastrophic: the ever deeper and more invasive destabilization of our societies and our daily lives as lies trample the truth.