Social Media

Facebook's potential reckoning: Social media platforms should be held responsible for the damage they do.


In mid-September, the Wall Street Journal published leaked internal Facebook documents about the harmful effects of Instagram on teenage girls. According to the internal report, the app increases the prevalence of body image problems and suicidal thoughts in teenagers. The company even planned to launch an Instagram for kids to attract more users, which it recently abandoned in the face of the scandal. What a shocker.

Social media platforms have become so dangerous in large part because of the algorithms that determine what content users see. Facebook's algorithms prioritize content that drives engagement, which is often achieved through outrage: provocative posts draw users to comment and interact. The result is the prioritization of dangerous content.

These algorithms radicalize people into extremism and contribute to the spread of misinformation. They fueled QAnon, the anti-vax movement, and the electoral conspiracies that led to the January 6 insurrection. Facebook has also been used as a tool by authoritarian governments around the world: promoting right-wing extremist campaigns in the US and Brazil, inciting genocidal violence in Ethiopia and Myanmar, enabling espionage and surveillance by China and Iran, and providing a platform for Russian international influence and recruitment by terrorist groups.

Facebook claims it would be irrational to use this type of algorithm because advertisers avoid association with harmful content. But the proof is in the pudding: Facebook is full of harmful content, and advertisers keep coming back.

In addition to its own platform, Facebook owns Instagram and WhatsApp, and the company has used predatory practices in the past to dominate the social media market.

Since the leak, former Facebook product manager Frances Haugen has come forward as a whistleblower in a 60 Minutes interview and testified before Congress that the company prioritized profit over the well-being of its users and deliberately hid the harm caused by its platform.

Unfortunately, Facebook has no incentive to put its users ahead of profits. The company is shielded from liability by Section 230 of the Communications Decency Act of 1996, which states that online intermediaries cannot be held responsible for information posted on their platforms. This means that Facebook can regulate itself. Clearly, the social network is not doing enough to address the widespread human and drug trafficking on its platform, and it leaves much of the misinformation, cyberbullying, and violence accessible.

On October 4, the day after Haugen's 60 Minutes interview and the day before her testimony to Congress, Facebook and all of its subsidiary platforms suffered a major outage lasting about six hours. A few days later, there was another outage.

At times, this story felt like a well-choreographed soap opera. Haugen timed her interview perfectly to make sure people knew about the documents. Then Facebook went down, highlighting our society's reliance on its apps, and its shortcomings were laid bare before the government the next day.

Still, the global outage revealed the public benefit Facebook provides. WhatsApp, Facebook's encrypted messaging platform, is essential for communication in Brazil, India, and Indonesia, and is vital in war-torn Afghanistan and Syria. Small businesses around the world rely on Facebook and its apps, with many reporting lost revenue due to the outage. For some in developing countries, Facebook even serves as their only connection to the Internet.

Facebook's services are undeniably vital around the world. According to Haugen's testimony, the company and its CEO and chairman, Mark Zuckerberg, have the power to maintain and exploit people's dependence on those services.

Controversy is nothing new to Facebook. In 2018, it was revealed that Facebook had allowed the British firm Cambridge Analytica to harvest the data of millions of users without consent, data that was then sold to right-wing campaigns in the US and UK in 2016. The scandal showed that the users themselves were Facebook's product.

Since then, the company has tightened data security, but it continues to collect, use, and share people's data with third parties for targeted advertising. Although the scandal exposed a massive invasion of privacy and many users threatened to quit in response, Facebook escaped with only a small fine and no drop in its monthly active users.

We grew up with the advent of social media and it’s a staple in our lives that makes it hard to step back. Social media is an important way to connect and share information. Even on campus, many student organizations and university departments use Instagram to share events and information. Our relationship with social media became even clearer during the pandemic when online platforms became the only way to connect with one another.

Despite the many grievances against it, Facebook is an omnipresent part of our lives. Its policies harm people and nations in tangible ways and shape discourse around the world.

Even so, there has been little drive to change things in the US due to lobbying and a lack of consensus on a solution. Global legislation is mainly aimed at increasing accountability through self-regulation, with the notable exception of the European Union's General Data Protection Regulation, introduced in 2018 in response to the Cambridge Analytica scandal, which regulates the storage and use of personal data in the region.

We need to ditch Facebook's platforms in a meaningful way if we are to see real and lasting change. Otherwise, this cycle of power and influence will continue unchecked for generations.

Social media is a useful tool that connects people around the world, but it shouldn't come at the expense of our humanity.

Ruhika Chatterjee is a junior from Princeton, NJ, studying molecular and cell biology.
