

Restrictions on freedom of expression on social media could undermine harm reduction and addiction recovery efforts


When Chad Sabora started doing harm reduction work, he operated out of his car on the streets of St. Louis, Mo. Sabora’s battered car was a familiar sight in neighborhoods hit hard by drug use. A former Chicago prosecutor, Sabora has been in recovery for years and experienced addiction firsthand. From decades of research and his own experience, he knew that sterile syringes prevent the transmission of infectious diseases, that naloxone saves lives by reversing overdoses, and that timely encouragement or a caring gesture can profoundly help someone struggling with addiction. He took a down-to-earth approach to helping others in his hometown.

When America’s unprecedented overdose crisis became a national emergency, Sabora pondered how to expand his reach. Like many people, he turned to social media to spread the gospel of harm reduction and share simple strategies that help people survive their addictions: Never use alone. Carry naloxone. Use new syringes. Millions of people who use drugs are online. Tragically, more than 200 people die of drug overdoses in America every day, and more than 100,000 Americans have died in the last year alone. But on Facebook, Sabora felt that something was preventing him from reaching the masses. Then he realized that his posts had run afoul of the almighty algorithm.

“I got a timeout just because I posted about naloxone,” said Sabora. When he created educational posts about the risks of illicit fentanyl and taught people how to use fentanyl test strips, his account was disabled. He realized that the mentions of drugs on his account had tripped Facebook’s automated content moderation, which is designed to curb drug sales on social media platforms. The algorithm could not distinguish his content from that of a suspected drug dealer. It picks up certain words, phrases, or speech patterns, which are flagged and suppressed. Whole groups of harm reduction activists have disappeared, along with countless informative posts and threads. Some accounts have been banned for life.

Sabora was confident that social media tools could make a difference and educate people about harm reduction. Instead, he was silenced by social media censors.

An obscure law called Section 230 protects social media companies from being held liable for questionable content generated by users. Some politicians and activists are calling for Section 230 to be rewritten to push tech giants to moderate what users post more aggressively. While there is a credible argument for doing so, we must also be careful: rewriting Section 230 could backfire. Instead of ending online drug sales, new rules could further censor activists like Sabora who use social media to save lives during an overdose crisis. Congress needs to draft any regulations on moderation of content related to substance use disorders with great care – companies are likely to shut down all related discussions to avoid liability.

Section 230 is a decades-old law that governs online speech and shapes almost every interaction on social media. Part of the United States Communications Decency Act of 1996, Section 230 shields social media platforms from being held responsible for the content posted by users. For example, when a QAnon group planned and carried out a treasonous riot in Washington, D.C., the website that hosted the group was immune: it could not be held liable for what people posted online. Advocates, however, have tried many times to amend Section 230 in service of their own policy goals.

Sex trafficking was the subject of the most recent and most sensitive case in which Congress rewrote Section 230. Claiming to protect children and vulnerable people from kidnapping and trafficking, advocates pressured lawmakers to pass a package of laws known as FOSTA/SESTA. That law amended Section 230 by making websites and online platforms responsible for user content that might facilitate “sexual exploitation.” Although the Justice Department warned that FOSTA/SESTA would make prosecutions of sex trafficking more difficult, it passed anyway. A catastrophe followed. Websites were raided immediately, and some sites that had served as safe spaces for sex workers to screen their clients shut down entirely. None of these measures slowed sex trafficking. In fact, the law has been applied only once, by federal prosecutors who said they didn’t really need it; they had long been able to use other pre-existing laws to prosecute sex trafficking crimes. While FOSTA/SESTA did nothing to help potential victims or apprehend traffickers, it had an immediate negative impact on another vulnerable group: sex workers.

A similar approach could harm people who use drugs and harm reduction advocates like Sabora, who are trying to spread life-saving information. Just as advocates urged Congress to rewrite Section 230 to prevent sexual exploitation, a similar campaign is underway to stop drug sales and curb America’s soaring overdose death rate. Dreadful stories of young adults buying drugs on Snapchat and TikTok abound. Some parents and advocates want Section 230 rewritten to increase social media companies’ liability for drug sales on their platforms. But efforts to curb online drug sales through Section 230 carve-outs are misguided. Without careful consideration, these reforms would jeopardize the recovery community and harm reduction advocates – and threaten to stifle productive speech that is vital to addressing the overdose crisis. The Section 230 carve-outs currently proposed could undermine access to life-saving resources, mandate the removal of broad categories of content, and push vulnerable populations, including those navigating support services, off these platforms. For criminalized communities, the risk of exploitation and harm offline is significant, and support and resources can be limited.

Harm reduction efforts – and conversations – are often nuanced and specific to the individual, with the aim of minimizing the harms of substance use. Blanket bans on content, enacted regardless of context and nuance, could punish those seeking help – and hamper legitimate, proven approaches to tackling overdoses.

Rather than broadly suppressing freedom of expression and forcing social media companies to eradicate our ability to share resources, the US government should focus its efforts on what works. To save lives, policymakers need a realistic national strategy for the overdose crisis, including evidence-based prevention, harm reduction, treatment, and support services at the community level. Don’t kill the conversation. Instead, coordinate with the platforms to identify authentic sources of support. Most of the leading platforms where these conversations take place already have clear rules prohibiting the online sale and advertising of drugs and controlled substances, and companies need to enforce those rules better. The federal government should work with online platforms to coordinate a more effective strategy for removing offenders, and with law enforcement agencies to prosecute drug traffickers.

A world of difference separates syringe exchanges from drug deals. Until our government – and our social media companies – recognize this, we will continue to lose friends, loved ones, neighbors, and family members to preventable overdoses. Not because they wanted to die. But because they were silenced – and separated from the people who were trying to help them.

Ryan Hampton is a nationally recognized recovery advocate, community organizer, and person in long-term recovery from addiction. He is the author of “Unsettled: How the Purdue Pharma Bankruptcy Failed the Victims of the American Overdose Crisis.” Follow him on Twitter: @RyanForRecovery

