When Chad Sabora started working in harm reduction, he worked out of his car on the streets of St. Louis, Mo. Sabora’s beat-up sedan was a familiar sight in neighborhoods frequented by people who use drugs. An attorney and former prosecutor in Chicago, Sabora had been in recovery for years and had experienced addiction firsthand. Based on decades of research and his own experience, he knew that sterile syringes prevent the transmission of infectious disease, that naloxone saves lives by reversing overdoses, and that a well-timed pep talk or caring gesture can profoundly help someone in the throes of addiction. He took a boots-on-the-ground approach to helping others in his hometown.
As America’s unprecedented overdose crisis became a national issue, Sabora thought about ways to scale up his operation. Like many people, he took to social media, where he tried spreading the gospel of harm reduction and sharing simple strategies to help people survive substance use disorder. Never use alone. Carry naloxone. Use new syringes. There are millions of drug users and people with addiction online. Tragically, more than 200 people die from drug overdoses every day in America, and over 100,000 Americans died in the past year alone. But on Facebook, Sabora felt something was keeping him from reaching the masses. Then he noticed his posts ran afoul of the almighty algorithm.
“I’ve been put in time-out just for posting about naloxone,” Sabora said. When he created educational posts about the risks of illicit fentanyl, teaching people how to use fentanyl test strips, his account would be disabled. He realized that merely mentioning drugs got his account dinged by Facebook’s automated content censors, which are meant to curb drug sales on social media platforms. The algorithm couldn’t distinguish his content from a suspected drug dealer’s; it flags and suppresses particular words, phrases, and speech patterns. Entire groups of harm reduction activists have disappeared, along with scores of informational posts and threads. Some accounts have been banned for life.
Sabora was confident he could use social media to make a difference and help educate people about harm reduction. Instead, he found himself silenced by the platforms’ censors.
An obscure law known as Section 230 shields social media companies from liability for questionable content generated by their users. Naturally, some politicians and activists are calling to rewrite Section 230 to push tech giants to do a better job of moderating the content users post. While there is a credible argument for doing so, we must also be careful. Rewriting Section 230 could backfire. Instead of ending online drug sales, new rules could further censor activists like Sabora who are trying to use social media to save lives during an overdose crisis. Congress must be cautious when crafting content moderation regulations around substance use disorder, as companies are likely to shut down …….