Despite some safety changes and growing political pressure, young Instagram users can still quickly gain access to drug-related content, new research shows. Some of those accounts, in fact, appear to be actively selling illegal substances like MDMA, the party drug also known as ecstasy, according to a report by the Tech Transparency Project, a liberal tech watchdog group.
While Instagram has made efforts to curb drug-related hashtags, which remain a core element of the app’s architecture, the group’s research found it was possible to find drug content simply by omitting the hash symbol from searches. Searching for “mdma for sale,” rather than “#mdma,” turned up multiple accounts peddling the drug, the researchers say. The same tactic worked for “oxy”—slang for the opioid OxyContin—and “Xanax,” the anti-anxiety medication.
The latest research from the Tech Transparency Project follows a study by the group published in December that detailed how teens can access drug content and, in some cases, buy drugs through Instagram. Officially, of course, selling drugs isn’t allowed on Instagram, and Instagram chief Adam Mosseri reiterated the policy during congressional testimony in December. In addition to removing drug-related hashtags, Instagram has added warning prompts to drug-related searches that link to independent substance-abuse resources. Those efforts aren’t enough, says Katie Paul, the Tech Transparency Project’s director. “Instagram is opposed to actually doing something that will materially address these harms on its platform because they don’t want to cut into their bottom line,” she says, arguing that stronger content controls would reduce the amount of time users spend on the app.
The new Tech Transparency Project research highlights the thorny nature of Instagram’s dilemma. It has taken some measures to clean up its app and better protect young users, but the platform remains susceptible to misuse and rule-breaking: Instagram, seeking to police a service with roughly 1 billion monthly users, moves to quash one problem and another (or several) arise someplace else. The app has drawn particular fire from lawmakers over the past year after the Facebook Papers leak revealed internal research into the app’s effects on young people’s mental health. The research suggested it does negatively impact some teens; Instagram has since sought to discredit that internal research, saying it relied on a small sample size, among other problems.
To study how teens encounter drugs on Instagram, the Tech Transparency Project created a series of dummy accounts registered as teenage users, testing the app’s protections for minors. Congressional staff have run similar experiments and used the results to support criticism of Instagram and its parent company Meta when their executives appeared on the Hill.
The new Tech Transparency Project research also found holes in Instagram’s hashtag policies. For instance, #fetanyl was blocked, but #fetanylcalifornia wasn’t, and searching “#fetanylcalifornia” produced accounts the researchers say sold the opioid. #Xanax was blocked through a desktop search but remained searchable on mobile. In another example, “#opiates” returned no search results, but Instagram then suggested #opiatesforsale.
Here’s another place where Instagram’s algorithm worked against the app’s ostensible safety practices: When a dummy Tech Transparency Project account followed @silkroadpharma.cy—a seller of Adderall and the hallucinogen PCP, the researchers say—Instagram recommended other drug-related accounts, including @calipills_415. The latter advertised “discrete shipping” across America, according to the research.
In another instance, a dummy Tech Transparency Project account followed a purported drug-dealing account, @despasitro, and was prompted to follow another, @xanaxsubutexoxycodone. The Instagram profile picture for @xanaxsubutexoxycodone? A heart drawn in white powder next to a small plastic bag.