Johnsen’s experience is common in the pro-choice activist community. Most of the people who spoke to WIRED say that their content appeared to have been removed automatically by AI, rather than being reported by another user.
Activists also worry that even if content is not completely removed, its reach may be limited by the platform’s AI.
Although it’s almost impossible for users to figure out how Meta’s AI moderation is applied to their content, the company announced last year that it would place less emphasis on political and news content in users’ news feeds. Meta did not answer questions about whether abortion-related content is categorized as political content.
Just as the abortion activists who spoke to WIRED experienced varying degrees of moderation on Meta’s platforms, so did users in different parts of the world. WIRED experimented with posting the same phrase, “Abortion pills are available by mail,” from Facebook and Instagram accounts in the UK, US, Singapore, and the Philippines, in English, Spanish, and Tagalog. Instagram removed English posts of the phrase when sent from the United States, where abortion was recently restricted in some states following last week’s court ruling, and from the Philippines, where it is illegal. But a post from the US written in Spanish and a post from the Philippines in Tagalog both stayed up.
The phrase remained up on both Facebook and Instagram when published in English from the UK. When it was posted in English from Singapore, where abortion is legal and widely available, the phrase remained up on Instagram but was flagged on Facebook.
Ensley told WIRED that Reproaction’s Instagram campaigns on access to abortion in Spanish and Polish were both very successful and saw none of the problems the group’s English-language content has faced.
“Meta in particular is quite dependent on automated systems that are extremely sensitive in English and less sensitive in other languages,” said Katharine Trendacosta, associate director of policy and advocacy at the Electronic Frontier Foundation.
WIRED also tested Meta’s moderation with a Schedule 1 substance that is legal for recreational use in 19 states and for medical use in 37, posting the phrase “Marijuana is available by mail” on Facebook in English from the United States. The post was not flagged.
“Content moderation with AI and machine learning takes a long time to set up and a great effort to maintain,” says a former Meta employee familiar with the company’s content moderation practices, who spoke on condition of anonymity. “When circumstances change, you have to change the model, but that takes time and effort. So when the world is changing fast, these algorithms often do not work at their best, and enforcement can be less accurate during periods of intense change.”
But Trendacosta is concerned that law enforcement may also flag content for removal. In Meta’s 2020 transparency report, the company noted that it had “restricted access to 12 items in the United States reported by various state attorneys general related to the promotion and sale of regulated goods and services, and to 15 items reported by the US Attorney General for allegedly engaging in price gouging.” All of the posts were later reinstated. “State attorneys general are able to just say to Facebook, ‘take this down,’ and Facebook does. Even if they eventually put it back up, it’s incredibly dangerous,” Trendacosta says.