In June, Global Witness and Foxglove found that Meta continued to approve Amharic-language ads targeting Ethiopian users that included hate speech and calls for violence. Facebook has been implicated in spreading hate speech and inciting ethnic violence in Ethiopia's ongoing conflict.
Crider argues that Facebook needs to invest more in its moderation practices and in protecting democracy. She worries that even the threat of a ban lets the company deflect responsibility for the problems it has left unsolved.
“I think at the end of the day, the minute a regulator looks at Facebook and looks like they’re going to make them actually do something that might cost them some money, they start howling about censorship and present a false choice that it is either an essentially unmoderated and unregulated Facebook or no Facebook at all,” she says.
And Crider says there are things the company can do, including "break the glass" measures such as downgrading its heavily promoted live videos, limiting the reach of inflammatory content, and banning election-related ads in the run-up to the vote.
Mercy Ndegwa, Meta’s director of public policy for East Africa and the Horn of Africa, told WIRED that the company has “taken extensive steps to help us capture hate speech and inflammatory content in Kenya, and we are intensifying these efforts ahead of the election.” However, she acknowledged that “despite these efforts, we know there will be examples of things we miss or we get wrong, as both machines and humans make mistakes.” Meta did not respond to specific questions about the number of content moderators who speak Swahili or other Kenyan languages, or the nature of its conversations with the Kenyan government.
“What the researchers did was stress test Facebook’s systems and proved that what the company said was bullshit,” says Madung. The fact that Meta allowed ads on the platform despite a review process “raises questions about their ability to deal with other forms of hate speech,” Madung says, including the large amount of user-generated content that does not require prior approval.
But banning Meta’s platforms, Madung says, won’t get rid of misinformation or ethnic tensions because it doesn’t address the root cause. “This is not a mutually exclusive issue,” he says. “We need to find a middle ground between heavy-handed approaches and real platform accountability.”
On Saturday, Joseph Mucheru, Cabinet Secretary for Internet and Communication Technologies (ICT), tweeted, “Media including social media will continue to enjoy FREEDOM OF THE PRESS in Kenya. Not clear what legal framework NCIC plans to use to suspend Facebook. Government is on record. We are NOT shutting down the internet.” There is currently no legal framework that would allow the NCIC to order Facebook’s suspension, says Bridget Andere, Africa policy analyst at digital rights nonprofit Access Now.
“Platforms like Meta have failed completely in their handling of misinformation, disinformation and hate speech in Tigray and Myanmar,” Andere said. “The danger is that governments will use it as an excuse for internet shutdowns and app blocking, when instead it should spur companies to invest more in human content moderation and do it in an ethical and human rights-respecting way.”
Madung also worries that regardless of whether the government chooses to suspend Facebook and Instagram now, the damage may have already been done. “The effects will be seen at another time,” he says. “The issue is that the precedent is now officially out there and it can be referenced at any time.”