Facebook depends on local partners to flag hateful content. In Ethiopia, those partners say they were ignored.
Weeks before the 2021 murder of an Ethiopian professor, his son and one of Facebook’s trusted partners say they warned Facebook about hateful posts threatening his life. In the wake of his murder, six Ethiopian partners tell Insider that Facebook routinely ignored their pleas to take down hateful content.
Apr 15, 2023, 10:17 AM
The first Facebook post about Professor Meareg Amare appeared five weeks before his murder.
The 60-year-old professor was known to be a pillar of his community in Bahir Dar, the capital of Ethiopia’s Amhara region. He had authored four widely used chemistry textbooks for middle- and high-school students, and sometimes appeared on local TV, where he encouraged young people to pursue the sciences.
On October 9, 2021, a post about Amare appeared on an unofficial Facebook page for Bahir Dar University staff. The user had posted the professor’s picture alongside a caption that read: “His name is Professor Meareg Amare Abrha. He is Tigrayan. We will tell you about how he was hiding at Bahir Dar University and carried out abuses.”
In the months preceding the post, Ethiopia — a country of 120 million people with five official languages — had been in turmoil as a civil war tore through Amare’s native Tigray, the country’s northernmost region along its border with Eritrea. Though an internet blackout in Tigray made it almost impossible to know exactly what was happening there, human-rights groups documented mass killing, displacement, and destruction by Ethiopian federal forces, Amhara regional forces, and other allied groups. Human Rights Watch documented ethnic cleansing by Amhara forces, with help from Ethiopian federal forces, against Tigrayans in western parts of the region. The conflict had spilled into Amhara, and Tigrayan militants were accused of executing Amhara civilians.
High-profile Tigrayans like Professor Amare were being targeted across Ethiopia.
Civil war in Tigray region
The Ethiopian government fought a brutal war in the northern region of Tigray from November 2020 to November 2022.
A map of Ethiopia highlighting the Tigray region.
Map: Chay Thawaranont/Insider
On October 10 — one day after the first post appeared — a school friend of Professor Amare’s son, Abrham Amare, called to warn him that a much longer post had gone up on Facebook. Without any evidence, it implicated Amare in the massacres of Amhara civilians and accused him of embezzling funds from the university on behalf of Tigrayan rebels. It included the neighborhood where the family lived.
The posts got hundreds of likes, and, among dozens of comments, users urged people to “get organized and clean them,” which Abrham and others understood as veiled threats.
On October 14, Abrham Amare made a report through Facebook’s public-reporting channel. He received no response.
Unbeknownst to Abrham, the posts targeting his father had also caught the attention of one of Facebook’s so-called “trusted partners.”
To stem the proliferation of hate speech and misinformation, Facebook relied on reporting from civil-society groups and individuals with linguistic and local expertise. These designated trusted partners were asked to flag posts and accounts through a special reporting channel and, at times, they had regular meetings with representatives of Meta, Facebook’s parent company, to advise them on content moderation.
In the case of Amare, the trusted partner told Insider that he flagged the posts through the special channel and again at two Zoom meetings in mid- and late October.
“There were angry people calling for harm against this individual,” that person, who requested anonymity for security reasons, told Insider. “I had an inkling of what would happen.”
Because the threats against the professor were vague, Facebook agreed only to monitor the situation, the trusted partner said.
“One of the things that Facebook said was, ‘We are not arbiters of truth,’” the partner said. “I remember asking, wouldn’t it be better if Facebook was taking down posts, [rather] than having posts stay on the platform that could hurt people.”
Insider could not independently corroborate the person’s account, which is being reported here for the first time. Facebook’s agenda for the second Zoom meeting, which Insider obtained, confirmed that the killing of another Tigrayan professor was discussed, but did not specifically mention Amare. A Meta spokesperson did not dispute the trusted partner’s claims.
The account is similar to what other trusted partners in Ethiopia told Insider: Facebook routinely allowed hateful content to remain on its platform and was slow to respond to urgent warnings.
Meanwhile, Amare insisted that he could remain in Bahir Dar.
“I have social capital that I built before you were born,” Abrham Amare remembers his father saying. “I’m an innocent person. I have nothing to hide.” Surely, a community where he’d spent four decades as an educator would not turn on him.
Even as police in Bahir Dar assured him that authorities were not seeking his arrest, people who knew Professor Amare well had become suspicious. A neighbor he’d known for many years confronted him about the allegations. The harassment became incessant. Five days before his murder, Amare resigned from his teaching post.
On November 3, 2021, Professor Amare was killed outside his home.
A group of men wearing Amhara Special Forces uniforms approached him at his front gate, according to a $1.6 billion lawsuit that Abrham Amare brought against Meta late last year. One of the men shot him twice — once in his right shoulder and once in the leg.
As the professor lay bleeding, the group of men encircled him and “chanted the same insulting slander from the inciteful Facebook post,” and kept away anyone who might have tried to bring him to a nearby hospital, the lawsuit said.
“For seven hours, he lay there dying slowly in unimaginable suffering,” it said.
Eight days later, Facebook replied at last to Abrham Amare’s complaint. Meta declined to comment on the case, saying it does not discuss pending litigation. But according to the lawsuit, Facebook informed him that the post about his father violated its community standards and had been removed.
In fact, the post that had destroyed Meareg Amare’s good name and, his son believed, led to his murder, was still up almost a year later as Abrham Amare was preparing his lawsuit against Meta.
Trusted partners say warnings were ignored
Insider spoke with six current and former trusted partners from Ethiopia who said that Facebook routinely ignored their pleas to take down content that they deemed hateful or likely to incite violence. In some cases, it took four months for the company to respond at all.
One group, the Network Against Hate Speech, ended its relationship with Facebook last month and allowed Insider to review some of its communications with Facebook. At one point, the Network reached out to Facebook to say that 60% of the posts and individual accounts they had reported a month earlier remained online despite having been flagged.
Posts that appeared to celebrate the use of child soldiers remained online months after they were flagged, and the grueling work of reviewing traumatic content had left the trusted partners Insider spoke with struggling with PTSD and depression.
One member of the network said he got rid of all the knives in his kitchen because they were too triggering after watching so much violent content. A therapist he saw as part of an employee-assistance program said his response suggested he might be suffering from PTSD.
Meta doesn’t provide mental-health resources to its trusted partners, as it does for its in-house content-moderation teams, but some groups that accept Meta funding do offer psychological-support services.
“We aim to review content reported to us by Trusted Partners as quickly as we can,” Facebook said in a statement to Insider. “The amount of time it takes us to review content can depend on a number of factors, including, but not limited to, the complexity, volume of content and the format in which it is shared. We also provide training and resources for our partners, such as tailored reporting templates and an online learning platform, to help them share content in the most effective way possible.”
The complaints of the six trusted partners echoed allegations made in Abrham Amare’s lawsuit, which accused Meta of “blatant human rights violations and human suffering caused by its business decisions,” and called for changes to Facebook’s content-moderation practices.
“In a country where there was an unprecedented civil war and widespread ethnic violence, to take several months to review and take down consequential hate speech is to allow it to cause violence until it’s dealt with,” a member of the Network Against Hate Speech told Insider.
This is not the first time that Meta has faced allegations of this kind.
The slaughter of Rohingya Muslims in Myanmar in 2017 exposed how failures in content moderation could be exploited by bad actors to stir up hatred.
Human-rights groups documented how “a storm of hatred” was allowed to flourish on Facebook in the months leading up to those massacres. The following year, Facebook CEO Mark Zuckerberg said the company had brought on content reviewers to monitor hate speech in Burmese.
Meta has also faced accusations that its algorithm allows for the rapid spread of disinformation and has helped propel authoritarian leaders to power in India and the Philippines.
Meta’s global network of trusted partners now includes more than 400 NGOs in hot spots where it may not otherwise have a presence. The company often references its reliance on local experts like its trusted partners when it is accused of fanning the flames of another ethnic conflict or spreading disinformation.
But details about the program are fuzzy.
A spokesperson for Meta said that the Trusted Partner program launched in 2012, but declined to provide additional details on how many partners the company onboarded at the time or on the nature of their responsibilities. Since at least 2017, the company has collaborated with third-party fact-checkers to monitor misinformation on its platform. And in 2019, it announced a new initiative to collaborate with subject-matter experts like journalists and rights groups to fight misinformation on Facebook and the company’s other platforms.
“Our Trusted Partner Program is industry-leading and part of Meta’s broader trust and safety efforts and engagement with civil society,” the Meta spokesperson said.
In Ethiopia and elsewhere, Meta provides grant funding to partner organizations that operate independently of the social-media giant.
All six of the Ethiopian trusted partners interviewed for this story belong to one of Meta’s designated trusted partner groups, including Internews, a global media nonprofit that began working with Meta in Ethiopia in 2019, and the Network Against Hate Speech, which says it declined to accept funds from Facebook. Members of a third group asked that it not be named so as not to jeopardize the grants that Facebook made to some of their projects.
Some of the trusted partners declined to be named because they’ve faced death threats and fear for their own safety. In several cases, partners declined to be named because they had signed nondisclosure agreements with Facebook. A Meta spokesperson said that no one was asked to sign NDAs as a part of the Trusted Partner program, but that NDAs are used in relation to other types of work with Facebook.
While the Network Against Hate Speech has cut ties with Facebook, other trusted partner organizations say they fear that speaking out about the program’s deficiencies would jeopardize the vital funding they get from Meta.
“Local civil-society organizations are very weak, they accept the money because they need it to work on projects,” one Ethiopian trusted partner told Insider. “But they shut down their mouth when it comes to criticizing Facebook. They keep quiet. Just like I’m doing right now by choosing anonymity.”
The Meta spokesperson denied the claim that it “silences” partners and said it welcomes their feedback.
A ‘Band-Aid’ solution
Ethiopia, Africa’s second largest country by population, ranks near the bottom of global freedom-of-speech indexes, with limited press freedom and media literacy. The federal government, under Prime Minister Abiy Ahmed, has at times blocked internet access, especially in regions embroiled in conflict. Platforms like Facebook are often the only place Ethiopians can get information and share their views.
There are 7 million Facebook accounts in Ethiopia, and content that’s shared there is often disseminated further through Telegram and WhatsApp, an encrypted-messaging service also owned by Meta.
But even without direct access to a post or social-media platform, misinformation that originated online can spread by word of mouth, one trusted partner said. And savvy Ethiopian internet users are able to circumvent blocked sites with the use of a VPN.
The Meta spokesperson told Insider the Trusted Partner program is part of a larger Trust and Safety team at Meta and is jointly managed by the company’s Content Policy and Global Operations teams. The Content Policy team leads on the program’s strategy, develops training materials, and coordinates outreach to Trusted Partner organizations, while the Operations team is responsible for acting on Trusted Partner reports. The spokesperson said partners also work closely with regional Public Policy teams, which manage relationships with local partners.
According to Abrham Amare’s lawsuit, content posted in three languages in Ethiopia — Amharic, Tigrinya, and Oromo — is principally moderated from Nairobi, Kenya. Moderators are given “impossible targets in terms of the volume of content to be reviewed per day” and make decisions about posts within “a matter of seconds,” the lawsuit said. Since May of last year, Meta has faced multiple lawsuits related to its East African content moderators on Facebook and Instagram, who say they endured psychological trauma as a result of their work.
According to Berhan Taye, the former Africa policy manager at Access Now who has connected civil-society organizations in Ethiopia with Facebook, the larger question is why a multibillion-dollar company is relying on a network of under-resourced civil-society organizations to help moderate hate speech in crisis zones where its platforms could be fueling violence.
“It’s like putting a Band-Aid on something that needs surgery,” said Taye, who notes that trusted partners often provide Meta with the linguistic and local expertise at a fraction of what it would cost to employ such experts directly. “This is a multibillion-dollar organization that should be doing this work themselves.”
Warnings made in vain
The Network Against Hate Speech, an all-volunteer digital-rights group, began working with Facebook in January of 2021.
During their two-year-long relationship, the company chronically failed to reply to messages, left flagged material on the platform for months, and was disorganized in ways that raised security concerns for its partners, according to several of the network’s members and a review of its communications with Facebook.
The network first sat down with Meta in December 2020, at a meeting set up by Taye, and agreed to work with the platform.
For their own security, the network insisted on one ground rule. At that initial meeting, which included the company’s policy manager, the members asked to correspond with Meta through a group email address that withheld the identities of individual members. According to members who attended, Meta agreed.
Several weeks later, after several exchanges flagging content to Facebook, the network got a response from that same manager that took them by surprise: It appeared the manager didn’t remember who they were, or the agreement about how they could safely submit information to Facebook.
“As you can imagine, having direct access to Facebook staff to report possibly violating content is not an option to all users,” the manager wrote in an email. “I will need to know who comprises of the Network and some more context on your Network, in order to continue reviewing the content you share here.”
In its reply, the network explained that most of its members had been present and “well introduced” at the meeting, and that the group had “explained why we wanted to remain anonymous.” It continued: “We have received concrete threats of attack from Ethiopian social media users.”
Another trusted partner working for a different group echoed the concern that Meta appeared disorganized, especially given that communications were usually sent to a generic email for trusted partners at Facebook, rather than a single handler.
“There is a lack of clarity regarding the focal points for who to talk to about different things and who should be kept in the loop,” the person told Insider.
A Meta spokesperson confirmed that there are multiple teams at the company that engage with the Trusted Partner program.
Meta’s layoffs late last year also strained the group’s relationship with the company, as several employees its members routinely corresponded with were let go, the person said.
Meta would not comment on the general complaints that the network’s members raised with Insider, but said it aims to respond promptly when posts and accounts are flagged.
But the central issue that led the Network Against Hate Speech to end its relationship with Meta was the persistent sense that their work was in vain.
Late last year, as Ethiopian and Tigrayan forces approached a truce, the country’s other major conflict, in the region of Oromia, was heating up.
In October, the network reported 367 posts and individual accounts promoting the violent activities of the Oromo Liberation Army (OLA). The group has been implicated in the killings of Amharas, according to human-rights groups. In November of 2021, Meta released an updated policy that promised to remove content supporting or praising the OLA’s violent activities.
Another major conflict in Oromia
As the Tigrayan conflict approached a truce, Ethiopia’s other major conflict in the region of Oromia was heating up.
A map highlighting the Oromia region in Ethiopia
Map: Chay Thawaranont
When the network didn’t hear back immediately, they sent their takedown requests to the public-policy manager from their initial meeting. Despite several follow-up attempts and promises from Meta that the requests would be escalated, the group did not receive a response until January 12, 2023.
“The October escalation was fully actioned,” Facebook’s reply said, without providing any details on what, if anything, had been taken down.
Several of the network’s members said, and Insider confirmed, that a majority of the posts remained online.
As of April 5, around 70% of the links the Network Against Hate Speech reported in October were still active.
At least seven posts that the network submitted, which were reviewed by Insider, contained videos showing young children with AK-47s and handguns and appeared to glorify the use of child soldiers. Many of these videos first appeared on the platform in January last year, and were still up as late as the end of February.
The October email also included hundreds of accounts promoting the violent activities of the Oromo Liberation Army. One set of images, accompanied by a smiley face and laughing emoticons, showed OLA soldiers posing with captives and photos of naked men lying dead on the ground. The profile was still live as of January 26, but has since been removed.
Such posts can be an effective recruiting tool, members of the network said.
Several other accounts that the network reported in October are still active. This includes one user, whose cover photo shows an AK-47 with an OLA flag, who routinely posts photos and videos of OLA activity, including combat scenes and children posing with guns.
Insider flagged an additional 17 posts from profiles reported by the Network Against Hate Speech to Meta. In response, Meta took down two posts and added a content warning on a third. But over a dozen links — including what appears to be a photo of a mass grave, a dead fighter, and OLA members engaged in active combat — remained online at the time of reporting, over 48 hours after the publication first brought the posts to the platform’s attention.
The Meta spokesperson confirmed that the platform had taken down two of the flagged posts and included a content warning on a third, but did not comment on whether the posts that stayed up had violated its policies.
In the end, the efforts of the partnership had come to feel futile, and the network’s members decided to pursue other avenues for advocacy against hate speech.
The Network Against Hate Speech ended its relationship with Meta on March 21, 2023.
Depression and PTSD
For the trusted partners in Ethiopia who had signed on to work with Facebook at a time of acute crisis in their country, the long response times and the pushback they often got was demoralizing.
One partner working with the Network Against Hate Speech said that he’s lost the desire to play with his kids after school or meet with friends over a beer. A few months ago, a psychiatrist told him he was exhibiting signs of depression.
“The psychological impact is huge,” the person said. “I cannot relax because I can’t ignore the many horrible videos and hate speech I’ve seen. After a while, I feel stuck there. Like I can’t talk with normal people.”
Still, he didn’t see taking a break as an option. “I try to contribute my part. When I see content, I cannot ignore it.”
After Professor Amare was killed, the trusted partner who said he flagged the posts was plagued by guilt and left wondering if there was any point to his working with Facebook.
“When I saw that he was dead, I was really angry,” the partner said.
“Hindsight is always the worst,” the person continued. “But regardless of my deficiencies, I think the platform had more responsibility.”
Multiple trusted partners in Ethiopia said hate speech is still proliferating on the platform.
Investments in growth but not protection
Often in organized episodes of mass murder, calls for violence come in coded language that would be difficult for a moderation algorithm to detect without a human interpreter. In the 1994 Rwandan genocide, for example, such euphemisms included “cut down the tall grass.”
In Ethiopia, phrases like “get organized and clean them” or “take action” appeared in posts and were understood to be calls to violence, according to Abrham Amare’s lawsuit and the trusted partners who spoke to Insider.
The partners said that Meta would frequently disagree with their assessments of what constituted a serious and actionable violation of Meta’s policies.
Meta employs a wide range of mechanisms to mitigate the harm of hate speech and graphic content on the platform, from shadow-banning posts, which quietly limits their reach, to removing content entirely. The platform rarely tells trusted partners what type of action it has taken on the posts they report.
In the wake of the massive Facebook Files leak, Frances Haugen — the whistleblower who testified before Congress in October 2021 — cited Ethiopia as a place where the algorithm’s priorities fuel misinformation wildfires.
“It is pulling families apart,” she testified. “And in places like Ethiopia it is literally fanning ethnic violence.”
Meta has consistently denied the claim that its algorithm feeds the spread of hateful content. “Our advertisers don’t want to see it, and we don’t want to see it. There is no incentive for us to do anything but remove it,” the company wrote in one blog post.
But moderating hate speech and misinformation is also one of Meta’s costliest problems, as the Facebook Files revealed, and East Africa is not an especially profitable market.
Just 10 percent of Facebook’s revenue comes from Rest of World, the accounting term used for countries outside of North America and Western Europe, which includes Ethiopia and dozens of other countries in the global south. In tech speak, Ethiopia is a low average-revenue-per-user country.
“At its core, we’re not making that much money for them,” Access Now’s Taye said. “But they know we are the next billion users.”
While Facebook is investing and bringing more users online in the global south, Taye said it isn’t investing nearly enough in the tools needed to protect those people.
“The EU can summon Mark Zuckerberg to Brussels. The UK can summon them to Parliament. The US can do it in Congress. But we have no power,” she said.
The Meta spokesperson rejected the suggestion that the company does not take hate speech concerning Ethiopia seriously.
Rafiq Copeland, a senior adviser at Internews, one of Meta’s longest-standing trusted partners globally, told Insider that the core complaints of trusted partners in Ethiopia have come up in other Rest of World countries. “These are common experiences, common frustrations,” Copeland said.
But a system for catching hate speech remains essential, and Copeland said that Meta should redesign the Trusted Partner system with input from its global partners.
“For the safety benefit it brings, the Trusted Partner program is the best value Meta is ever going to get,” he said. “It provides a channel for civil society to raise issues and share information that has a potential to save lives.
“But we’ve also had a lot of frustrations with the program,” he added.
A father’s story
Professor Amare was buried without a coffin in an unmarked grave by two civil servants. No mourners were present. It would take the help of a local NGO for his family to identify where Amare had been laid to rest.
“We weren’t allowed to cry out loud,” Abrham said. “The whole atmosphere was against Tigrayans.”
To this day, the identity of the killer is unknown.
For Abrham, one of the hardest things has been contending with the destruction of his father’s good name.
As the family hunkered down in Addis Ababa, the capital, until they were able to leave the country, Abrham got started on an online memorial — the obituary he believed his father had been denied.
Some things Abrham knew: Both a devout Christian and a man of science, his father had for years covered school fees and bought school supplies for underprivileged kids in Bahir Dar. He enjoyed traditional Tigrinya music, and on long drives he’d sing along to the music of Yemane Ghebremichael and Tsehaytu Beraki, Eritrean musicians he adored.
Professor Amare had produced 47 scholarly articles, nine of which were published posthumously by supportive colleagues and coauthors. He was due to travel to scientific conferences in Europe, Asia, and the United States in the coming years to discuss those works.
Ironically, Facebook had once been a source of happy news for the family: In early 2021, when the professor received a major promotion, he was too humble to break the news himself, and they learned of it from the university’s official Facebook account.
Abrham reached out to his dad’s friends and colleagues to learn more about the professor’s life.
“He was a nice man,” Abrham told Insider. “But sometimes being nice, being civil, costs a lot.”
Even in Addis Ababa, it seemed that everyone knew about the Facebook posts, and many people now saw him as a traitor. “I received a lot of rejections,” Abrham said, about some of the people he called to speak about his father.
In August, Abrham left Ethiopia and published his father’s memorial online. He will not be posting it on Facebook.
The media’s interest in the lawsuit has given him more opportunities to talk about the man his father was, even if some details remain deeply painful. “It gives me a chance to remember the good times,” he said during one interview.
The first hearing in his case against Meta is now scheduled to take place on April 19 in Nairobi, Kenya.
“The cycle needs to stop somewhere,” Abrham said. “I don’t want other families to suffer the same things we suffered. I want to stop these kinds of things from happening again.”
For Abrham, the case, and the media attention it’s brought, is another exercise in reclaiming his father’s legacy so that he may be remembered as those who loved him knew him.