TBILISI: Deciding its first ever cases, Facebook’s Oversight Board ruled this week that the social media company was wrong to remove four of five pieces of content the board reviewed.
The items to be reinstated include a post about COVID-19 “cures” in France, a post from Myanmar flagged as anti-Muslim hate speech, photos from Brazil that showed female nipples, and an alleged quote from Nazi propaganda minister Joseph Goebbels.
Of the five cases it took up, the board agreed to uphold one ban: a post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption that Facebook said indicated “disdain” for Azerbaijani people.
Here’s what you need to know about the Armenia and Azerbaijan post and its implications for content moderation:
What is the oversight board and how does it work?
Faced with moderating posts from more than 1.6 billion daily active users, Facebook’s staff and artificial-intelligence systems have been criticised for inconsistently applying the platform’s own rules on harmful and misleading content.
Rights groups and lawmakers say the platform is often too slow to remove harmful content like hate speech and too quick to remove posts deemed politically or culturally important but which may also be disturbing or violent.
In response, Facebook has committed $130 million to a new oversight board staffed with rights experts from around the world who will have the final word on what content is allowed.
Thomas Hughes, the board’s director, told Bread & Net, a digital rights conference, that the new body was “there to hold social media companies to account for their decision, to interrogate and dig down into cases”.
What has the reaction been to the board?
Some experts say the board is a step toward better content moderation, but it has also been criticised for moving too slowly and for not weighing in on viral misinformation before the 2020 U.S. presidential election.
There are also concerns about how one board can field cases from all over the world and account for regional context.
Hughes said: “There’s no way it could represent every region, every language, every issue,” adding that the board plans to expand to 40 members from 20, and will solicit comments from the general public and experts when deciding cases.
What is the Armenia and Azerbaijan case about?
The case revolves around a post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption asking where they had gone.
The user stated that while Armenians were restoring mosques, churches in Azerbaijan were being destroyed, using a derogatory term for Azerbaijani people and saying they had no history.
The post ended with a call to end “Azerbaijani aggression” and “vandalism”.
Facebook said the caption indicated “disdain” for Azerbaijani people and breached its hate-speech rules.
The user who posted the image appealed the decision, arguing the post was intended to highlight the destruction of cultural and religious monuments.
Why is the case so sensitive?
The post came as majority Christian Armenia and mainly Muslim Azerbaijan fought a six-week war over Nagorno-Karabakh, an enclave internationally recognised as part of Azerbaijan but mainly populated by ethnic Armenians.
Different interpretations of the region’s history have long played as backdrop to the decades-long territorial dispute, and the latest round of fighting has put a spotlight on the preservation of cultural heritage.
Photos of mosques fallen into disrepair in territories gained by Azerbaijan have caused anger in the country, while Armenians have raised concerns over the fate of churches and shrines in lands handed over to Baku’s control.
Katy Pearce, a professor of communication at the University of Washington specialising in digital content in ex-Soviet countries, said Facebook’s board likely chose the case to set a precedent for moderating content in conflicts between two groups who “do not like each other”.
Ilgar Velizade, an independent political analyst based in Baku, said hate speech between Azerbaijanis and Armenians on Facebook has grown significantly since fighting erupted in September.
“Facebook is trying to regulate the growth of frustration, reduce tension in discussions,” he said.
What impact might the case have?
Whatever the board’s verdict, it was likely to displease many without affecting what others post, said Pearce, adding that both Armenians and Azerbaijanis often see the actions taken by Facebook administrators as biased.
“Whatever the result of the case is, the ‘loser’ will complain,” she said.
“I do not think that it will have any impact on the content that Armenians and Azerbaijanis post and will likely result in people posting content more creatively to avoid being moderated.”
Velizade added that “Facebook or any other social network is unlikely to be able to influence the mood of people, which has been forming over the years under the influence of real, not virtual, circumstances.”