Facebook Oversight Board: First Decisions Published

*This is an AI-powered machine translation of the original text in Portuguese.*

As previously discussed here at JOTA on several occasions (some examples here, here, and here), Facebook has a body that reviews its content decisions, known as the "Facebook Oversight Board." According to its Rulebook (available here), the Oversight Board was created to "make binding decisions, independently and based on principles." These decisions concern "content that Facebook and Instagram may allow or remove from their platforms, based on respect for freedom of expression and human rights."

This purpose is directly aligned with the UN Guiding Principles on Business and Human Rights (UN BHR Principles), and the Board's decisions on the merits reflect both the platforms' internal rules and international human rights treaties, as well as pronouncements from authoritative interpreters of these norms, such as the UN Human Rights Committee.

The use of international treaties to regulate the conduct of companies, and especially of a company of Facebook's magnitude, has various consequences. As recognized, mutatis mutandis, by the United Nations General Assembly (UNGA), the set of rules applicable to companies is already well developed, but there is still much to learn about the consequences of applying the UN BHR Principles in the technology sector (UN Document A/75/212, ¶¶ 96-97). This is because, even with the UN's endorsement, the UN BHR Principles still have limited application.

In this scenario, due diligence regarding human rights impacts depends on multiple factors, including the business activity being conducted, the size of the company, the context in which the activity is carried out, and the potential extent of the human rights harm that may be caused (UN Document A/75/212, ¶ 41).

Thus, it is part of companies' obligations to "identify and anticipate ways in which the company's operations, products, or services impact existing social tensions and the relationships between various groups, and/or create new tensions or conflicts" (free translation) (UN Document A/75/212, ¶ 48), but the actions to be taken are "extremely context-dependent" (free translation) (UN Document A/75/212, ¶ 65).

On January 28, 2021, the Oversight Board published decisions in its first six cases. With them, we have a first opportunity to understand the rationale the Board will adopt in overseeing the actions of Facebook and Instagram.

Case Decision 2020-006-FB-FBR

The Board overturned Facebook's decision to remove a post that questioned France's health strategy and suggested the use of unproven drugs against COVID-19. While Facebook removed the post on the grounds that it "contributed to the imminent risk... of physical harm" (free translation), the Board found Facebook's rules too vague on this point and suggested the creation of a specific rule for health-related misinformation.

Case Decision 2020-005-FB-UA

The Board overturned Facebook's decision to remove a post in which the user incorrectly attributed a quote to Joseph Goebbels, the Minister of Propaganda of the Nazi regime. The quote stated, broadly, that one should appeal to emotions rather than intellectual arguments. The user had used the phrase to talk about Donald Trump two years earlier and reshared the post through Facebook's "Memories" function. Once again, the Board found that Facebook's terms were too vague, making it impossible for users to have clarity about the applicable rules.

Case Decision 2020-004-IG-UA

The Board overturned Facebook's decision to remove a post by a user in Brazil who had published a photo on Instagram with a caption in Portuguese, aiming to raise awareness of breast cancer during Breast Cancer Awareness Month. The post contained photos of the breasts and nipples of several women showing symptoms of breast cancer, accompanied by descriptions of the respective symptoms. The post was removed by Instagram's automated system for detecting nudity and sexual activity, a decision the Board deemed erroneous. After the case was selected, Facebook restored the post to the platform, but the Board considered it still worthwhile to review the case. The decision concluded that an exception to the nudity rules applied, linked to the disease awareness campaign.

Case Decision 2020-003-FB-UA

In its first six cases, the Board confirmed only one decision by Facebook.

This occurred with respect to a November 2020 post in which the user shared historical photos of a church in Baku, Azerbaijan. The text fueled a historical dispute between Azerbaijan and Armenia and was published during an armed conflict between the two countries. Specifically, the user suggested that Armenians had built the region and that Azerbaijan was constantly destroying its cultural heritage. The post was viewed more than 45,000 times. The Board upheld the removal based on Facebook's hate speech rules.

Case Decision 2020-002-FB-UA

The Board overturned Facebook's decision to remove a post containing two photos of a Syrian child of Kurdish ethnicity who drowned while trying to reach Europe in September 2015.

The photos were accompanied by derogatory comments about Muslims, suggesting that cases of extremism had reduced empathy for the Syrian child who died. Although the Board recognized that the comments could be considered offensive, they did not reach the level necessary to qualify as hate speech.

Case Report 2020-001-FB-UA

Although the case was not decided on the merits, the Board clearly addressed one of the limits of its decision-making power. The publication in question was a screenshot of two tweets by the former Prime Minister of Malaysia, Dr. Mahathir Mohamad, commenting on acts of violence by Muslims in France. The publication contained no description and merely reproduced the Prime Minister's comment. The decision not to rule on the case was based on the fact that the content under review was a comment on that publication; with the deletion of the original post, the case became moot.

Other cases are ongoing, including Facebook's decision to indefinitely suspend the accounts of the former U.S. President. The Board will continue to act and to oversee Facebook's decisions, and its first cases already indicate a movement toward bringing greater clarity to the rules applicable on its platforms. The Oversight Board's agenda can be followed on its dedicated page, which is constantly updated with new cases selected for review and recent decisions.


Originally published in JOTA.
