Facebook's role in the conflict in Ethiopia has attracted attention

Facebook's Oversight Board issued a ruling calling on the platform to commission an independent assessment of its role in heightening the risk of violence in Ethiopia, as part of a broader decision on a post containing unverified allegations against Tigrayan civilians.

The ruling came a year after the start of the civil war between the Ethiopian government and rebels in the northern Tigray region.

This war has created a humanitarian crisis that has left hundreds of thousands of people in conditions approaching famine. It has also forced millions of people from their homes.

Facebook has been criticized for its role in the Ethiopian conflict, with observers drawing comparisons to the company's role in the genocide of Rohingya Muslims in Myanmar.

The Burmese military's online campaign inflamed hatred of the Rohingya minority, fueling massacres and ethnic cleansing.

Similar rumors and calls for violence have been allowed to spread in Ethiopia, despite warnings from several Facebook employees about the risks.

The Oversight Board warned the company about the dangers of letting hate speech and unverified information spread freely in conflict zones.

The board examined a post in Amharic from an Ethiopian Facebook user alleging that the Tigray People's Liberation Front (TPLF), with the assistance of Tigrayan civilians, committed killings, rape, and looting in Raya Kobo and other towns in the country's Amhara region.

The watchdog wrote in its statement that the user cited unnamed earlier reports as the source of the allegations but did not identify them or provide any evidence to support the claims. It added that rumors alleging an ethnic group's complicity in mass atrocities are dangerous and significantly increase the risk of imminent violence.

The post was initially flagged by Facebook's automated content moderation systems and removed after the platform's Amharic-language review team found that it violated the rules on hate speech. When the case was escalated to the Oversight Board, Facebook reversed its decision and restored the content.

Oversight Board criticizes Facebook for restoring the content

The Oversight Board overturned the platform's decision to restore the post, finding that it violated Facebook's rules against violence and incitement, which take precedence over the hate speech rules the platform had initially cited.

The board raised concerns about the spread of unfounded rumors in areas of ongoing violence such as Ethiopia, saying such rumors could lead to grave atrocities, as happened in Myanmar.

Whistleblower Frances Haugen has blamed Facebook's algorithms for stoking ethnic violence in countries such as Myanmar and Ethiopia, saying the company has failed to adequately manage one of the platform's biggest risks.

Haugen told Congress in October that what we see in Myanmar and what we see now in Ethiopia are only the opening chapters of a story so terrifying that no one wants to read the end of it.

The Oversight Board also called for an independent human rights assessment of the role of Facebook and Instagram in exacerbating the risk of ethnic violence in Ethiopia, and of the platforms' ability to moderate content in local languages.
