Facebook's misinformation problem is bigger in India

Frances Haugen's leak shows that Facebook's problems with extremism are particularly serious in some regions. According to documents Haugen provided to The New York Times, The Wall Street Journal and other outlets, the company is aware that it has contributed to disinformation and extremist violence in India.

The social network appears to lack the resources to deal with the spread of harmful content in densely populated countries, and when tensions flared up, it failed to take adequate action.

A case study conducted in early 2021 showed that a great deal of malicious content from groups such as Bajrang Dal, an Indian nationalist militant group, went unreported and spread across the network and its apps because the company lacked the technical capability to detect content written in Bengali.

At the same time, it was reported that, out of political sensitivity, the company declined to designate the Rashtriya Swayamsevak Sangh, a right-wing Hindu nationalist volunteer paramilitary organization, as dangerous.

Although internal documents recommended its removal, Bajrang Dal, which is linked to Prime Minister Modi's political party, was never designated. The company also maintains a whitelist of politicians who are exempt from fact-checking.

According to the leaked data, the company has worked hard over the past five months to combat hate speech, and internal research shows how quickly Facebook's recommendation engine surfaces toxic content.

A test account that simply followed Facebook's recommendations for three weeks was exposed to a near-constant stream of divisive nationalism, disinformation and violence.

The company said the leak does not tell the whole story. Company spokesman Andy Stone argued that the data is incomplete and does not reflect the company's widespread use of third-party fact-checkers outside the United States.

He added that the company has invested heavily in technology to detect hate speech in languages such as Bengali and Hindi, and that it will continue to improve this technology.

Facebook says its corrective actions are underway

The social media company defended its approach, saying it has a mature process in place to review and prioritize countries at high risk of violence every six months.

It emphasized that its teams consider long-term and historical issues as well as current events, and added that it is working with local communities to continually improve its technology and refine its policies.

However, the response did not address some of the concerns. India is the company's largest single market, with 340 million people using its services, yet 87% of the company's total disinformation budget is devoted to the United States.

Even with external fact-checkers in place, this suggests that India has received comparatively little attention.

The company has not hesitated to turn a blind eye to certain people and groups, despite having previously announced that it would enforce its policies regardless of position or affiliation.

In short, it is unclear whether Facebook's disinformation and violence problems will improve in the near future.
