Facebook’s misinformation and violence problems are worse in India

Statements by Facebook whistleblower Frances Haugen suggest the company's problems with extremism are particularly dire in some regions. Documents Haugen submitted to the New York Times, the Wall Street Journal and other outlets suggest Facebook was aware it had fueled severe misinformation and violence in India. The social network apparently didn't have nearly enough resources to deal with the spread of harmful material in the populous country, and it didn't respond with sufficient action when tensions flared.

A case study from early 2021 found that much of the harmful content from groups like the Rashtriya Swayamsevak Sangh (RSS) and Bajrang Dal wasn't flagged on Facebook or WhatsApp, due to a lack of the technical know-how needed to spot content written in Bengali and Hindi. At the same time, Facebook reportedly declined to mark the RSS for removal due to "political sensitivities," and left Bajrang Dal (a group linked to Prime Minister Modi's party) untouched despite an internal Facebook call to take down its material. The company also had a white list of politicians exempt from fact-checking.

According to the leaked data, Facebook was still struggling to fight hate speech as recently as five months ago. Like an earlier test conducted in the U.S., the research showed just how quickly Facebook's recommendation engine surfaces toxic content. A dummy account that followed Facebook's recommendations for three weeks was subjected to an "almost constant barrage" of divisive nationalism, misinformation and violence.

As with previous leaks, Facebook said the documents don't tell the whole story. Spokesman Andy Stone argued that the data was incomplete and didn't account for the third-party fact-checkers that are widely used outside the United States. He added that Facebook has invested heavily in hate speech detection technology for languages such as Bengali and Hindi, and that the company continues to improve that technology.

The social media firm followed up by posting a lengthier defense of its practices. It said it has an "industry-leading process" for reviewing and prioritizing countries at high risk of violence every six months. It noted that its teams consider long-term issues and history alongside current events and dependence on its apps. The company added that it engages with local communities, improves its technology and continuously "refines" its policies.

However, the response didn't directly address some of the concerns. India is Facebook's largest single market, with 340 million people using its services, yet 87 percent of Facebook's misinformation budget is devoted to the United States. Even with third-party fact-checkers at work, that suggests India isn't receiving a proportionate amount of attention. Facebook also didn't address worries about its tip-toeing around certain people and groups beyond its earlier statement that it enforces its policies regardless of position or affiliation. In other words, it's not clear Facebook's problems with misinformation and violence will improve any time soon.

