Social media platforms like Facebook “have played an important role in exacerbating the political polarization that can lead to such extremist violence,” according to researchers at New York University’s Stern Center for Business and Human Rights.
This may not seem like a surprising conclusion, but Facebook has long tried to downplay its role in encouraging division. The company says existing research shows that “social media is not the primary driver of harmful polarization.” But in their report, NYU researchers write that “research narrowly focused on the years since 2016 suggests that the widespread use of major platforms has exacerbated partisan hatred.”
To support this conclusion, the authors point to a number of studies examining the links between polarization and social media. They also interviewed dozens of researchers, including at least one senior Facebook figure: Yann LeCun, the company's chief AI scientist.
Although the report is careful to note that social networks are not the “original cause” of polarization, the authors say Facebook and others have “amplified” it. They also note that Facebook’s own attempts to reduce divisions, such as de-emphasizing political content in the News Feed, show that the company is aware of its role. “The company’s insight into polarization likely would have been more productive had its top executives not publicly questioned whether there is any connection between social media and political division,” the report says.
“Research shows that social media is not the primary driver of harmful polarization, but we want to help find a solution to it,” a Facebook spokesperson said in a statement. “That’s why we are constantly and proactively detecting and removing content (such as hate speech) that violates our community standards and working to stop the spread of misinformation. We reduce the reach of content from sites and groups that repeatedly violate our policies, and connect people with reliable, credible sources for information on issues such as elections, the COVID-19 pandemic, and climate change.”
The report also notes that these problems are difficult to study, let alone solve, “because companies refuse to reveal how their platforms work.” Among the researchers’ recommendations is that Congress force “Facebook and Google / YouTube to share data on how algorithms rank, recommend and remove content.” Platforms that release such data, and the independent researchers who study it, should be legally protected as part of that work, they write.
In addition, Congress should “authorize the Federal Trade Commission to develop and implement an industry code of conduct” and “provide research funding” for alternative business models for social media platforms. The researchers also propose several changes that Facebook and other platforms could implement directly, including adjusting their internal algorithms to de-emphasize polarizing content and making those changes more transparent to the public. Platforms should also “double the number of human content moderators” and make them all full employees, so that moderation decisions are more consistent.