The Facebook whistleblower reveals conflicting motivations in the company’s approach to news content

Is Facebook bad for society, and does the company knowingly amplify division and anger in order to increase usage and profits?

This is a key question that has persisted over the past few years, especially since the 2016 U.S. election. And now we have some insight into Facebook’s own thinking on the subject – over the last two weeks, The Wall Street Journal has reported on a series of internal studies, and Facebook executives’ responses to them, leaked by a former Facebook employee seeking to expose the company’s inaction in addressing key shortcomings in its design.

Last night, CBS revealed that the leaker is former employee Frances Haugen, an algorithmic design expert who worked on Facebook’s civic integrity team before it was disbanded ahead of the 2020 U.S. election. According to the information shared by Haugen, Facebook has indeed knowingly avoided addressing the worst aspects of its platform, due to the impact that such moves could have on usage, and thus on profits.

And while Facebook has disputed Haugen’s claims, her statements align with what many previous reports have suggested, underlining key concerns about the societal influence of Zuckerberg’s social giant.

Haugen’s central claim is that Facebook has knowingly overlooked or downplayed the findings of its own research, in favor of maintaining usage and user engagement.

As Haugen explained:

“What I saw over and over again at Facebook was a conflict of interest between what was good for the public and what was good for Facebook. And Facebook repeatedly chose to optimize for its own interests, such as making more money.”

Which, to some degree, makes sense – Facebook is, after all, a business, and as such, it is driven by profit and by maximizing value for its shareholders.

The problem, in Facebook’s case, is that it operates the largest interconnected human network in history, encompassing almost 3 billion users, many of whom rely on the app to stay informed and to gather key insights into the news of the day. As such, it holds significant power to influence opinion.

This means, as Haugen notes, that its decisions can have major impacts.

Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger they’re exposed to, the more they interact, and the more they consume.

Indeed, among the various findings highlighted in Haugen’s ‘Facebook Files’ – the thousands of internal documents she effectively smuggled out of Facebook HQ – are suggestions that Facebook has:

  • Overlooked the prevalence and impact of hate speech on its platforms, because such content also drives greater user engagement
  • Downplayed the negative impacts of Instagram on younger users, despite findings showing that the platform amplifies negative body image
  • Failed to address major concerns around Facebook usage in developing regions, in part due to cost-benefit analysis
  • Failed to address the spread of anti-vaccine content

Again, many of these elements have been widely reported elsewhere, but Haugen’s files provide direct evidence that Facebook is indeed well aware of each of these concerns, and has, at times, opted not to act, or not to take significant countermeasures, primarily because doing so would conflict with its business interests.

Facebook’s PR team has been working hard to counter these claims, providing point-by-point responses to each Facebook Files report, and noting that the very existence of these research reports shows that Facebook is working to address such issues and combat these problematic elements.

Facebook points to various changes it has made within Instagram to provide users with more protection and control options, while it is also working to improve its algorithms to limit exposure to divisive content.

But at the same time, Facebook has sought to downplay the broader impacts of these concerns.

Facebook’s Vice President of Policy and Global Affairs, Nick Clegg, for example, dismissed the suggestion that Facebook played a key role in fueling the post-election protests at the Capitol building.

“I think the assertion [that] January 6th can be explained because of social media – I just think that’s ludicrous.”

Clegg’s view is that Facebook is only a small part of a much broader societal shift, and that it simply cannot be the key driver of such major division across so many regions.

It’s impossible to know the full extent of Facebook’s impact in this respect, but based on Haugen’s files, it is clearly a key contributor.


Anger is the emotion that elicits the biggest response, and the most engagement, and Haugen essentially claims that Facebook profits from this by facilitating the spread of hateful content, which then, as a byproduct, intensifies division.

“When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

There are two sides to this, and both can be equally accurate. One is, as Haugen notes, that Facebook has a clear financial motivation to facilitate the spread of hateful content, which drives greater engagement among its users while also worsening societal division – which, at Facebook’s scale, can have significant impacts.

On the other hand, as Facebook notes, such research is not conducted for nothing. If the company were turning a completely blind eye to these issues, the studies would not exist at all, and while Zuck and Co. may not be taking as much action as all parties would like, there is evidence to suggest that the company is working to address these concerns, albeit in a more measured way that, ideally, also reduces the impact on its business.

The question is, should ‘business impact’ be taken into account in such consequential decisions?

Again, Facebook operates the largest interconnected network of people in history, so we can’t know what the full impacts of its algorithm-driven content amplification might be, because we have no other example to refer to – there is no precedent for Facebook and its broader influence.

In some ways, Facebook, given its scale and influence, should arguably be treated as a public utility, which would then change the company’s motivations – as Haugen notes:

“No one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger they get exposed to, the more they interact, and the more they consume.”

Essentially, this is the key issue – we now have a situation in which one of the primary means of information distribution is incentivized not to keep people reliably informed, but to provoke as much engagement as it can. And the way to do that is to trigger an emotional response, with hatred and anger being among the most powerful motivators for prompting people to react.

According to research, almost a third of adults in the United States regularly access news content on Facebook – which means that at least 86 million Americans are getting direct insight into the latest developments from a platform that has a clear motivation to show them emotionally charged takes on every divisive issue.

