Finding answers to the problems identified in the latest leaks on Facebook will be a complex process

So what have we learned from the latest discovery of internal Facebook documents and research?

Well, not much, really. Former Facebook engineer Frances Haugen released an initial set of internal reports from the social network last month, highlighting a variety of issues, including its struggles in handling vaccine content, the harmful effects of algorithm changes, and the negative effects of Instagram on the mental health of teenagers.

Haugen released another series of reports this week, in coordination with various major publications, which expand on those initial claims and add more detail on various aspects. And it’s all interesting, no doubt; it all sheds light on what Facebook knows about its systems, how they can sow division and anxiety, and their wider social impact. But the findings also largely underscore what we already knew or suspected: that Facebook’s lack of local-language moderation support has led to increased harm in some regions, that its network is used for criminal activity, including human trafficking, and that Facebook may have prioritized growth over safety in some decisions.

All of this was widely known, but the fact that Facebook also knows it, and that its own research confirms it, is significant, and will lead to a whole new set of actions being taken against the social network, in various forms.

But there are some other valuable notes that we were not aware of, hidden among the thousands of pages of internal research insights.

One key element, highlighted by journalist Alex Kantrowitz, relates specifically to the controversial News Feed algorithm, and how Facebook has worked, through various experiments, to weigh concerns about content amplification.

The main solution that Haugen put forward in her initial testimony before Congress about the leaked Facebook Files is that social networks should be forced to stop using engagement-based algorithms altogether, via reform of Section 230, which, in Haugen’s view, would change the incentives for social platforms and reduce the harm caused by their systems.

As Haugen explained:

“If we had adequate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of its intentional ranking decisions, I think they would get rid of engagement-based ranking.”

But would that work?

As Kantrowitz reported, Facebook actually conducted an experiment to find out:

“In February 2018, a Facebook researcher turned off the News Feed ranking algorithm entirely for 0.05% of Facebook users. ‘What happens if we delete ranked News Feed?’ they asked in an internal report summarizing the experiment. Their findings: without the News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, Facebook Group content climbs to the top, and – surprisingly – Facebook makes more money from users scrolling through the News Feed.”

The experiment showed that, without an algorithm ranking content on various factors, users spend more time scrolling to find relevant posts, which exposes them to more ads, and they end up hiding far more content – which, in a chronological feed, brings no lasting benefit, because hiding a post doesn’t reduce the likelihood that you’ll see more of the same in the future. Group content rose because users are more engaged in groups (i.e. every time someone posts an update in a group you’re a member of, it can be shown in your feed), while comments and likes from your friends pushed more Page posts into users’ feeds.
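To make the distinction concrete, here’s a minimal, purely illustrative sketch – not Facebook’s actual system; the `Post` structure, the sample data, and the single `engagement_score` field are all invented for illustration – of the difference between a chronological feed and an engagement-ranked one:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # higher = more recent (simplified)
    engagement_score: float  # hypothetical blend of likes, comments, shares

# Invented sample data for illustration only.
posts = [
    Post("friend_a", timestamp=3, engagement_score=0.2),
    Post("group_b", timestamp=1, engagement_score=0.9),
    Post("page_c", timestamp=2, engagement_score=0.5),
]

# Chronological feed: most recent first; no ranking signal is used,
# so users scroll past low-relevance posts to find what interests them.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Engagement-based feed: posts predicted to provoke reactions rise to the top.
ranked = sorted(posts, key=lambda p: p.engagement_score, reverse=True)

print([p.author for p in chronological])  # ['friend_a', 'page_c', 'group_b']
print([p.author for p in ranked])         # ['group_b', 'page_c', 'friend_a']
```

The only difference between the two feeds is the sort key, which is exactly why the experiment’s results are so telling: removing the ranking signal doesn’t remove the content, it just changes what surfaces first.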

So, a net negative overall, and not the fix that some have held it up to be. Of course, part of this also comes down to learned behavior, so over time users would likely unfollow certain pages and people who post a lot, leave groups that don’t interest them, and learn new ways to curate their feeds. But that’s a lot of manual effort on the part of Facebook users, and Facebook’s engagement would decline as a result.

You can see why Facebook would be reluctant to take this option, while the evidence here doesn’t necessarily suggest that the feed would be any less divisive as a result. And that’s before you consider that scammers and spam Pages would learn how to game this system as well.

It’s an interesting insight into a key element of the wider debate over Facebook’s impact, with the algorithm often singled out as the most damaging component, because it amplifies engaging (i.e. argumentative) content in order to keep people on the platform longer.

Is that true? I mean, it’s clear there’s a reason Facebook’s systems are optimized for content that’s likely to prompt users to react, and the best way to provoke a response is an emotional one, with anger and joy being the strongest motivators. It seems likely, then, that Facebook’s algorithms, intentionally or not, amplify argumentative posts, which can deepen divisions. But the alternative may not be much better.

So what is the best way forward?

That is the key element we need to focus on now. While these internal insights shed more light on what Facebook knows, and on its wider impacts, it’s important to consider what the next steps might be, and how we can implement better safeguards and processes to improve social media engagement.

Which, Facebook says, is exactly what it’s trying to do. As Facebook CEO Mark Zuckerberg said in response to the initial Facebook Files leak:

“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, why would we employ so many more people dedicated to this than any other company in our space – even ones bigger than us?”

Facebook is obviously exploring these elements. The concern, then, comes down to where its motives actually lie, but also, as this experiment shows, what can realistically be done to fix things. Because the complete removal of Facebook is not going to happen – so in what ways can we ask it to use these insights to build a safer, more open public forum, with less division?

This question is far harder to answer, and a deeper concern, than the flood of hyperbolic reporting that simply casts Facebook as the villain.
