Amid the ongoing debate about the potential negative impacts of Facebook’s News Feed algorithm on broader content consumption habits, Facebook is testing a range of new control options, for both individuals and advertisers, that will allow people to influence what they see and help brands avoid unwanted associations with their ad placements in the app.
First, for individual users, Facebook wants to make it easier to find existing News Feed control options, while it also wants to give people the capacity to reduce certain types of content in their feeds.
As Facebook explains:
“As part of that, people can now increase or decrease the amount of content they see from friends, family, groups and pages they are linked to and topics they care about in their News Feed settings.”
Facebook’s News Feed settings give you more control over what’s displayed in your feed, enabling you to select favorite profiles whose posts are then given higher priority, unfollow Pages, people and topics, and snooze certain users or Pages, all from a consolidated list.
There will soon be even more control options on this list, with the ability to dial up or down the amount of content displayed to you from each element – although it’s not yet clear exactly how this will work.
It could be a good way to give people more control over their feed – although, of course, it depends on how many people actually use it, with previous data showing that many people don’t change their Facebook settings, even when there’s a clear reason that they should do so.
As such, updates like this tend to work in Facebook’s favor, as they shift the burden onto users by giving them more control, while Facebook itself knows that many won’t bother, ensuring that the status quo is largely maintained. Facebook can’t do much more in that regard, but hopefully, with this new pressure, it will make more of an effort to encourage people to use such controls and to maximize adoption and awareness of these tools.
Algorithmic amplification was one of the key concerns highlighted by Facebook whistleblower Frances Haugen in her various testimonies about the platform’s negative impacts, with Haugen telling the U.S. Senate that social networks should be forced to stop using engagement-based algorithms entirely, via reform of Section 230.
As Haugen explained:
“Facebook [has] admitted publicly that engagement-based ranking is dangerous without integrity and security systems, but then failed to introduce those integrity and security systems in most languages of the world. It is tearing families apart. And in places like Ethiopia, ethnic violence is literally flaring up.”
Haugen’s view is that these algorithms incentivize negative behavior in order to drive greater engagement, which, as she notes, causes significant harm in various regions, including the United States.
It’s hard to quantify the real impact of this, but it seems fairly clear that Facebook’s algorithms have changed public discourse, with news publishers in particular working to maximize engagement with their Facebook posts in order to boost overall reach – often by sharing more partisan, divisive and emotionally charged content. This then fuels even more anger and controversy.
Providing user controls to limit the impact of this could be a good step, but we’ll have to wait and see the specifics of how Facebook looks to introduce it. Facebook says it will begin testing the new control options with “a small percentage of people, which will gradually expand in the coming weeks.”
In addition, Facebook is also expanding its News Feed topic exclusion controls to a limited number of advertisers running ads in English.
“Advertiser topic exclusion controls allow an advertiser to choose a topic to help determine how we show their ads on Facebook, including News Feed. Advertisers can select from three topics – News and Politics, Social Issues, and Crime and Tragedy. When an advertiser selects one or more topics, their ad will not be delivered to people who have recently engaged with those topics in their News Feed.”
This essentially enables advertisers to avoid unwanted associations with these topics and their related discussions, which could help Facebook reassure brands that they won’t end up suffering negative impacts by association.
Facebook says that, in early testing, the exclusions were very successful in ensuring that ads did not run alongside such discussions in the app.
Again, amid a wider debate over the impact of negative interactions on the platform, it makes sense for Facebook to provide more controls, which will help users improve their experience in line with their own expectations and interests, while providing more assurance for brands.
Of course, ideally, if research shows that such changes have an overall positive effect, you would hope that Facebook will then look to reduce these negative elements more broadly – but that’s another aspect that will need to be considered, and one the company may even be forced to investigate further if regulators adopt Frances Haugen’s recommendations.