The Wall Street Journal’s ‘Facebook Files’ report, which details various insights leaked from within Facebook’s headquarters, has sparked a whole host of concerns for The Social Network.
Among the various issues highlighted in the multi-part investigation was the revelation that Facebook holds celebrities and high-profile users to different standards, with a separate moderation team double-checking their posts and updates, and potentially leaving them up when they would have been removed had they come from everyday users.
Facebook has previously described this double-checking process as a means of ensuring that it makes the right call, “so that [posts] are not mistakenly removed or left up”, while Facebook has also denied that the process affords special treatment to these accounts.
Nonetheless, Facebook acknowledges that the system is not perfect, and this week it referred the process to its independent Oversight Board, in an effort to establish a better approach to assessing such content.
As Facebook explained:
“Facebook reviews billions of pieces of content every day, has 40,000 people working on safety and security, and has built some of the most sophisticated technology to help enforce its content policies. Despite that, we know we will make mistakes. The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk of error. This could include activists raising awareness of instances of violence, journalists reporting from conflict zones, or other high-visibility Pages and profiles where correct enforcement is especially important given the number of people who may see it.”
So again, Facebook says the process is designed to protect high-profile accounts and avoid high-impact mistakes, which is why it has a secondary verification process in place – not to give celebrities more leeway to post whatever they like.
Facebook says it is always working to improve this process, and input from the Oversight Board will play a role in that refinement.
“This is precisely why Facebook established the Oversight Board – to hold us accountable for our content policies and processes. In the coming weeks and months, we will continue to brief the board on our cross-check system and work with them to answer their questions.”
To be clear, it was the Oversight Board that first called on Facebook to provide better insight into the cross-check process, as a result of the WSJ report:
“At the Oversight Board, we have been asking questions about cross-check for some time. In our decision concerning former US President Donald Trump’s accounts, we warned that a lack of clear public information about cross-check and Facebook’s ‘newsworthiness exception’ could contribute to perceptions that Facebook is unduly influenced by political and commercial considerations.”
This refers to Facebook’s stance on comments posted by former President Trump, on which Facebook chose to take no action because of their newsworthiness and relevance to the community.
Indeed, in a speech at Georgetown back in 2019, Facebook CEO Mark Zuckerberg underlined this approach, which sparked the initial backlash in this respect:
“We don’t fact-check political ads. We don’t do this to help politicians, but because we think people should be able to see for themselves what politicians are saying. And if content is newsworthy, we also won’t take it down, even if it would otherwise conflict with many of our standards.”
Facebook’s position has always been to err on the side of free speech, but recent events have forced a re-evaluation of this, and broader questioning of the role Facebook plays in communication, and its responsibilities in that respect.
That’s why the revelations about its cross-check system stand out, as they seem to align with Facebook’s clear preference for allowing more content to be shared in its apps, and avoiding having to police it.
Facebook’s broader view is that there should be some form of official regulation of the social media space, and that the platforms themselves should not be setting such rules independently. That is probably the better path, but so far there has been little progress towards establishing independent regulation, beyond Facebook’s own efforts, which it uses to illustrate the need for it.
Ideally, for Facebook, an external regulatory group would take such decisions out of its hands, but for now the responsibility remains with it, and with every social platform, to decide what is and is not acceptable, and the specific parameters that apply.
Given that these are private companies, answerable to their shareholders, that doesn’t seem like the best approach, especially as their influence grows by the day.
The eventual answer seems destined to involve independent oversight, but regional variations and other complications pose significant challenges in this respect as well.
That is why Facebook set up its own Oversight Board, and why it now wants to refer such decisions to it, as a kind of pressure-release valve that also reduces the responsibility placed on its own team.
In some ways that may look like an easy way out – but really, it may be the model we should be moving towards.
Like it or not, Facebook’s own approach might be the best way to address these various concerns.