Meta, the parent company of Instagram and Facebook, has postponed plans to encrypt messages on its platforms until 2023, following warnings from child-safety campaigners who say the move would shield abusers and help them evade detection.
Back in August this year, Facebook announced that it would implement end-to-end encryption for chat messages on Facebook and Instagram. The company is now postponing that rollout until 2023. Justifying the move, Antigone Davis, Meta’s global head of safety, wrote in The Sunday Telegraph:
“We’re taking our time to get this right, and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.”
“As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”
Meanwhile, the National Society for the Prevention of Cruelty to Children (NSPCC) has called private messaging “the front line of child sexual abuse online.” In its view, encryption would only make things worse, because it prevents both law enforcement and the platforms themselves from seeing message content and combating abuse. End-to-end encryption ensures that only the sender and the intended recipient can read the messages they exchange.
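To make the “only sender and recipient” property concrete, here is a deliberately simplified toy sketch of the idea behind end-to-end encryption: two parties exchange only public keys, derive a shared secret via Diffie-Hellman, and encrypt with it, so a server relaying the traffic never holds the key. This is an illustration only, not real cryptography, and is not how Meta’s products work internally (WhatsApp and Messenger use the Signal protocol with modern primitives such as X25519 and AES-GCM).

```python
import hashlib
import secrets

# RFC 3526 group 14 (2048-bit MODP) prime and generator.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
G = 2

def keypair():
    """Generate a private exponent and the matching public value G^priv mod P."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, peer_pub):
    """Both endpoints compute the same secret G^(a*b) mod P; the server cannot."""
    secret = pow(peer_pub, priv, P)
    return hashlib.sha256(secret.to_bytes(256, "big")).digest()

def xor_stream(key, data):
    """Toy hash-based keystream cipher; encrypting twice decrypts."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Sender and recipient generate keypairs; only public keys travel over the wire.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_sender = shared_key(a_priv, b_pub)
k_recipient = shared_key(b_priv, a_pub)
assert k_sender == k_recipient  # derived independently at each endpoint

ciphertext = xor_stream(k_sender, b"private message")
plaintext = xor_stream(k_recipient, ciphertext)
```

Anyone sitting between the two parties sees only `a_pub`, `b_pub`, and `ciphertext`, which is precisely why platforms and investigators cannot inspect message content once end-to-end encryption is on.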
Meta’s plans to encrypt its users’ communications have now slipped by another year, but the company first signaled its interest in the area in 2019. At the time, Zuckerberg reportedly said:
“People expect their private communications to be secure and to only be seen by the people they’ve sent them to – not hackers, criminals, over-reaching governments, or even the people operating the services they’re using.”
End-to-end encryption has long been a default feature of WhatsApp, which is also owned by Meta, and it was recently rolled out for voice and video calls on Messenger.
Advocating for encryption, Davis said Meta would detect abuse using unencrypted data, account information, and user reports, an approach WhatsApp already uses to report child-safety incidents to the authorities. She added: “Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted.” Abuse nevertheless remains widespread across Meta’s apps, which are used by about 2.8 billion people every day. In 2020, the company identified and reported 21 million instances of child sexual abuse material on its platforms worldwide; over 20 million of those reports came from Facebook alone.
Welcoming Meta’s decision to pause its encryption drive, the NSPCC’s head of online child safety policy, Andy Burrows, said: “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms.” His view echoed a statement made in April by UK Home Secretary Priti Patel: “We cannot allow a situation where law enforcement’s ability to tackle abhorrent criminal acts and protect victims is severely hampered.”
Do you think Meta’s encryption plans would make the platform a safe haven for child sexual abusers? Share your thoughts in the comments below!
[Via The Guardian]