Meta is joining a new effort to help protect people from ‘revenge porn’ – intimate content posted online without the subject’s consent.
Meta has had processes in place to detect and remove revenge porn since 2018, but now the company is joining a coalition of aid organizations and technology platforms on a new program that will give users an additional way to track their images online and stop them from spreading across the web.
As Meta explains:
“Today, Meta and Facebook Ireland support the launch of StopNCII.org with the UK Revenge Porn Helpline and more than 50 organizations worldwide. This platform is the first global initiative of its kind to help people in a safe and secure way who are concerned that their intimate images (photographs or videos of a person showing nudity or being sexual in nature) may be shared without their consent. The UK Revenge Porn Helpline, in consultation with Meta, has developed this platform with privacy and security at every turn thanks to the extensive contributions of victims, survivors, professionals, lawyers and other technology partners.”
This is how the process works – if you are concerned that your pictures or videos are being shared online without your consent, you can go to StopNCII.org and create a case.
Creating a case involves generating a ‘digital fingerprint’ of the content in question on your own device. Your content is never uploaded or copied from your device; instead, the tool scans it locally and creates a ‘hash’, which is then used for matching.
“Only the hash is sent to StopNCII.org, the associated image or video remains on your device and is not transmitted.”
From there, that unique hash is shared with the participating technology platforms, now including Meta, which use it to detect and remove any matching variations of the images that are shared, or that someone attempts to share, in their apps.
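To make the idea concrete, here is a minimal sketch of perceptual “average hashing”, a common technique for matching an image even after re-encoding or small edits. StopNCII.org has not published the exact algorithm it uses, so this toy example (names `average_hash` and `hamming_distance` are my own) only illustrates the general principle: a compact fingerprint is computed locally, and only that fingerprint needs to leave the device.

```python
def average_hash(pixels, size=8):
    """Compute a 64-bit perceptual hash from a grayscale pixel grid.

    pixels: 2D list of grayscale values (0-255), dimensions a multiple of size.
    """
    h, w = len(pixels), len(pixels[0])
    block_h, block_w = h // size, w // size
    # Downsample to a size x size grid by averaging each block of pixels.
    cells = []
    for by in range(size):
        for bx in range(size):
            total = 0
            for y in range(by * block_h, (by + 1) * block_h):
                for x in range(bx * block_w, (bx + 1) * block_w):
                    total += pixels[y][x]
            cells.append(total / (block_h * block_w))
    mean = sum(cells) / len(cells)
    # Each cell contributes one bit: 1 if it is brighter than the mean.
    bits = 0
    for value in cells:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")
```

Unlike a cryptographic hash, which changes completely if a single pixel changes, a perceptual hash like this stays stable under minor alterations (brightness shifts, light compression), so a platform can flag likely matches by checking whether the Hamming distance falls below a threshold.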
It’s a good, coordinated way to tackle what can be a devastating crime, one that exposes and publicly shames victims via social media, and can cause long-term psychological and reputational damage.
And with research showing that 1 in 12 adults in the U.S. has been a victim of image-based abuse, with young people disproportionately affected, it’s a critical issue, and likely a more common one than many would expect.
The prevalence of revenge porn actually increased during the pandemic, with UK domestic violence charity Refuge reporting a 22% rise in revenge porn reports over the past year. Simplistic responses like ‘just don’t take pictures’ misread the broader cultural problem and offer no help after the fact, so it’s important for Meta and other social platforms to do everything they can to address this growing concern and provide assistance to affected users.
Wider adoption of this hash-based system could be a big step toward improving that process, and will hopefully give victims a simpler path to recourse.
You can find out more about the process here.