In the iOS 15.2 beta, Apple introduced the new Messages Communication Safety option, which is designed to protect children from potentially harmful images online. We have seen a lot of confusion around this feature, so it seemed helpful to provide an overview of how Communication Safety works and to clear up some misconceptions.
Communication Safety overview
Communication Safety aims to prevent minors from being exposed to unsolicited photos that contain inappropriate content.
As Apple has explained, Communication Safety is designed to detect nudity in photos sent or received by children. The iPhone or iPad scans images on-device in the Messages app, and if nudity is detected, the photo is blurred.
If a child taps a blurred image, they are told that the photo is sensitive and shows “parts of the body that are usually covered with underwear or bathing suits.” The feature explains that nude photos can be “used to hurt you” and that the person in the photo may not want it to be seen if it is shared without permission.
Apple also gives children ways to get help, such as messaging a trusted adult in their life. There are two screens to tap through that explain why a child may not want to see a nude photo, but the child can still choose to view it; Apple does not restrict access to the content, it only provides guidance.
Communication Safety is entirely opt-in
When iOS 15.2 is released, Communication Safety will be opt-in. It will not be enabled by default, and anyone who wants to use it will need to explicitly turn it on.
Communication Safety is for children
Communication Safety is a parental control feature enabled through Family Sharing. With Family Sharing, adults in a family group can manage the devices of members under the age of 18.
Parents can turn on Communication Safety through Family Sharing after updating to iOS 15.2. Communication Safety is only available on devices set up for children under the age of 18 that are part of a Family Sharing group.
Children under the age of 13 cannot create an Apple ID, so accounts for younger children must be created by a parent using Family Sharing. Children aged 13 and older can create their own Apple ID, but can still be invited to the Family Sharing group under parental supervision.
Apple determines the age of an Apple ID holder based on the date of birth entered when the account was created.
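As a rough sketch, the age rules described above boil down to two threshold checks. The thresholds (13 and 18) come from this article, but the function names and logic below are purely illustrative, not Apple's actual implementation.

```python
from datetime import date

# Illustrative sketch only: the 13 and 18 thresholds come from the article;
# everything else here is hypothetical, not Apple's code.
MIN_AGE_FOR_OWN_APPLE_ID = 13  # under 13: a parent must create the account
ADULT_AGE = 18                 # 18 and over: Communication Safety unavailable

def age_on(birth_date: date, today: date) -> int:
    """Full years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1  # birthday has not yet occurred this year
    return years

def communication_safety_available(birth_date: date, today: date) -> bool:
    """The feature can only be enabled for accounts under the age of 18."""
    return age_on(birth_date, today) < ADULT_AGE

def parent_must_create_account(birth_date: date, today: date) -> bool:
    """Children under 13 cannot create their own Apple ID."""
    return age_on(birth_date, today) < MIN_AGE_FOR_OWN_APPLE_ID
```

For example, a child born in January 2010 is 11 years old in late 2021, so a parent would need to create the account, and Communication Safety could be enabled for it.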
Communication Safety cannot be enabled on adult devices
Because it is a Family Sharing feature designed exclusively for Apple ID accounts belonging to people under the age of 18, there is no option to enable Communication Safety on an adult-owned device.
Adults do not have to worry about Communication Safety unless they are parents managing it for their children. In a Family Sharing group made up of adults, there will be no Communication Safety option, nor will any photo scanning in Messages take place on an adult's device.
Messages remain encrypted
Communication Safety does not compromise the end-to-end encryption available in the Messages app on iOS devices. Messages remain fully encrypted, and message contents are not sent to anyone else or to Apple.
Apple does not have access to the Messages app on a child's device, nor is Apple notified if or when Communication Safety is enabled or used.
Everything happens on the device and nothing leaves the iPhone
For Communication Safety, images sent and received in the Messages app are scanned for nudity using Apple's machine learning technology. The scan happens entirely on the device, and no content from Messages is sent to Apple's servers or anywhere else.
The technology is similar to what the Photos app uses to identify pets, people, food, plants, and other objects in images, and all of that identification likewise happens on the device.
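The on-device flow described above can be sketched roughly as follows. Everything here is a stand-in: Apple's actual classifier is a private ML model, and `nudity_score` below is just a toy placeholder so the sketch runs. The point of the sketch is the shape of the decision: a score is computed locally, the only output is a local blur decision, and nothing is ever uploaded.

```python
from dataclasses import dataclass

@dataclass
class ScanResult:
    blurred: bool           # should the Messages UI blur this photo?
    left_the_device: bool   # always False: scanning is entirely on-device

def nudity_score(image_bytes: bytes) -> float:
    # Placeholder for Apple's private on-device ML classifier; this toy
    # heuristic exists only so the sketch is runnable.
    return 0.9 if image_bytes.startswith(b"SENSITIVE") else 0.1

def scan_photo(image_bytes: bytes, threshold: float = 0.5) -> ScanResult:
    """Runs locally in the Messages app; the image is never uploaded."""
    return ScanResult(blurred=nudity_score(image_bytes) >= threshold,
                      left_the_device=False)
```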
When Apple first described Communication Safety in August, it included a feature designed to notify parents if a child decided to view a nude photo after being warned about it. That feature has been removed.
If a child is warned about a nude photo and still views it, the parents are not notified; the choice is entirely in the child's hands. Apple removed the notification after criticism from advocacy groups concerned that it could cause problems in situations of parental abuse.
Communication Safety is not Apple's anti-CSAM measure
Apple originally announced Communication Safety in August 2021 as part of a package of Child Safety features that also included an anti-CSAM initiative.
Apple's anti-CSAM plan, which the company described as a way to identify child sexual abuse material in iCloud, has not been implemented and is entirely separate from Communication Safety. It was a mistake for Apple to introduce the two features together, because they have nothing to do with each other beyond both falling under the child safety umbrella.
There has been a great deal of criticism of Apple's anti-CSAM measure, because photos uploaded to iCloud would be scanned against a database of known child sexual abuse material, and many Apple users are unhappy with that scanning capability. There are also concerns that the technology Apple would use to scan photos and match them against known CSAM could be extended to other types of material in the future.
In response to the widespread criticism, Apple has postponed its anti-CSAM plans and is making changes to how they will be implemented before releasing them. No anti-CSAM functionality has been added to iOS at this time.
Release date and implementation
Communication Safety is included in iOS 15.2, which is currently available as a beta. The beta can be downloaded by developers or members of Apple's beta testing program. There is no word yet on when iOS 15.2 will be released to the general public.
Apple plans to publish new Communication Safety documentation when iOS 15.2 comes out, offering further explanation of how the feature works.