App Privacy Report set to debut in iOS 15.2 beta, Communication Safety code appears [Updated]

Apple continues to introduce features promised for iOS 15, with the first iOS 15.2 beta, released Wednesday, delivering a more robust app privacy tool and hinting at the future implementation of an upcoming child safety feature.

Announced at this year’s Worldwide Developers Conference in June, Apple’s App Privacy Report provides a detailed overview of app access to user data and device sensor information. The feature adds to a growing set of tools that offer new levels of hardware and software transparency.

Located in the Privacy section of Settings, the App Privacy Report offers insight into how often apps have accessed the user’s location, photos, camera, microphone, and contacts over the past seven days. Apps are listed in reverse chronological order, starting with the most recent title to access sensitive data or read one of the iPhone’s sensors.
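In earlier iOS 15 betas, the underlying data could already be exported from Settings > Privacy > Record App Activity as a newline-delimited JSON file. As a rough illustration of what the new interface summarizes, the Swift sketch below tallies accesses per app and category from such an export; the record layout and field names ("type", "category", "accessor") are assumptions inferred from the beta export, not a documented schema.

    import Foundation

    // Sketch: tally data/sensor accesses per app from an exported
    // App Privacy Report file (newline-delimited JSON). Field names
    // are assumptions based on the iOS 15 beta export, not a
    // documented schema.
    struct AccessRecord: Decodable {
        struct Accessor: Decodable {
            let identifier: String   // assumed: bundle ID of the accessing app
        }
        let type: String             // assumed: "access" for data/sensor events
        let category: String?        // assumed: e.g. "location", "camera"
        let accessor: Accessor?
    }

    func tallyAccesses(ndjson: String) -> [String: [String: Int]] {
        var counts: [String: [String: Int]] = [:]  // bundle ID -> category -> count
        let decoder = JSONDecoder()
        for line in ndjson.split(separator: "\n") {
            guard let data = line.data(using: .utf8),
                  let record = try? decoder.decode(AccessRecord.self, from: data),
                  record.type == "access",
                  let app = record.accessor?.identifier,
                  let category = record.category
            else { continue }
            counts[app, default: [:]][category, default: 0] += 1
        }
        return counts
    }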

Apple’s new feature also tracks recent network activity, revealing the domains apps have contacted either directly or through content loaded in a web view. Users can see in detail which domains were contacted and which websites were visited during the past week.
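The same export also contains network records. A companion sketch, under the same caveat that the "networkActivity" record type and its "domain" and "bundleID" fields are assumptions from the beta format, might collect each app’s contacted domains like this:

    import Foundation

    // Sketch: collect the set of domains each app contacted, from the
    // same exported report. Record type and field names are assumptions.
    struct NetworkRecord: Decodable {
        let type: String        // assumed: "networkActivity"
        let domain: String?     // assumed: the contacted domain
        let bundleID: String?   // assumed: the app that made the request
    }

    func domainsPerApp(ndjson: String) -> [String: Set<String>] {
        var result: [String: Set<String>] = [:]
        let decoder = JSONDecoder()
        for line in ndjson.split(separator: "\n") {
            guard let data = line.data(using: .utf8),
                  let record = try? decoder.decode(NetworkRecord.self, from: data),
                  record.type == "networkActivity",
                  let app = record.bundleID,
                  let domain = record.domain
            else { continue }
            result[app, default: []].insert(domain)
        }
        return result
    }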

In addition to the new App Privacy Report user interface, the iOS 15.2 beta includes code detailing the Communication Safety feature designed to protect children from sexually explicit images, reports MacRumors.

Built into Messages, the feature automatically obscures images that on-device machine learning algorithms deem inappropriate. When such content is detected, children under the age of 13 are warned that a notification will be sent to their parents if they view the image. Parental notifications are not sent for children aged 13 to 17.
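Apple has published no API for Communication Safety, so anything beyond the strings found in the beta code is speculation. Purely as a conceptual sketch, the blur-until-confirmed pattern the feature describes could look something like this in SwiftUI, with isSensitive standing in for the on-device classifier’s verdict:

    import SwiftUI

    // Conceptual sketch only: there is no public Communication Safety
    // API. `isSensitive` stands in for the verdict of Apple's on-device
    // machine learning; the view blurs a flagged image until the user
    // explicitly chooses to reveal it.
    struct GuardedImageView: View {
        let image: Image
        let isSensitive: Bool          // assumed classifier output
        @State private var revealed = false

        var body: some View {
            ZStack {
                image
                    .resizable()
                    .scaledToFit()
                    .blur(radius: (isSensitive && !revealed) ? 30 : 0)
                if isSensitive && !revealed {
                    VStack(spacing: 8) {
                        Text("This may be sensitive")
                        Button("View anyway") { revealed = true }
                    }
                    .padding()
                    .background(.ultraThinMaterial,
                                in: RoundedRectangle(cornerRadius: 12))
                }
            }
        }
    }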

Messages will display a variety of warnings when images are flagged, according to code discovered by MacRumors contributor Steve Moser:

  • You are not alone and you can always get help from an adult you trust or from trained professionals.
  • You can also block this person.
  • You are not alone and you can always get help from an adult you trust or from trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Send a message to an adult you trust.
  • Hey, I’d like to talk to you about a conversation that’s bothering me.
  • Sensitive photos and videos show the private body parts that you cover with swimsuits.
  • It’s not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share this. How would you feel if other people saw it?
  • The person in this may not want to see it – it could have been shared without their knowledge. Sharing it can also be illegal.
  • Sharing nudes with anyone under 18 years old can have legal consequences.
  • If you decide to view this, your parents will receive a notification to make sure you’re OK.
  • Don’t share anything you don’t want. Talk to someone you trust if you feel pressured.
  • Are you OK? You’re not alone and you can always talk to someone trained to help here.

According to the report, the wording of these messages varies with the age of the user:

  • Nude photos and videos can be used to hurt people. Once something is shared, it can’t be taken back.
  • It’s not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you are sending this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone – it may never go away. Sharing can also be illegal.

Communication Safety is part of an initiative that seeks to limit the spread of child sexual abuse material (CSAM) across major Apple platforms. Announced in August, the three-pronged effort includes the Messages feature, updates to Siri and Search, and a CSAM detection system for photos stored in iCloud. The last feature, which hashes user photos marked for upload to iCloud and matches them against a database of known CSAM hashes, drew significant pushback from industry experts and privacy advocates, prompting Apple to postpone its launch in September.
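For a sense of the detection step, the sketch below shows only the simplest form of the idea, membership in a set of known hashes, with SHA-256 standing in for Apple’s perceptual NeuralHash. The actual proposal is far more elaborate: the perceptual hash tolerates minor image edits, matching happens against a blinded database via private set intersection, and a threshold of matches must be crossed before Apple can decrypt anything. None of that cryptography is modeled here.

    import Foundation
    import CryptoKit

    // Heavily simplified illustration of hash-set matching. SHA-256 is a
    // stand-in: unlike a perceptual hash such as NeuralHash, it changes
    // completely if a single pixel changes, and the blinded-database and
    // threshold cryptography of Apple's proposal are omitted entirely.
    func matchesKnownHash(imageData: Data, knownHashes: Set<String>) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }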

Update: Apple has informed MacRumors that Communication Safety will not debut with iOS 15.2, nor is the feature being released as reported.


