Apple continues to work on security, and it has now unveiled a new system for detecting material related to child abuse.
Apple Against Child Abuse
According to an earlier report, Apple was expected to announce a new tool that uses hashing algorithms to detect photographic content related to child abuse. The system would be installed on the user’s device and would compare images against known illegal material to find matches.
The system would rely on image fingerprints that do not violate the privacy rules the company currently applies, intervening when files are uploaded to iCloud Photos in order to prevent the spread and storage of abusive material.
Now it is a reality. Apple has officially announced a new initiative for the protection of children on iPhone, iPad and Mac. Under the motto “Protecting children is an important responsibility,” Apple is introducing three new features in this security advance.
CSAM detection
By CSAM, Apple refers to content that depicts sexually explicit activity involving a child. To combat it, Apple has implemented a new recognition method for user content that finds matches through the hashing process mentioned above, allowing CSAM cases to be reported to the National Center for Missing and Exploited Children (NCMEC).
«Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.»
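Conceptually, the on-device matching step can be illustrated with a toy sketch in Python. Apple's real design uses NeuralHash (a perceptual hash) and private set intersection, a cryptographic protocol far beyond this sketch; here a plain hash-set lookup merely stands in for the idea of comparing an image fingerprint against a database of known hashes. All names and the hash function are illustrative assumptions, not Apple's implementation:

```python
import hashlib

# Illustrative stand-in for the database of known CSAM hashes.
# In Apple's design this set is blinded: the device cannot read it.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Toy fingerprint. Apple uses NeuralHash, a perceptual hash that
    survives resizing and recompression; SHA-256 is only a placeholder."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """Runs on-device before the image is uploaded to iCloud Photos.
    In the real protocol, private set intersection means the *device*
    never learns this boolean; only the server can see match results."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_csam(b"known-bad-image-bytes"))   # True
print(matches_known_csam(b"ordinary-holiday-photo"))  # False
```

Note how only exact fingerprint matches register: the device compares hashes, not image content, which is what lets the check run without inspecting the user's photo library.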
To guard against errors, Apple uses an additional technology called threshold secret sharing, which ensures that an account cannot be flagged until a threshold number of CSAM matches is reached, so a single false match cannot trigger a report. According to Apple, the system has an error rate of less than one in one trillion accounts per year.
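The threshold idea reduces to a simple reporting rule, sketched below. The threshold value of 30 is a hypothetical assumption for illustration; Apple's actual mechanism, threshold secret sharing, cryptographically prevents its servers from decrypting any match vouchers until the threshold is crossed, which this counter-based sketch does not capture:

```python
MATCH_THRESHOLD = 30  # hypothetical value, for illustration only

def account_is_reviewable(match_voucher_count: int) -> bool:
    """An account becomes eligible for human review only once the
    number of CSAM matches crosses the threshold; isolated (possibly
    false) matches on their own trigger nothing."""
    return match_voucher_count >= MATCH_THRESHOLD

print(account_is_reviewable(1))   # False: one false positive does nothing
print(account_is_reviewable(30))  # True: review can begin
```

Requiring many independent matches before any action is what drives the claimed false-flag rate so low: a single misidentified photo is harmless by design.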
Messages
One of the most important pieces of news for the safety of minors involves Messages. Children whose Apple devices are linked to a family through iCloud will now see warning notices in Messages when they receive or send an image that may be sexually explicit.
That’s right: every time a minor receives a potentially sensitive image, it will appear blurred and the app will immediately show a warning. The warning explains why the image may be inappropriate, and if the child agrees to view it anyway, the parent on the iCloud family account will receive a notification “to make sure you’re okay.”
The same will happen when a minor tries to send a photo deemed inappropriate because it is sexually explicit.
Apple notes that the system uses on-device machine learning to identify sensitive content without breaking the service’s end-to-end encryption, so conversations remain private while becoming safer for children.
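The Messages flow described above can be sketched as a short decision sequence. The classifier score, the cutoff value, and the action names are all illustrative assumptions; Apple's actual on-device classifier and its APIs are not public:

```python
SENSITIVITY_THRESHOLD = 0.9  # assumed cutoff for "may be sexually explicit"

def handle_incoming_image(score: float, child_chooses_to_view: bool) -> list:
    """Returns the sequence of actions taken for a child account.
    `score` stands in for the output of an on-device ML classifier,
    so the image itself never leaves the device for analysis."""
    actions = []
    if score < SENSITIVITY_THRESHOLD:
        actions.append("show image normally")
        return actions
    actions.append("blur image")
    actions.append("warn child")
    if child_chooses_to_view:
        actions.append("reveal image")
        actions.append("notify parent")  # "to make sure you're okay"
    return actions

print(handle_incoming_image(0.95, child_chooses_to_view=True))
print(handle_incoming_image(0.10, child_chooses_to_view=False))
```

The key design point the sketch mirrors is that every decision happens locally: the parent notification is the only signal that leaves the conversation, and only after the child's explicit choice.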
Searches and Siri
Finally, Siri and Search on Apple devices have been updated to offer help on the issue of child abuse.
«Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help in unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be directed to resources on where and how to file a report.

Siri and Search are also being updated to intervene when users search for CSAM-related queries. These interventions will explain to users that interest in this topic is harmful and problematic, and will provide resources from partners to get help with this issue.»
These new features will arrive with the iOS 15, iPadOS 15 and macOS Monterey updates, launching first in the United States and expanding to other regions as soon as possible.
There is no doubt that this is a big step for social welfare and for the protection of minors who use these devices, and one that could well affect the standing of Apple, which today is the sixth most profitable company in the world.