Following the rumors that emerged in recent hours, Apple has made official (via 9to5Mac) a series of features intended to protect minors who use its devices. Among them is the scanning of iPhone and iCloud photos to detect child sexual abuse material. In addition, security in Messages will be strengthened, and Siri and Search will offer information and help resources for children. It is worth mentioning, of course, that the entire initiative will initially roll out only in the United States.
Scanning photos without compromising privacy
Apple will use a complex image review system that allows it to identify child sexual abuse material, such as child pornography. However, the Cupertino company makes it clear that the initiative was designed with privacy as its main axis. Before an image is uploaded to iCloud, the iPhone runs the image comparison process entirely on-device; that is, the image is not sent to external servers for analysis.
Apple relies on a set of cryptographic techniques built around "CSAM hashes", which make it possible to identify child abuse material without compromising privacy. The analysis is not performed on the visual content of the image, but on a hash derived from it. What is a hash? It is a fixed-size block of data generated from an arbitrary input; in this case, the input is the image.
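To make the concept concrete, here is a minimal Swift sketch that derives a fixed-size digest from an image's raw bytes. It uses an ordinary SHA-256 hash from CryptoKit purely for illustration; Apple's actual system relies on a perceptual hashing scheme, so that visually equivalent images produce the same hash even after resizing or re-encoding, which plain cryptographic hashes do not do.

```swift
import CryptoKit
import Foundation

// Illustration only: derive a fixed-size digest from an image's bytes.
// NOTE: SHA-256 is used here just to show what a hash is; Apple's real
// matching uses a perceptual hash, not a cryptographic one.
func hashOfImage(at url: URL) throws -> String {
    let imageData = try Data(contentsOf: url)      // read the image bytes
    let digest = SHA256.hash(data: imageData)      // fixed-size digest
    return digest.map { String(format: "%02x", $0) }.joined()
}
```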
Without getting too technical: the iPhone will compare each image against a database of hashes of material previously identified as child abuse. If a match is found, an encrypted alert is generated that protects the result of the comparison; not even Apple can see it at that point.
Now, if a certain number of matches is reached (Apple did not specify the threshold), Apple will be able to manually review the material to verify that the images are indeed child abuse imagery. It is important to note that this manual verification does not involve accessing the iPhone or extracting images remotely; it is performed on photos that have already been uploaded to iCloud. Remember that the matching itself occurs before the image is backed up to the cloud.
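As a rough mental model, the match-and-threshold flow might look like the following Swift sketch. All names, types, and the threshold value are invented for illustration; in the real protocol, cryptographic mechanisms keep individual match results encrypted and unreadable, even to Apple, until the threshold is crossed.

```swift
import Foundation

// A simplified, hypothetical model of the match-and-threshold flow.
struct SafetyVoucher {
    let encryptedMatchResult: Data   // opaque below the threshold
}

struct MatchPipeline {
    let knownCSAMHashes: Set<String> // hashes of known material
    let reviewThreshold: Int         // unspecified by Apple; placeholder
    private(set) var matchedVouchers: [SafetyVoucher] = []

    // Compare an image's hash against the database of known hashes.
    mutating func process(imageHash: String, voucher: SafetyVoucher) {
        guard knownCSAMHashes.contains(imageHash) else { return }
        matchedVouchers.append(voucher)
    }

    // Manual human review only becomes possible once enough
    // matches have accumulated.
    var manualReviewUnlocked: Bool {
        matchedVouchers.count >= reviewThreshold
    }
}
```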
Apple will strengthen the security of Messages
As for Messages, when a child under the age of 13 belongs to an iCloud family, there will be reinforced security measures. For example, if the child receives a sexually explicit image, it will appear blurred. If they insist on viewing it by tapping the "View Photo" button, they will see more information about why it was blurred, and their parent or guardian will be promptly alerted. A similar warning will appear when the child tries to send a sexually explicit photo. To determine whether material meets those conditions, Apple will use machine learning. These Messages features will arrive later this year with updates to iOS 15, iPadOS 15, and macOS Monterey, again only in the United States.
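The decision flow described above might be sketched as follows. The classifier protocol, function names, and notification hook are all hypothetical, invented here to illustrate the sequence of events; Apple has not published an API for this feature.

```swift
import Foundation

// A hypothetical sketch of the Messages flow described above.
enum Verdict { case safe, sexuallyExplicit }

protocol ExplicitImageClassifier {
    // Stands in for Apple's on-device machine learning model.
    func classify(_ image: Data) -> Verdict
}

// Step 1: decide whether an incoming image should be shown blurred.
func shouldBlur(_ image: Data, childAge: Int,
                classifier: ExplicitImageClassifier) -> Bool {
    childAge < 13 && classifier.classify(image) == .sexuallyExplicit
}

// Step 2: if the child taps "View Photo" on a blurred image anyway,
// the parent or guardian is notified.
func childTappedViewPhoto(notifyGuardian: () -> Void) {
    notifyGuardian()
}
```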
Finally, Apple wants Siri and Search to be more useful while providing valuable resources to “help children and parents stay safe online and get help with unsafe situations.” In addition, the company adds that “Siri and Search are also being updated to intervene when users make inquiries related to child abuse material. These interventions will explain to users how harmful and problematic they are, and will provide resources to get help with the problem.”