Apple explained that most cloud services already scan user files for content that violates their terms of service or that could be considered illegal. On those grounds, the company rejected the claim, made by some, that the measure violates users' privacy.
How will Apple detect photographs of child sexual abuse?
NeuralHash, Apple assures, is a technology praised by associations that fight child sexual abuse. It runs on each user's device and can identify whether known child abuse images are being uploaded to iCloud.
However, these potentially illegal images are not decrypted until they pass a sequence of verifications. Only then does the system delete them from the iCloud account in question and notify the authorities.
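The broad outline of that workflow, matching uploads against a database of known fingerprints and only escalating once a threshold of matches is reached, can be sketched in code. The sketch below is illustrative only: the `perceptualHash` function, the `UploadScanner` type and the threshold logic are assumptions for explanation, not Apple's proprietary NeuralHash implementation.

```swift
import Foundation

// Illustrative sketch only: Apple's actual NeuralHash is proprietary and far
// more sophisticated. `perceptualHash` is a stand-in for a function that maps
// an image to a fingerprint that stays stable under minor edits.
func perceptualHash(_ imageData: Data) -> Int {
    // Placeholder: a real perceptual hash is computed from image features,
    // not from raw bytes as done here.
    return imageData.hashValue
}

struct UploadScanner {
    /// Fingerprints of known abuse images, supplied by child-safety organizations.
    let knownHashes: Set<Int>
    /// Number of matches required before an account is flagged for human review.
    let reviewThreshold: Int

    /// Returns true only when enough uploads match the known database;
    /// a single match is never sufficient on its own.
    func shouldEscalate(uploads: [Data]) -> Bool {
        let matches = uploads.filter { knownHashes.contains(perceptualHash($0)) }.count
        return matches >= reviewThreshold
    }
}
```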
What about privacy at Apple?
This has generated controversy among user privacy advocates, who argue that if this type of technology fell into the wrong hands, such as those of an authoritarian government, it could enable mass surveillance and repression. In fact, the news emerged after Matthew Green, a cryptography expert and professor at Johns Hopkins University, revealed the existence of the technology in a series of tweets.
“The way Apple is launching this technology will start with photos that people have already shared with the cloud. So it doesn’t harm anyone’s privacy… But you have to wonder why anyone would develop a system like this if scanning E2E (end-to-end) photos was not the goal,” Green tweeted. E2E refers to end-to-end encrypted private messages, which presumably would not be susceptible to surveillance.
However, in a statement, Apple assures that the photographs in question will only be subjected to manual (human) review once they have passed several levels of verification based on machine learning technology.
The company also stressed that it will warn both parents and children when they receive or send sexually explicit photographs. The moment a child receives this type of content, the photo will be blurred and accompanied by a warning.
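That Messages flow can also be sketched. The snippet below is an assumption-laden illustration: the `IncomingPhoto` and `MessageSafetyFilter` types, the on-device classifier closure and the warning hook are hypothetical names for explanation, not Apple's API.

```swift
import Foundation

// Illustrative sketch of the Messages flow described above. The classifier,
// the type names and the warning hook are assumptions, not Apple's API.
struct IncomingPhoto {
    let data: Data
    var isBlurred = false
}

struct MessageSafetyFilter {
    /// Hypothetical on-device classifier that returns true for sexually explicit images.
    let isExplicit: (Data) -> Bool

    /// Blurs the photo and triggers a warning before it is shown to the child.
    func process(_ photo: inout IncomingPhoto, warn: () -> Void) {
        guard isExplicit(photo.data) else { return }
        photo.isBlurred = true
        warn()
    }
}
```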
Finally, Apple reported that these updates will arrive in late 2021 for iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, and that the technology will evolve and expand over time. This type of effort to eradicate the spread of illegal content is nothing new: last year, Facebook reported more than 20 million images of child sexual abuse on its platform.