Apple is going after users who store child sexual abuse images on their devices and in iCloud; here is how it will scan iPhones.
Apple is known as the brand with the strongest security and privacy for its users. Now the company is placing limits on that privacy, on the grounds that child abuse must be brought to light.
Thus, before an image is stored in iCloud Photos, the technology will check it against known child sexual abuse material (CSAM).
Apple said that if a match is found, a human reviewer will evaluate it and report the user to the authorities. There are privacy concerns, however, that the technology could be expanded to scan phones for other prohibited content or even political speech.
Some experts also worry that authoritarian governments could use the technology to spy on their citizens; for now, however, the stated priority is the protection of minors.
How Apple Will Scan iOS Devices
Apple said that the new versions of iOS and iPadOS, to be released in late 2021, will include "new applications of cryptography to help limit the spread of CSAM online, while being designed for user privacy."
The scanning technology will work by comparing images against a database of child sexual abuse images compiled by the US National Center for Missing & Exploited Children and other child safety organizations.
Those images are translated into "hashes," numeric codes that can be matched against an image on an Apple device. The company says the technology will also catch edited but visually similar versions of the original images.
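To illustrate the matching step, here is a minimal, hypothetical sketch in Swift. Apple's actual system uses a perceptual hash (so that edited copies of an image still match) together with cryptographic protocols that the article does not describe; this sketch uses a plain SHA-256 digest and a placeholder hash value purely to show the idea of checking an image's hash against a database of known CSAM hashes before it is uploaded.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's real system relies on a perceptual
// hash so edited copies still match, plus cryptographic matching
// protocols not shown here. SHA-256 is used below only to demonstrate
// the "compare against a database of known hashes" step.

// Hashes of known CSAM images, as supplied by NCMEC and other
// child safety organizations (placeholder value, not a real hash).
let knownCSAMHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

// Compute a hex-encoded hash for an image about to be uploaded to iCloud Photos.
func hash(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Flag the image for human review if its hash appears in the database.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(hash(of: imageData))
}
```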
Apple stated that the system has an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Each report will then be manually reviewed to confirm the match, after which the company can take action, such as disabling the user's account and informing the authorities.
The company says the new technology offers "significant" privacy benefits over existing techniques, since Apple only learns about users' photos if their iCloud Photos account contains known CSAM.