In an open letter addressed to Apple CEO Tim Cook, roughly a hundred political and civil-rights groups around the world condemned the company’s plan to analyze iPhones without user consent.
Apple intends to scan messages on children’s phones for nude images, and to scan adult devices for possible evidence of child sexual abuse.
“While this Apple procedure is designed to protect children and reduce the spread of child sexual abuse material, we are concerned that it will be used to censor and threaten the privacy and safety of individuals around the world. It could have serious consequences for many children,” says the letter, first reported by Reuters.
The campaign is organized by the Center for Democracy and Technology (CDT), a non-profit organization based in the United States.
Although this NGO is the organizer, the signatories include groups from around the world, from countries such as India, Mexico, Germany, Argentina, Ghana and Tanzania.
Many are concerned about the impact Apple’s changes could have in nations with different legal systems, including some that have already clashed with Apple over encryption and privacy.
“It is a big disappointment, a huge annoyance that Apple is doing this, because it has always been a staunch ally in the defense of encryption,” said Sharon Bradford Franklin, co-director of the Security and Surveillance Project at the CDT.
Among the signatories are several associations in Brazil, where the courts have repeatedly blocked WhatsApp, owned by Facebook, for refusing to decrypt messages in criminal investigations.
A few months ago, Brazil’s Senate approved a bill requiring that messages be traceable, which in practice requires access to their content. A similar law was passed in India this year.
The Internet Society’s Brazil chapter worries that “this mechanism will be extended to other situations and other companies.”
Apple’s clarifications
Stung by the protests, Apple has, since the announcement in early August, offered explanations and published documents arguing that the risk of false detections is “very low.”
Apple says it will expand the image detection system beyond images of children, although it has not explained how.
Although most of the objections so far target the on-device scanning, the NGO letter also criticizes a change to iMessage, specifically in family accounts. This modification would identify and blur nudity in children’s messages, allowing the child to view the image only after the parents are notified.