PimEyes is currently one of the AI tools that poses the greatest danger to anyone’s privacy. In fact, it could prove even more damaging than Clearview AI’s controversial technology, because unlike the latter, PimEyes is designed to dig up your darkest secrets.
One of the most alarming aspects of PimEyes is its price. For just $29.99 a month, you can search anyone’s past. This ‘tool’, as its current owner calls it, is specifically designed to surface results from “news articles, wedding photography pages, review sites, blogs, and pornography sites”, reports the New York Times.
In addition, the results are strikingly accurate: the chances of finding the person you’re looking for are quite high. Even its mistakes add to the danger, since when the tool is wrong, it often returns results from porn pages. For obvious reasons, this can be alarming to anyone who sees themselves reflected in those images, even when they turn out not to be real.
Searching PimEyes requires nothing more than a subscription and a photo. Upload the image and the platform does the rest, scanning the web for potential matches.
The biggest problem, as discussed above, is the set of sites PimEyes searches. Unlike other facial recognition technologies such as Clearview, this tool does not search within social networks, so you will not find content published on Twitter or Instagram. Instead, you’ll see results from news articles, magazines and, for some reason, porn sites.
The New York Times has data on the searches users perform. For example, it reports that one user, who explicitly asked to remain anonymous, “said he used the tool to find the real identities of porn movie actresses, and to search for explicit photos of his friends on Facebook”.
Little more needs to be said to point out the dangers a ‘tool’ of this type carries, especially one within anyone’s reach. In fact, Clearview AI was embroiled in a scandal over its intention to create a recognition database of every human being on the planet. Just imagine if a tool like PimEyes had access to a collection like that, or if it ended up in the hands of organizations and companies with a tendency to violate the privacy and rights of Internet users.
Giorgi Gobronidze, a 34-year-old academic and the current owner of PimEyes, believes the platform can be used for ‘good’. According to the businessman, it is simply a tool that many people can use to keep an eye on their reputation on the internet.
PimEyes users are only supposed to be able to search their own faces or those of people who have given consent, Mr. Gobronidze said.
New York Times
At the gates of a dystopia
A concern raised often in discussions of facial recognition is, of course, the possibility of losing the right to anonymity. With this technology, that concern becomes ever more serious. This is not alarmism for its own sake; it is recognizing that PimEyes offers a technology that could be exploited by people with dubious intentions.
From authoritarian governments to criminals looking to extort money, PimEyes is the perfect recipe for the beginnings of a dystopia worthy of Orwell (at the risk of sounding like a conspiracy theorist), and all the more so if it is combined with facial recognition tools like those of Clearview AI.
But that is not all. If you wish to exercise your right to anonymity on PimEyes, you will have to pay a fairly high sum. One user, seriously affected by the results exposed by the tool, described this practice as “extortion”.
Cher Scarlett, a software engineer, was one of PimEyes’s victims. Browsing the internet one day, she decided to give the facial recognition tool a try and see what it could find on the web about her past. The result was something she never could have expected.
Among the pages of PimEyes, Scarlett found something from her past that she would have preferred to forget. At the age of 19, because of her financial situation at the time, the engineer had decided to give the porn industry a try. After a harrowing interview, Scarlett abandoned the idea; but it seems that was not the end of the story.
There it was, on her screen: the trauma that had taken her decades to bury. And it wasn’t just the pictures. There were also links that could lead anyone to the sites where the photos were hosted. Alongside these explicit images sat other photos and portraits of Scarlett, who had appeared in the media for leading an employee revolt at Apple’s offices.
“Exclude from public results” was one of the buttons Cher Scarlett found while desperately trying to remove the images from the web. Relieved, she decided to try her luck and click it, hoping the images would disappear completely from the results and her reputation would no longer be compromised.
Unfortunately, that is not how PimEyes works. Scarlett found out the hard way when the website instead offered an invitation to subscribe to its PROtect service, at a cost ranging from $89.99 to $299.99 a month. Paying this sum was the only way to make the results disappear from the page.
PimEyes has tens of thousands of subscribers, according to Mr. Gobronidze, with most of the site’s visitors coming from the United States and Europe. Most of its money comes from subscribers to its PROtect service, which includes help from PimEyes support staff to remove photos from external sites.
New York Times
A facial recognition AI that can turn against you
Clearview AI has been facing serious consequences in the United States, Europe, and Australia. In fact, certain countries on the old continent have declared its activity “illegal”, and regions like the UK have required the company to remove all of their residents from its database.
Now, it looks like Clearview has some competition. PimEyes could become the next target of these bodies, and with good reason. In fact, in 2021 Germany’s data protection agency opened an investigation into the platform. Although the investigation is ongoing, it could end quite badly for PimEyes.
It is clear that regulations around artificial intelligence and facial recognition technology need to be tightened. If they are not, we could lose what little user privacy remains. After all, every day millions of people post all kinds of images and information on the internet. Worse still, some people post such material on behalf of others, without the victims’ knowledge.
Of course, users should also educate themselves on the subject and limit the details they share on the web. But governments, along with international bodies, also have important obligations in this matter. In fact, their participation is the most crucial of all.