How many photos of yourself do you reckon are lurking online? Well, using the PimEyes website, you can (ostensibly) find an exact tally at the tap of a touchscreen. Submit a clear, front-facing image of yourself, and the AI facial recognition software will match your biometrics against other face shots in its collection. The results could be anything from your professional profile pics to the blurry ‘Blonde to my brunette’ snaps you and your best friend posted in 2014.
Admittedly, I was skeptical upon first uploading my face to the site. Like most Gen Zers, I have fallen prey to countless ‘Find Your Long-Lost Twin’ and ‘Meet Your Lookalike’ tests – none of which proved successful. PimEyes, by contrast, speedily spat out three photos of me and several others of mysterious lookalikes. I was impressed, if slightly taken aback. I remain unsure where one image featuring a fringed me smiling at the camera even came from.
‘Using the latest technologies, artificial intelligence and machine learning, we help you find your pictures on the Internet and defend yourself from scammers, identity thieves, or people who use your image illegally,’ the start-up’s website claims. ‘That is why we have created PimEyes, a multi-purpose tool allowing you to track down your face on the Internet, reclaim image rights, and monitor your online presence.’
The company’s facade as a crusader for personal privacy is hardly convincing. It would be naive to assume that all users upload photos of themselves to the site. Searching ‘PimEyes’ on X reveals a whole host of users recommending the tool to others as a way of tracking down crushes, internet enigmas and even NGO agents.
Initial searches are free of charge, albeit with limited results. For more thorough stalking, PimEyes’ advanced package offers ‘unlimited searches daily’ and ‘up to 500 PimEyes alerts’ for £284.30 per month.
The company hotly maintains that it neither scrapes social media nor allows under-18s to use its software. Speculation continues about its compliance with the EU’s General Data Protection Regulation (GDPR).
As one X user commented, ‘I am not a fan of the “trust us” security policy.’
‘Honestly, it has only really been a few centuries since people got used to the idea that Earth was not flat,’ PimEyes owner Giorgi Gobronidze told London Economic. ‘In the 17th century and 16th century, people ended up in an inquisition for saying that the Earth was round. Even television, when it was invented, was perceived as a box of a devil. Even in my childhood, I heard preaching in church that TV and modern technology were a young evil. And yes, I know that people will be concerned and will be afraid of AI technologies.’
PimEyes is not the only facial recognition software to emerge in recent years.
Clearview AI made headlines in 2020 for its 3-billion-face stockpile, allegedly used by US law enforcement. Company CEO Cam-Hoan Ton-That was subsequently linked to far-right Trumpism.
Surveillance start-up Banjo found itself in similarly hot water that same year. Founder Damien Patton was revealed to have driven a leading KKK member to and from a Nashville synagogue attack in 1990. According to court records, Patton testified, ‘We believe that the blacks and the Jews are taking over America, and it’s our job to take America back for the white race.’
Banjo had been in the middle of a $12 million deal with Utah’s justice department.
‘ChatGPT started the revolution. I don’t know how it’ll end up because it is already shaping a world which is based and built on respective copyright and other ways of thinking,’ Gobronidze responded to London Economic’s queries. ‘Very soon, we will have books written by artificial intelligence. We already have images made by artificial intelligence. The world has already outlived, survived and evolved with the First Industrial Revolution. Now we have the second one.’
Gobronidze may be optimistic about the future of open-door AI facial recognition technology. But by allowing anyone to upload a photo of anyone else, PimEyes risks spewing personal histories willy-nilly.