This facial recognition website can turn anyone into a cop — or a stalker

Despite the controversy surrounding the Polish facial recognition search engine PimEyes, an extensive test of the service shows that it has trouble identifying ordinary people.

In more than 25 searches conducted by the British Daily Mail, the AI-powered system struggled to varying degrees with 70% of the images, including edited or slightly blurred ones.

The service identifies celebrities fairly accurately, but for ordinary people only 25% of the results were completely accurate.

Serious security threat

Even so, cybersecurity experts consider PimEyes a "serious security threat" because it surfaces the social media accounts of people whose faces it identifies.

Some of the matches included URLs to individuals' Instagram, TikTok, Tumblr and Facebook accounts, along with personal blogs.

Those looking to stalk someone using PimEyes may be able to find their target, but they will have to sift through a trove of pornographic images along the way.

Approximately 15% of search results returned explicit images linking back to the adult content sites where they were originally posted.

Free and paid services

The site, which draws on a database of more than 900 million photos, lets users run reverse image searches to find pictures across the Internet believed to show the same person.

The free search shows only a generic name for the site where each matching image was found. PimEyes says it offers privacy features, including protection from "fraudsters, identity thieves, or people who illegally use other people's photos".

A paid subscription provides additional details, such as the URL where each image was found and the page addresses and personal accounts associated with it.

Abusive and harmful uses

Despite the test results, which the Daily Mail described as lackluster, James Knight, chief analyst at Digital Warfare, said the service was so open that it could easily be misused.

'Although it is marketed for individuals to search for their own image, anyone can search for your image,' Knight said via email on Friday. 'The main fear is that this will be used maliciously by stalkers to find more information on you.' 

Knight continued: 'One quick photo taken of someone could be uploaded to reveal potentially hundreds of photos of them, and from there, their name, address, phone numbers, email addresses. This tool is one more level in the eroding of personal privacy.'

While the free service relies on the same artificial intelligence and machine learning algorithms, the company's paid offering goes into more detail, providing, for example, the URLs of the pages where matching images appear.

Monitoring strangers

For her part, London-based technology researcher Stephanie Hare told the Washington Post that strangers can monitor the lives of others, and there is literally nothing to prevent them from doing so.

"People who put pictures of themselves on the Internet, with their children or parents and people who could be at risk in their lives, did not do so to find themselves feeding a database that companies could invest in," she added.

Violation of privacy

Aaron DeVera, a security researcher in New York, said PimEyes was used by people on other websites to track down and identify women.

PimEyes has also been criticized in the past for potentially violating the European Union's General Data Protection Regulation (GDPR) through its use of facial recognition.

But the company says PimEyes was created as a "multi-purpose tool" that allows people to track their own faces online, reclaim rights to their photos, and monitor their online presence.

Security services

Dave Gershgorn recently reported in OneZero that PimEyes also offers contracts to law enforcement agencies through a company called Paliscope, which markets the facial recognition technology to help investigators identify suspects from photos and case files.

Paliscope also recently partnered with the 4theOne Foundation, an organization dedicated to finding trafficked children.

Similar applications

In 2020, the Russian search engine Yandex was accused of offering an unregulated facial recognition feature and violating personal privacy.

Yandex, which claims to handle more than 50% of Russian searches on Android, allows users to upload an image and see results showing the same person.

Another Russian company, NtechLab, launched a similar service called FindFace in 2016. After its widespread use, however, the technology was restricted to the state's security screening efforts.