The technology alters the pixels of a photograph so that facial recognition algorithms can no longer identify the person in it, but not enough for the changes to be perceptible to the human eye.
US researchers from the SAND Lab at the University of Chicago have developed a new tool capable of modifying photographs imperceptibly so that they cannot be used in unwanted ways.
Called Fawkes, after Guy Fawkes, the face behind the 'V for Vendetta' mask, it is a computer program that slightly alters the pixels of an image, thereby evading the facial recognition systems that many platforms use.
These changes are invisible to the human eye, but if someone tries to use an image modified by Fawkes to train a facial recognition model, the model will learn "a highly distorted version" of the original face, the developers explain.
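The core idea of an imperceptible pixel-level change can be sketched in a few lines. The snippet below is not the Fawkes algorithm (which optimizes its perturbation against feature extractors used by recognition models); it is only a minimal illustration, under assumed names like `cloak` and `epsilon`, of how a change to every pixel can be capped well below the threshold of human perception:

```python
import numpy as np

def cloak(image: np.ndarray, epsilon: float = 3.0, seed: int = 0) -> np.ndarray:
    """Add a small, bounded perturbation to an 8-bit RGB image.

    Illustrative only: Fawkes computes a targeted perturbation, not
    random noise. Here each channel of each pixel moves by at most
    `epsilon` intensity levels out of 255, far too little to notice.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Keep the result in the valid 0-255 range before converting back.
    cloaked = np.clip(image.astype(np.float64) + noise, 0, 255)
    return cloaked.astype(np.uint8)

# A tiny 4x4 stand-in for a photo: after cloaking, no pixel value
# differs from the original by more than epsilon levels.
photo = np.full((4, 4, 3), 128, dtype=np.uint8)
protected = cloak(photo)
max_change = np.abs(protected.astype(int) - photo.astype(int)).max()
```

In the real tool, the perturbation is chosen so that the image's learned feature representation shifts toward a different identity, which is why models trained on cloaked photos fail on the genuine face.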
The creators of the technology say that in their tests it proved 100% effective against some of the most advanced facial recognition systems, including Microsoft Azure Face API, Amazon Rekognition, and Face++ from the Chinese tech company Megvii. "The level of protection will vary based on your willingness to tolerate minor adjustments to your photos," the researchers explain.
The Fawkes source code is available on GitHub, and the developers also shared links to the binary files for Mac, Windows, and Linux.
The team stressed that they have no plans to release a smartphone app, since the computing power Fawkes requires is so great that it "would be a challenge" even for the "more powerful" mobile devices.
Why worry about facial recognition?
Many technology companies are betting on facial recognition as one of the most reliable methods of verifying a user's identity, for example when unlocking a digital device, be it a mobile phone or a computer. However, not all of these systems are as secure as they sound.
Last January, it was revealed that the facial recognition company Clearview AI holds a database of more than 3 billion photographs secretly scraped from various sources, including Facebook, YouTube, and Instagram, and provides its services to more than 600 US law-enforcement agencies as well as numerous private companies.
“Clearview demonstrates how easy it is to create invasive tools for monitoring and tracing,” said the creators of Fawkes.
At the same time, they noted that their new tool is not a response to Clearview specifically, as they believe the company "is probably just the (rather large) tip of the iceberg".
"Fawkes is designed to significantly increase the costs of developing and maintaining accurate models for large-scale facial recognition. If we can reduce the accuracy of these models to the point of being unreliable, or force model owners to pay significant costs per person to maintain accuracy, then we will have been largely successful," said the developers.