Updated 2020-06-05T14:14:00Z
Smile! You're on camera — or you were at some point in the past few years — and now your face is public domain.
Facial recognition technology is everywhere, and only becoming more pervasive. It's marketed as a security feature by companies like Apple and Google to prevent strangers from unlocking your iPhone or front door.
It's also used by government agencies like police departments. More than half of adult Americans' faces are logged in police databases, according to a study by Georgetown researchers. Facial recognition technology is used by governments across the globe to identify and track dissidents, and has been deployed by police against Hong Kong protesters.
To push back, privacy-focused designers, academics, and activists have created wearable accessories and clothing meant to thwart facial recognition tech. Demonstrators from Hong Kong to the US have used such masks to remain anonymous at protests, and the encrypted messaging app Signal has even started distributing free anti-facial-recognition masks to George Floyd protesters.
Facial recognition software uses artificial intelligence to detect faces or human figures in real time. But that software is fallible: clothing can "dazzle" it with misleading shapes that stop the AI from knowing what it's looking at, and other designs confuse the AI with images of decoy faces, preventing it from making the right identification.
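To make the failure mode concrete, here is a minimal sketch using OpenCV's bundled Haar-cascade face detector, a much simpler stand-in for the proprietary systems these designs actually target; the image file names are hypothetical.

```python
# Minimal face-detection sketch. The Haar cascade ships with OpenCV;
# it is an illustrative stand-in, not any vendor's production detector.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def count_faces(path):
    """Return how many face-like regions the cascade finds in an image."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

# A dazzle pattern or a print full of decoy faces changes the pixel
# statistics the detector relies on, so the same person can score zero
# detections, or several spurious ones, depending on what they wear.
print(count_faces("portrait.jpg"))         # hypothetical plain portrait
print(count_faces("portrait_dazzle.jpg"))  # same face with a dazzle pattern
```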
These designs are still niche, and have mostly only appeared as art installations or academic projects. But as facial recognition becomes more widespread, they may catch on as the next trend in functional fashion.
It should be noted, however, that these anti-facial-recognition designs are not foolproof, and algorithms are already being developed to overcome them.
Here are the ingenious, bizarre designs meant to outsmart facial recognition tech.
The "surveillance exclusion" mask was designed by Jip van Leeuwenstein while he was a student of Utrecht School of the Arts in the Netherlands.
"Because of its transparency you will not lose your identity and facial expressions," von Leeuwenstein writes, "so it's still possible to interact with the people around you."
Jing-cai Liu created the wearable face projector, in which a "small beamer projects a different appearance on your face, giving you a completely new appearance."
Images of Liu's face projector went viral in 2019 after misleading tweets claimed the device was being used by protesters in Hong Kong; the claim was later debunked.
Isao Echizen, a professor at the National Institute of Informatics in Tokyo, designed the "privacy visor" as a safeguard against security cameras that could log someone's face without their permission.
The device is fitted with "a near-infrared LED that appends noise to photographed images without affecting human visibility."
When the visor is switched on, the wearer's face no longer scans as a human face to the AI, indicated by the absence of green detection boxes in the accompanying images.
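The real visor works at the camera's sensor, flooding it with near-infrared light that human eyes can't see. As a crude software approximation of the result, the sketch below paints saturated blobs over the eye region of each detected face and re-runs the same stock OpenCV detector; the file path, blob geometry, and LED placement are illustrative assumptions, not Echizen's hardware parameters.

```python
# Approximate the privacy visor in software: saturate the eye region,
# then check whether the detector still finds a face.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

img = cv2.imread("portrait.jpg")  # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
    # Filled white circles roughly where the visor's LEDs sit,
    # across the eyes and the bridge of the nose.
    for cx in (x + w // 4, x + w // 2, x + 3 * w // 4):
        cv2.circle(img, (cx, y + h // 3), w // 8, (255, 255, 255), -1)

gray_after = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
print(len(detector.detectMultiScale(gray_after, 1.1, 5)))  # often 0
```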
"The full face mask Pixelhead acts as media camouflage, completely shielding the head to ensure that your face is not recognizable on photographs taken in public places without securing permission," creator Martin Backes writes.
The "dazzle" technique gets its name from a World War I naval tactic: vessels were painted with bold black-and-white stripes, making it harder for distant ships to judge their size and heading.
"By giving an overload of information software gets confused, rendering you invisible," Weekers wrote of the scarf.
Belgian computer scientists Simen Thys, Wiebe Van Ranst, and Toon Goedemé designed "adversarial patches" as part of a study funded by KU Leuven.
"We believe that, if we combine this technique with a sophisticated clothing simulation, we can design a T-shirt print that can make a person virtually invisible for automatic surveillance cameras," the researchers wrote.
"'Facial Weaponization Suite' protests against biometric facial recognition — and the inequalities these technologies propagate — by making 'collective masks' in workshops that are modeled from the aggregated facial data of participants, resulting in amorphous masks that cannot be detected as human faces by biometric facial recognition technologies," creator Zach Blas writes.
Blas intended the masks to depict the "tripartite conception of blackness: the inability of biometric technologies to detect dark skin as racist, the favoring of black in militant aesthetics, and black as that which informatically obfuscates," he writes.