Make-up techniques a potential weapon against unwanted facial recognition surveillance
An example of how make-up can be used to make a person look genuinely older. Photo: Idiap database
Applying make-up to appear older, splashing your face with infrared light and combining multiple photographs to “morph” identities are techniques people are using to subvert facial recognition (FR) systems, experts have warned.
As the Australian government considers a bill to share people’s facial data with agencies across the country, scientists in Europe are working to overcome so-called “obfuscation” and “impersonation” techniques that allow people to fly under the radar from unwanted surveillance.
Using make-up to appear old is among the simplest of these techniques, according to Sebastien Marcel from Switzerland’s Idiap Research Institute.
This is because it appears natural and realistic, can be applied with little artistic skill using readily available tutorials and, unlike other make-up techniques achieved with lipsticks, eyeliners and foundation, can create an appearance that is acceptable for all genders.
“Researchers demonstrated that make-up can be used to alter significantly the appearance to obfuscate the identity [evade recognition] and even to impersonate an identity,” Dr Marcel said.
“However, this last attack to impersonate someone with make-up is extremely complicated to put in place as you need a skilled make-up artist.”
Facial recognition backlash
Efforts to subvert FR systems have existed since the technology was commercialised two decades ago, including disguises, wigs, prosthetics and accessories like cardboard glasses, along with silicone masks of somebody else’s face and make-up that prevents a face being detected at all.
The CV Dazzle website offered patterns it said avoided detection by FR algorithms. Photo: CV Dazzle
Some techniques are shared freely by groups who believe surveillance without cause or notification infringes basic freedoms and an individual’s right to privacy.
Germany’s Peng! Collective, for example, in 2018 invited people to morph a profile photo with another person’s to create a fake face for their passport photos.
Peng said it wanted to “empower” people with their own data, “flood the [FR] databases with misinformation”, and avoid “automatic detection”.
“Where previously strict rules applied to what may happen with our biometric data, under the pretext of fighting crime, the rules have now changed,” it said in a statement.
“Rich and powerful companies and states will also be able to access and manipulate this data.”
Hong Kong demonstrators in recent months have used masks, hoodies, sunglasses and umbrellas to avoid FR detection, vandalised lamp posts with FR capabilities, and pointed lasers at police cameras.
Hong Kong demonstrators wearing masks topple a post equipped with FR tech in August. Photo: ABC News
Aged make-up a ‘challenge’ for AI
FR systems typically work by mapping an individual’s facial features, such as eye sockets, nose, chin and jawbone, as well as skin texture, lines and blemishes.
The biometric measurements are captured as data that can then be distributed and used by authorities to recognise individuals in a crowd – as was attempted, unsuccessfully, at the 2018 Commonwealth Games in Queensland – or as they pass through regulated checkpoints.
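As a rough illustration of that pipeline – and not of any system described in this story – the sketch below uses the open-source face_recognition Python library to convert faces into numeric “embeddings” and compare them by distance. The file names are hypothetical.

```python
# Minimal sketch of embedding-based face matching using the open-source
# face_recognition library. A face is mapped to a 128-dimensional vector,
# and two faces are compared by the distance between their vectors.
import face_recognition

# Encode the enrolled (watchlist) face. "enrolled.jpg" is a hypothetical file.
enrolled_image = face_recognition.load_image_file("enrolled.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled_image)[0]

# Encode every face found in a probe image, e.g. a camera frame.
probe_image = face_recognition.load_image_file("probe.jpg")
probe_encodings = face_recognition.face_encodings(probe_image)

for encoding in probe_encodings:
    # Euclidean distance between embeddings; lower means more similar.
    distance = face_recognition.face_distance([enrolled_encoding], encoding)[0]
    # A fixed threshold turns the distance into a match/no-match decision.
    print("match" if distance < 0.6 else "no match", f"distance={distance:.3f}")
```

In these terms, obfuscation works by pushing that distance above the matching threshold, while impersonation works by pulling the distance to someone else’s enrolled face below it.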
Aged make-up works by changing the apparent shape of the face, especially around the eyes and chin, to replicate ageing’s sagging effect on the skin. It also adds highlights, shadows and wrinkles, or emphasises existing wrinkles, in a way that does not draw attention to the person.
“This is clearly a research challenge,” Dr Marcel said.
“A research direction is to investigate the use of multi-spectral images, combining visual spectra [what is visible to the naked eye] with others, such as near infrared, short-wave infrared or even ultraviolet A [light].”
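As a minimal, purely illustrative sketch of that fusion idea (not Idiap’s method), aligned visible and near-infrared captures of the same face can be stacked into a single multi-channel input for a detection model:

```python
# Illustrative only: fuse visible and near-infrared captures of the same
# face into one multi-channel array. Make-up alters how skin reflects
# visible light far more than how it reflects infrared, so the extra
# channel can carry a tell-tale signal for a spoof-detection model.
import numpy as np

# Hypothetical aligned captures, as height x width x channels arrays.
rgb = np.random.rand(128, 128, 3)   # visible-spectrum image (3 channels)
nir = np.random.rand(128, 128, 1)   # near-infrared image (1 channel)

# Fuse into a single 4-channel input.
multispectral = np.concatenate([rgb, nir], axis=-1)
print(multispectral.shape)  # (128, 128, 4)
```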
But Vinod Chandran from Queensland University of Technology said systems using infrared images to check for heat and “liveness” to assist with face detection could themselves be tricked by a person using a localised infrared light emitter.
Some scientists, for example, have successfully experimented with button-sized LEDs hidden under the peak of a cap that wash a face with infrared light.
“Such attempts are most likely to result in an invalid transaction or error rather than a false identification, but for a subject trying to avoid a watchlist, this is a successful attempt at evading the system,” Professor Chandran said.
He said other research to detect obfuscation techniques included liveness detection from eye or lip movements in facial videos, “contextual information surrounding the presented face”, textural information in the image, and systems that required a subject to rotate their head.
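One widely published form of the eye-movement check is blink detection using the “eye aspect ratio” described by Soukupová and Čech. The sketch below is illustrative rather than any specific vendor’s method, and assumes the six landmark points per eye come from an external detector such as dlib.

```python
# Blink-based liveness check via the eye aspect ratio (EAR).
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: six (x, y) landmark points around one eye, in standard order."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    # The ratio collapses toward zero when the eyelid closes.
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(ear_per_frame, threshold=0.2, min_frames=2):
    """Count blinks: runs of consecutive frames with EAR below threshold."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks

# A printed photo held up to the camera produces no blinks at all,
# which is one signal that the presented "face" is not live.
```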
A composite of a real face (top) versus a silicone mask (below), and how each appears under different imaging. Photo: Idiap
Identity matching bill ‘may limit human rights’
The Australian government is currently reviewing a bill that, if passed, would allow the Department of Home Affairs to draw biometric data from driver’s licence photos and share it across Commonwealth, state and territory agencies for identity matching.
It is also reviewing an amendment that would allow data from passport photographs to be used in the same way – a system that it also plans to extend to the private sector.
In its explanatory memorandum, the government admits the amendment “may limit human rights, particularly the right to privacy”, but considers it “reasonable, necessary and proportionate” to promote people’s safety and security.
The Law Council of Australia has already called for better safeguards and oversight of how the biometric data is accessed and used by local government and non-governmental organisations, and wants more robust privacy protections in place.
“It is critical to ensure that the legislation which enables the use of this type of technology does not permit a creep toward broad social surveillance in Australia,” it said in a Parliamentary inquiry submission.
A Department of Home Affairs spokesperson said that under the bill, private sector organisations would only be able to check a person’s photo with that individual’s consent, and “only where this is permitted by applicable privacy laws”.
“The system will only provide [private] sector organisations with a ‘match/no match’ response [and] it won’t return a person’s photo or personal information.”
She said the private sector had been able to use the government’s Document Verification Service since 2004, with 900 users across the financial sector, telecommunications groups and other major employers.
Peng! Collective’s Mask.ID initiative included a photo booth that offered users a morphed image of themselves. Photo: YouTube
Inaccuracies questioned
The Law Council also wants an annual report disclosing the number of false matches and data breaches that occur as a result of the identity-matching system.
Earlier this year, Law Council of Australia president Arthur Moses pointed out that a London police trial of FR technology generated 104 alerts, of which 102 were false.
The technology scanned CCTV footage from the Notting Hill Carnival and Six Nations rugby matches in London.
Massachusetts Institute of Technology research published last year found FR algorithms trained mostly on white male faces had significant difficulty identifying women with darker skin.
Australian Border Force’s SmartGate passport control at international airport terminals, which already uses FR technology, suffered an IT outage earlier this year that caused delays.
Professor Chandran said the majority of research papers into obfuscation and impersonation addressed mobile device-based platforms, as opposed to border control with fixed cameras and supervision.
“Weaknesses and vulnerabilities will continue to exist and be exploited, but these measures may make more access control applications reliable enough to be accepted in the next four or five years as mobile telephones move into 5G technology more widely,” he said.
The Home Affairs spokesperson said the Australian system would be designed so that any match generated by the technology would also have to be reviewed by an operator “trained in facial examination”.
She added the face-matching services had been independently assessed and used the most commercially accurate algorithms available – based on testing by the US government’s National Institute of Standards and Technology (NIST).
One in 1000 fail rate for mugshots
In 2018, NIST evaluated FR test results from four face databases containing more than 30 million still photographs of 14 million individuals, with the primary dataset being law enforcement mugshots; the remainder were mostly profile shots and lower-quality webcam images.
Professor Chandran said the best system NIST tested, searching a dataset of 12 million individuals, returned an incorrect result in about one in every 1000 searches on average.
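To put that figure in perspective, a back-of-the-envelope calculation – using a hypothetical daily search volume, since no figure is given here – shows how a one-in-1000 error rate compounds at scale:

```python
# Illustrative arithmetic only: what a one-in-1000 error rate implies
# at a hypothetical national search volume.
error_rate = 1 / 1000
searches_per_day = 100_000      # assumed figure, not from NIST or Home Affairs
errors_per_day = error_rate * searches_per_day
errors_per_year = errors_per_day * 365
print(f"{errors_per_day:.0f} erroneous results per day")    # 100
print(f"{errors_per_year:.0f} erroneous results per year")  # 36500
```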
“Recognition rates have improved considerably, by about two orders of magnitude since 2010, arising from the use of deep learning neural networks,” he said.
But he added that such algorithms could perform worse in transactional systems where the test conditions were different.
“For example, the algorithm may have been trained on a very large dataset of predominantly Caucasian and African-American faces and may have poorer performance with Asian subjects,” Professor Chandran said.
–ABC