Facial recognition smart glasses could make public surveillance discreet and ubiquitous.
The glasses themselves are made by American company Vuzix, while Dubai-based firm NNTC is providing the facial recognition algorithms and packaging the final product.
From train stations and concert halls to sports stadiums and airports, facial recognition is slowly becoming the norm in public spaces. But new hardware formats like these facial recognition-enabled smart glasses could make the technology truly ubiquitous, able to be deployed by law enforcement and private security any time and any place.
The technology has been dubbed iFalcon Face Control Mobile by NNTC and goes on sale in May, with pricing set on a per-project basis. The AR glasses have an 8-megapixel camera embedded in the frame, which allows the wearer to scan faces in a crowd and compare them against a database of 1 million images. Notifications about positive matches are sent to the glasses’ see-through display, embedded in the lens.
NNTC boasts that its facial recognition algorithms rank in the top three for accuracy in the US government’s Face Recognition Vendor Test, can detect up to 15 faces per frame per second, and can identify an individual in less than a second. That said, the performance of such algorithms always varies in the wild, and NNTC’s staged demo video definitely shouldn’t be seen as a reflection of real-world performance.
NNTC says it’s so far produced 50 pairs of facial recognition-enabled glasses, and that they are “currently being deployed into several security operations” in Abu Dhabi, the capital of the United Arab Emirates. The company says the glasses are only on sale to security and law enforcement.
This isn’t the first time we’ve seen facial recognition embedded in glasses. Police forces in China deployed similar tech last year, using the hardware at train stations to pick out suspects in a crowd. The technology was also used to keep blacklisted individuals like journalists, political dissidents, and human rights activists away from the annual gathering of China’s National People’s Congress, a pseudo-parliament with 3,000 delegates.
Although technology like this seems particularly futuristic or dystopian, it’s not functionally too dissimilar from what is already deployed in the US and other Western countries. Police in America can use imagery collected from body cameras and CCTV cameras to search for suspects using facial recognition software, while in the UK facial recognition cameras are deployed at events like soccer matches using specially equipped vans. However, the iFalcon Face Control glasses do streamline this entire procedure. Users can carry or wear a portable base station which connects to the glasses and stores a database of targets. This means they don’t need an internet connection for the software to function, giving them more mobility, while the notifications sent to the glasses’ built-in display free up the wearer to interact with people or perform other duties.
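To make the offline design concrete: the basic idea of matching a scanned face against a locally stored watchlist can be illustrated with a toy sketch. This is not NNTC’s actual software or API — the names, vectors, and threshold below are invented for illustration — but it shows in principle how a base station holding precomputed face embeddings can answer match queries with no internet connection at all.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical local "base station" database: face embeddings stored
# on-device, so searching it requires no network connection.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.1, 0.8, 0.5],
}

def match_face(embedding, database, threshold=0.95):
    """Return the name of the best match above threshold, or None."""
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(embedding, stored)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# A freshly scanned face close to suspect_a's stored embedding matches;
# an unrelated face returns None and triggers no notification.
print(match_face([0.88, 0.12, 0.31], watchlist))  # → suspect_a
print(match_face([0.0, 0.0, 1.0], watchlist))     # → None
```

A real system would use embeddings with hundreds of dimensions produced by a neural network, but the match-against-local-database loop is the part that lets the hardware work untethered.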
In other words: technology like this means law enforcement agencies can adopt facial recognition algorithms and use them in public spaces with less hassle and fewer distractions. That means it’s likely to be used more widely. There are, of course, numerous privacy and civil rights concerns associated with facial recognition. The algorithms that power this technology are prone to bias and are often used by law enforcement in a slapdash manner. This can lead to false arrests and imprisonment and gives police officers a new tool to discriminate against ethnic minorities.
On a macro level, the spread of facial recognition technology across the globe means the concept of public anonymity will soon become antiquated. As has been seen in China with the government’s crackdown on the largely Muslim Uighur minority, technology like this enables oppression and racial profiling on a massive scale. It will certainly be a boon to authoritarian governments and regimes. In its marketing materials for the iFalcon glasses, NNTC says the technology could be used for a range of tasks including “public surveillance,” “preventing terrorism,” and “monitoring immigrants.” It also says its algorithms can detect individuals’ age, gender, and emotions. (A scientifically dodgy claim. Although facial recognition systems can analyze emotion, they only do so in broad strokes and are far from reliable.) In a statement given to The Verge, the company said privacy concerns surrounding facial recognition are a “serious and sensitive topic.” However, the company argues that the technology is no different from an “old school naked eye search when a photo of the suspect is published and security can spot him.”
“We at NNTC truly believe that any government surveillance activity should be conducted lawfully and under the public control,” said the company. “We understand the complexity of keeping a balance between security and safety of law-abiding citizens and human and civil rights and freedoms.” Meanwhile, cities and governments are just beginning to reckon with the implications of this technology, and calls for better legislation and oversight are growing in many countries. San Francisco has even gone so far as to ban government use of facial recognition, but the technology will continue to spread around the world, especially as companies package it up in increasingly compact and discreet ways.