Clearview AI Attempts to Overcome the Limitations of Facial Recognition
- Alexander Gary
The company says it’s trying to sharpen its tool. Not everyone is convinced its aim is true.
Clearview AI, one of the most widely discussed facial recognition companies to date, is attempting to strengthen its core product by casting a wide net. Last month, the company announced that it had scraped a whopping ten billion face images from YouTube, Instagram, Facebook, and other social media platforms and added them to the database behind its software, which is currently in use by more than 3,000 law enforcement agencies in the United States and in 25 countries around the globe. This type of application is precisely the dystopian, Minority Report-like future many of us fear.
Which is why the company may want to slow down and consider the issues other facial recognition companies have encountered, rather than live by the infamous "move fast and break things" mantra of Facebook, a company that, ironically, decided to shutter its own facial recognition program on November 2nd.
Here’s the gist: By collecting more faces than there are people on the planet, the hope is that the agencies that rely on Clearview will have a better chance of finding a winning match when searching for a person of interest. Clearview AI co-founder Hoan Ton-That maintains that the “larger data set makes the company’s tool more accurate.”
Facial recognition is a technology that detects human faces in images and video and, through data analysis, matches them against faces from other sources for the sake of identity verification and surveillance.
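The matching step described above is commonly done by reducing each detected face to a numeric embedding vector and comparing vectors for similarity. The sketch below illustrates the general idea in pure Python; the vectors, names, and threshold are toy values for illustration, not Clearview's actual model or data.

```python
import math

def cosine_similarity(a, b):
    # Similarity of two embedding vectors: close to 1.0 means near-identical
    # direction (likely the same face), close to 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query, database, threshold=0.9):
    # Return the label of the closest database entry scoring at or above
    # the threshold, or None if nothing is similar enough.
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional embeddings; real systems use hundreds of dimensions
# produced by a trained neural network.
database = {
    "person_a": [0.9, 0.1, 0.3, 0.4],
    "person_b": [0.1, 0.8, 0.7, 0.2],
}
query = [0.88, 0.12, 0.31, 0.42]  # hypothetical embedding of a probe image
print(best_match(query, database))  # prints "person_a"
```

In a scheme like this, a larger database raises the chance that *some* entry clears the threshold, which is the intuition behind collecting more faces, but it also raises the chance of a wrong entry clearing it.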
It’s not without serious flaws: On top of frequent concerns about privacy violations, the technology has also been shown to produce significantly higher error rates when it comes to recognizing people of color and women. That problem becomes especially dangerous in a law-and-order context, where human biases are already common and carry real-life consequences for individuals and for society at large. Using more data to retrain algorithms may only create more targets and accelerate the abuse of the technology by bad actors.
There’s also the problem of consent, which Clearview doesn’t bother to obtain. Without consent, information you never voluntarily handed over to an authority can be weaponized against you before you even realize it. “If I provide my photograph to Google Photos, and it helps Google Photos send me reminders from last year, that’s one thing,” says Ani Kembhavi, a leader of PRIOR, the computer vision team at the Allen Institute for AI. “But if Google Photos now takes my photograph and sells it to a third company, or law enforcement, to say, ‘This is what this person looks like, or this person’s spouse or child looks like’…it’s an example of a company overstepping the line.”
Ton-That doesn’t appear too worried. In fact, he’s grown more confident in the company’s prospects. Earlier this month, he told reporters that by using “deblurring” and “mask removal tools,” fuzzy images can be sharpened, and models can use guesswork to piece together a person’s face, even if that face is obscured by the coverings we now wear in public. The controversy surrounding facial recognition holds many social and political implications, but Ton-That said the company itself doesn’t operate with a political bias. “There’s no left-wing or right-wing way to catch a criminal,” he said. (Ton-That has personally been linked to far-right groups and provocateurs such as Mike Cernovich, the conspiracy-peddling personality who has encouraged violence against women.)
Clearview itself has worked very closely with law enforcement, distributing its software to government and police agencies. As noted in a July GAO assessment of facial recognition tech, Clearview has claimed that its system is intended only as an investigative tool for the police. That said, at least one news organization has reviewed a list of the agencies that have used Clearview’s software: of the several hundred agencies on it, some confirmed their usage, some were unaware that their employees had ever used the service, and others either denied the assertion or declined to comment.
Many are concerned. Last year, the very websites Clearview pulls from asked it to stop using their photos for its database. Ton-That declined, claiming that using the photos falls under the company’s First Amendment rights. This approach marks a stark contrast with Facebook’s, which, despite the company’s many privacy-related issues of its own, actually allowed users to disable face recognition on request. For Clearview, making an omelet requires breaking a few eggs.
Everybody is entitled to their right to privacy. But the responsibility to protect ourselves doesn’t entirely fall on us as individuals. “Better regulation should be set down by regulatory agencies,” adds Kembhavi. “Be careful of where you are uploading a photograph, with some understanding of the confidentiality agreements [you are entering]. If you’re a social media site, perhaps you should not be allowing others to scrape you, and if you’re a third-party company, you should not be able to use photographs on the internet without the consent of the users.”
Rather than hoovering up our photos, Clearview could instead use its resources to build software that improves society. The company has pulled in $30 million from unnamed investors to create reliable identification software meant to help the police. But it will have competition: a group of 53 investors overseeing $5 trillion in assets has committed to investing only in facial recognition companies that disclose the accuracy of their technology and the source(s) of their image databases, monitor for algorithmic biases (particularly with respect to race, gender, and age), and demonstrate “that effective grievance mechanisms are in place to enable victims to report consequences and to access remedies.” Surely this is an approach to facial recognition technology that Clearview AI can get behind. One can hope.