“Instantaneous photographs and newspaper enterprise have invaded the sacred precincts of private and domestic life.” You might be surprised to learn that this quote is more than 130 years old, from a law review article co-authored by Louis Brandeis and inspired by the invention of Kodak film. It appears in a new book, “Your Face Belongs to Us,” by Kashmir Hill, a tech reporter for the NY Times. She chronicles the evolution of digital facial recognition software, focusing on Clearview AI Inc.’s journey from scrappy startup to powerful player in the field, and exposes its many missteps, failures, and successful inroads into becoming a potent law enforcement tool.
Clearview wasn’t the only tech firm to develop facial recognition software: Google, Facebook, Microsoft, IBM, Apple and Amazon all had various projects that they either developed internally or acquired (Google with Pittsburgh Pattern Recognition and Apple with Polar Rose, for example). In each case, these projects were eventually stopped because the companies were afraid to deploy them, as Hill writes. Facebook, for example, had face recognition projects as early as 2010 “but could afford to bide its time until some other company broke through.” Yet Facebook didn’t delete the code; it merely turned the feature off, leaving the door open for some future time when the technology might be more accepted.
She documents one of the biggest technical challenges: identifying people in candid poses, in dim lighting, on low-resolution street surveillance cameras, and looking away from the ever-seeing lens. Another challenge is legal, with lawsuits coming at Clearview from all corners of the globe. Leading the charge are ACLU lawyer James Ferg-Cadima and the state of Illinois, an early adopter of biometric privacy law.
Clearview has also drawn many activists to protest and lobby for restrictions. One shared his opinion that “face recognition should be thought about in the same way we do about nuclear or biological weapons.” Clearview soon “became a punching bag for global privacy regulators,” she writes, describing several efforts in Europe during the early 2020s that resulted in various fines and restrictions placed on the company.
Police departments were early adopters of Clearview, whose database owes its existence to today’s smartphone users who post everything about their lives. That adoption has led to one series of legal challenges that was entirely self-inflicted. Hill documents many cases where the wrong person was identified and then arrested, such as Robert Williams. “It wasn’t a simple matter of an algorithm making a mistake,” she writes. “It was a series of human beings making bad decisions, aided by fallible technology.” She first reported that case in a NY Times article entitled “Wrongfully Accused by an Algorithm.” In many of these wrongful arrest cases, the accused were black men, a pattern that can be traced back to training data with too few non-white faces. (Facebook had this problem for many years with its image recognition algorithms.)
Some of Clearview’s story is inextricably bound up with Hill’s own investigation: early on she tipped off the company to her interest and was initially blocked from learning more about its technology. Eventually, she would interview Clearview’s CEO Hoan Ton-That numerous times to connect the dots. “It was astonishing that Ton-That had gone from building banal Facebook apps to creating world-changing software,” is how she sums up his career.
The company was determined to “scrape” the web for personal photos, and today various sources claim it has accumulated more than 30 billion images. All of these images, as she points out, were collected without anyone’s explicit permission. This collection would become infamous and exemplify a world “in which people are prejudged based on choices they’ve made in the past, not their behavior in the present,” she writes. You could say that on the internet, everyone knows you once were a dog.
She finds that Clearview created a “red list” that, by government edict, removes certain VIPs from being tracked by its software. As she writes, “Being unseen is a privilege.” Unfortunately, it is getting harder and harder to be unseen, because even if you petition Clearview to remove your images from its searches and from public web sources, the company still has a copy buried deep within its database. Her book is an essential document about how this technology has evolved, and what we as citizens have to do to protect ourselves.