Yesterday, the Georgetown Law Center on Privacy and Technology released a report entitled “The Perpetual Line-Up: Unregulated Police Face Recognition in America.” The report is based on a “year-long investigation and over 100 records requests to police departments around the country,” making it “the most comprehensive survey to date of law enforcement face recognition.”
In short, the study found that more than 117 million American adults are included in law enforcement face recognition networks; that means at least 50% of American adults are in police facial recognition systems. These systems are populated by driver’s license photographs, standard surveillance cameras, and live video feeds. Because of how the systems are populated, the vast majority of the individuals included in them have not been convicted of — or even arrested on suspicion of committing — a crime. Making this even scarier is that the use of these police facial recognition systems is unregulated, is largely happening in secret, and is not subject to quality or accuracy control checks or standards.
So why is this an issue worthy of feminist concern?
First, even the FBI’s expert on facial recognition software acknowledges that surveillance systems like the ones now in use by law enforcement across the country are least accurate for women, people of color, and young adults (18-30). If the reverse were true, if older white men were the ones more likely to get caught up in errors in the system, do we really have to ask whether law enforcement around the country would be so enthusiastically adopting this new technology?
Moreover, mix those factors together, and young women of color are the most likely to be misidentified and brought into contact with the police for no reason. We can’t really claim to be trying to shatter ceilings if we look the other way when our sisters are shackled in place. Intersectional feminism is feminism.
Second, police facial recognition systems allow law enforcement to track and trace you, without a warrant, whenever they see fit. The report details how this can be particularly troubling when people are exercising their constitutional right to do things like engage in peaceful protest. Law enforcement agencies can use facial recognition systems to generate rosters of who has come out to protest on behalf of Black Lives Matter or Planned Parenthood. That’s scary now, but imagine how much scarier that gets if/when the political winds shift to favor those happily tweeting #repealthe19th?
Third, at a meta level, a right to privacy has been a cornerstone of decisions like Griswold v. Connecticut (birth control), Roe v. Wade (abortion), Loving v. Virginia (interracial relationships), Lawrence v. Texas (consensual homosexual sex), and Obergefell v. Hodges (gay marriage). In other words, a right to privacy has been the cornerstone of many of the Supreme Court decisions that have recognized our autonomy to direct our own lives.
When we start eroding what it means to be a private citizen and the protections that surround that role, we may be embarking down a path that none of us want. While Aldous Huxley’s famous Brave New World dealt with these concepts to some extent, its predecessor — Yevgeny Zamyatin’s We — along with George Orwell’s 1984 explored in far more explicit detail what mass surveillance and a loss of privacy, particularly to law enforcement, could mean. None of them paints a rosy picture (but you should read them anyway. SO GOOD).
What Should We Do?
You should definitely read the report. It has its own easy-to-navigate, interactive webpage, complete with handy visuals. At a minimum, check out the Executive Summary and the Recommendations. You’ll find that the recommendations are straightforward and easy for folks who have no specialized technology knowledge (what is the twitters?) to understand. For example: police should only be able to use facial recognition searches in ways that the state legislature or a court has approved.
And then, ideally, raise the profile of these issues not just with your friends but also with your elected leaders — especially if you live in a jurisdiction that is (mis)using facial recognition systems! Write an op-ed! Reach out to the police and any oversight boards that might exist.
Whatever you do, don’t dismiss police facial recognition systems as not a feminist issue. We need to make sure, as technology becomes an increasingly important part of modern policing, that its implementation takes into account the rights and needs of people other than the old white men for whom policy is so often designed.