
Why Adams Wants to Expand Facial Recognition Tech While Other Cities Are Banning It

(AP Photo)
As part of his Blueprint to End Gun Violence, Mayor Adams wants to expand the use of surveillance technologies, like facial recognition software, to track down potential suspects. Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), practitioner-in-residence at NYU Law School's Information Law Institute, and a fellow at Yale Law School's Information Society Project and Ashoka, talks about what the mayor has said, and the known pitfalls and controversies around such tech.
[music]
Brian Lehrer: Brian Lehrer on WNYC, #ReadTheBlueprint. That's what we're inviting you to do in The Brian Lehrer Show reading project. Read the blueprint. What's this? Well, Mayor Eric Adams has urged the public to not just talk about the most hot-button items in his Blueprint to End Gun Violence, but to read the actual blueprint in its entirety, and as it happens, its entirety is only 15 pages of large-font print, and really just 13 if you take away the cover sheet and the table of contents. It's clearly sectioned with headings for the different proposals.
It's formatted like a press release so people like me in the media who read a lot of dense stuff every day can find it very organized and clear. I've read the blueprint and we're taking up the challenge in a bigger way and inviting you to as well. If you're interested, we've tweeted out a link to the Blueprint to End Gun Violence on our Twitter feed and we've posted the link on our webpage at brianlehrershow.org. We're inviting you to check it out for yourself. Tweet any comments you have, #ReadTheBlueprint. Then, join in a wide-ranging conversation we'll have about it on the show next week.
Again, we've tweeted out a link to the Blueprint to End Gun Violence on our Twitter feed and posted the link on our webpage, which is brianlehrershow.org. We're inviting you to check it out for yourself. Tweet a comment using the hashtag #ReadTheBlueprint, and then join in a wide-ranging conversation we'll have about it on the show next Thursday. Yesterday, I said Wednesday, we had to change it so it'll be on Thursday next week. Meanwhile, we're diving into some of the specific sections here on the show between now and then, and we'll have another one of those segments right now.
This time, it's on the very short section of the blueprint called using new technology to identify suspects and those carrying guns. In just two sentences, it says, "NYPD will explore the responsible use of new technologies and software to identify dangerous individuals and those carrying weapons. This technology will not be the sole means to make an arrest, but as another tool, as part of larger case-building efforts." That's one section of the blueprint. Here's Mayor Adams on this show last Friday on a major part of that: he does not like metal detectors at schools.
Interesting that he is what some people might call to the left of Mayor de Blasio on that. With the gun and shooting spike last year, de Blasio reinstituted some metal detector programs in high schools that had been taken down. Eric Adams thinks they're demeaning and create an atmosphere of suspicion rather than encouragement at school. He said this on last Friday's show.
Mayor Eric Adams: Technology must match public safety within the civil liberties that we have. There is amazing technology out there right now that's not as intrusive, such as for the searching of people when they walk into school. Some of those are really intimidating devices that you see. There are nondescript technologies available that would allow you to determine, based on mass and other specifics.
Brian: Mayor Adams here on Friday. Interesting, but it's the other part of his technology proposal that's the most controversial; using facial recognition technology to identify people suspected of carrying illegal guns. Here's the mayor on that at a press briefing on January 24th.
Mayor Eric Adams: We will also move forward on using the latest in technology to identify problems, follow up on leads, and collect evidence from facial recognition technology to new tools that can spot those carrying weapons. We will use every available method to keep our people safe.
Brian: The mayor last month. Now, an article about this on Politico reminds us that there have been at least six lawsuits against the NYPD's use of facial recognition technology already, pre-Eric Adams. There have been documented instances of it leading to wrongful arrests, at least in other cities. A number of cities have banned its use altogether; New York has not. The NYPD says it's been effectively and legally used here to solve murders, rapes, and other crimes. Let's have that conversation.
Our guest for this is a critic of facial recognition technology who was quoted in the Politico piece and who's been on the show before, Albert Fox Cahn, founder of the group known as STOP, the Surveillance Technology Oversight Project. His bio page says he is also a practitioner-in-residence at the NYU Law School Information Law Institute and a fellow at the Yale Law School Information Society Project, and that he started STOP, the Surveillance Technology Oversight Project, with the belief that local surveillance is an unprecedented threat to public safety, equity, and democracy. Albert, thanks for coming on. Welcome back to WNYC.
Albert: Thank you for having me back, Brian.
Brian: In addition to the clip of Adams that we played, the mayor is also quoted saying last month, "If you're on Facebook, Instagram, Twitter, the police looking for illegal weapons suspects can see who you are without violating the rights of people." What's wrong with that in your opinion, if they're trying to get illegal guns off the street?
Albert: I think it's the same theme that we heard from Professor Butler in the last segment. We hear this promise that technology will give us a kinder, gentler digital fix, a magical way to use facial recognition to find people with guns and to spare those who are innocent. The reality is, this technology is biased, it's error-prone, it's invasive.
When it does work, it can be used to systematically target communities that have long been in the crosshairs of NYPD surveillance, whether it's political protestors, whether it's Muslim New Yorkers, or whether it's Black-led movements, such as the facial recognition surveillance of BLM groups. This is just too powerful to be allowed to be used.
Brian: The Politico article cited two cases from Detroit of the wrong people being arrested using the same software the NYPD uses. The NYPD says it has solved many rapes, murders, and other crimes. Is that the balance of risks and benefits as we know them? Many rapes and murders solved and just two wrongful arrests, or would you argue there've been more than that?
Albert: We can't have oversight by press release, because we want to know what the data is. We've actually demanded under New York's public records law to know what the NYPD's data is on whether this tool works, whether it's effective, whether it is biased against Black and brown New Yorkers. They have said under oath, and I am quoting verbatim, "The department does not have records relating to the accuracy and bias of the department's facial recognition."
They're saying under oath in court, they don't know if this works. They don't know if it's biased. They don't know if it's replicating the exact same abuses of stop and frisk, which is what many of the researchers believe. They're just going out there and pointing to these wild claims and then never actually providing the data to back it up.
Brian: Interesting. On bias, one of the limitations of the technology that's been reported is that it tends to be worse at distinguishing one individual from another if they're very old, very young, or have darker as opposed to lighter skin. In a racial justice context specifically, the argument is that the technology is racist and reinforces human biases that already exist in people toward seeing darker-skinned folks as criminally suspect. But how much has this been documented, and can the technology not be improved in that respect?
Albert: There's been amazing work. I'm thinking of Joy Buolamwini, the founder of the Algorithmic Justice League, who has done a great amount of work documenting the systematic bias against Black women with this technology. But let's say we got this technology to work, even if it were completely technically unbiased. It's still being used by a biased and racist police force. We know that if you have a facial recognition match for a white individual, it will be treated with much more skepticism than if it's a Black suspect.
We see people constantly getting this extra layer of discretion, extra layer of deference, extra privilege every time they interact with law enforcement, if they are white. That will continue to happen with police technology like facial recognition. I think the work of Ruha Benjamin, the Princeton professor, has been really exemplary in showing how even a technically neutral system replicates the bias of the community in which it's used.
Brian: Listeners, we can take your phone calls on facial recognition technology as a potential tool to help get guns off the street in New York City as part of Mayor Adams' Blueprint to End Gun Violence. 212-433-WNYC, 212-433-9692, or tweet @BrianLehrer. Anybody in law enforcement out there who's used it in any way yourself, in an investigation as a forensic tool? Maybe there are people in the court system who have experience with it as evidence, lawyers, judges, anyone else, or if you think you've been wrongly profiled by facial recognition software in any context?
212-433-WNYC, 212-433-9692, as we talk about this aspect of Mayor Adams' Blueprint to End Gun Violence with Albert Fox Cahn, founder of the group known as STOP, the Surveillance Technology Oversight Project. Albert, it looks to me like policy is all over the place on facial recognition, depending on where you are. The article on Politico says London is rolling out a very broad facial recognition surveillance program to catch criminals, which the article compares to China's surveillance state, which we did a whole, really scary segment on the other day, in conjunction with the beginning of the Olympics.
On the other end, cities including San Francisco and Seattle have banned its use for law enforcement altogether. I guess New York is somewhere in the middle, with certain limits on its use, but it can be used. Can you give us an overview of the landscape of how it is being used today?
Albert: Of course, and I think that China is often held up as the worst-case scenario of how the surveillance state can grow. The truth is that when you look at the number of cameras in New York, when you look at the systems that are being used, New York looks a lot closer to Beijing than to a lot of cities around Europe with actual privacy protections. Here in New York--
Brian: But not to overstate unless you think it's not overstating, New York is not using it in the same oppressive ways as China is, or would you argue that it is?
Albert: No, I think clearly the use of facial recognition by the Chinese government to target the Uyghur community as part of that genocide is a unique crime against humanity. But what we see is a technical capability that is truly staggering here in New York. We do see it used in ways that truly threaten democracy, such as the way it has been used to target Black Lives Matter protestors, the way it was used to target Derrick Ingram in a very prominent case back in 2020, where a SWAT team was sent to his door after he led a BLM protest, because of facial recognition.
Here in the city, thanks to the work of Amnesty International, we know that the NYPD has more than 10,000 cameras in public places that have been installed to track New Yorkers, and all of those feed into the city's Real Time Crime Center. On top of that, we have tens of thousands of private cameras, which are linked into the city through our domain awareness system. That has given the city this massive pool of video images to scour with facial recognition. In the most recent year we have data for, there were more than 10,000 searches.
This could be used for anything. It could be used for graffiti, it could be used for shoplifting, it could be used for the most minor of crimes, and we see this really troubling pattern where, historically, they haven't just taken an image as they see it on CCTV and run it through facial recognition. They'll Photoshop it. They'll take that image, and if the eyes are closed, they'll Photoshop them open. If the mouth is open, they'll Photoshop it closed. We'll even see entire sections of a face copied from Google Images into an incomplete photo to create a composite.
This isn't an art project, this isn't science, and we have absolutely no data from any of the independent researchers on whether any of this is remotely accurate.
Brian: A few more minutes digging into this part of Mayor Adams' Blueprint to End Gun Violence, the "using new technology to identify suspects and those carrying guns" portion, focusing on the controversial use, already going on in New York, of facial recognition technology. This is a point-of-view segment; we're hearing from a critic of facial recognition, Albert Fox Cahn, founder of the group known as STOP, the Surveillance Technology Oversight Project. Let's take a phone call that I think might push back on him a little bit. Kate in Maplewood, you're on WNYC. Hi, Kate.
Kate: Hi, thanks for taking my call. I am a Black woman. I'm very excited about Mayor Adams being in office and his background as a police officer. I think it gives us the opportunity to really do a good deal of work-- Hello?
Brian: We got you. Go ahead, Kate. We're listening.
Kate: Oh, I'm sorry. I think it gives us a great opportunity to get some criminals off the streets. Eric Adams, he is not a racist. When we have these problems, I think we tend to err on the side of protecting ourselves from racism. I think in this case, we could do well to focus on ending some of the criminality that exists. Then we could add into this equation protections that respond to circumstances where people have been wrongfully identified. We have to have all of the tools and use all of the tools.
We can fight racism at the same time, but let's not keep benefiting the criminals by saying we can't even use this technology; we give more opportunities for people who keep killing, raping, and stealing to continue doing what they do. I think this is an opportunity to change that.
Brian: Kate, thank you very much for your call. Of course, I'm going to get your response, Albert, but let me actually expand on what she said, because I read part of a law journal article today on this topic by Henry Perritt, dean of the Chicago-Kent School of Law, who argues against blanket bans of the technology and says, like what Kate was saying, from an individual citizen's perspective. This law school dean was saying, "Yes, it needs to be effectively regulated, and its limitations in specific instances need to be able to be challenged in court when it's used as evidence."
Like whether it's reliable enough in a particular case to be used as identifying evidence. That kind of management of the technology is possible and promotes the greatest good, rather than banning it outright. Kate the caller and this law school dean in a journal article are basically saying, "Let's not throw the whole thing out, but let's also be realistic about its limitations and its racial justice risks and regulate them."
Albert: I think if this were a trade-off between safety and privacy, safety and civil rights, it would be a hard conversation. But when you look at a lot of the claims that are being made by facial recognition vendors about what the technology can do, very few of them actually stand up to scrutiny. When it comes to this idea that we can actually regulate the technology-- Let's look at the experience with the NYPD. We've used this technology tens of thousands of times in New York City, but it never actually gets reviewed by the courts.
The courts don't actually say, "Was this a proper use of facial recognition, or was it not?" That's because of the language that you read in the blueprint, that the technology is not the sole means to make an arrest. That is something the city has used for years: they use facial recognition to get what they claim is a match, then show that to an eyewitness and say, "Is this the guy?" That, verbatim, is what happens in some cases. Then they say, "Oh, we arrested this person because of an eyewitness, not facial recognition."
What we've seen is a system that's really run rampant without any oversight, without any checks and balances, without any of the scrutiny you would expect in the legal system, and there's no evidence that we actually could set up that magical scenario that gets us a softer, gentler facial recognition program. I think that that's something which is often held out there as an alternative in the abstract, but we haven't seen any jurisdiction in the country, any jurisdiction in the world that has actually been able to do that.
Brian: All right, so that debate is engaged on that part of Mayor Adams' Blueprint to End Gun Violence. There's also a City Council bill in process now that would regulate facial recognition technology in particular ways. We'll get into that with council members in future segments, but before we end, Albert, I want to take one call on the other piece of Adams' proposed use of technology that we mentioned at the beginning, and that is less intrusive scans to replace metal detectors at the entrance to schools, but still catch guns. I think Brian in Harlem is calling with a question about that. Brian, you're on WNYC. Thank you for calling in.
Speaker 2: Hi. Thank you, Brian. You have me?
Brian: Got you. You're a teacher, I see.
Speaker 2: I'm a former teacher, yes. I used to teach in the Bronx, for about 20 years. I had a lot of students who had scars, all sorts of injuries; box cutters especially are used to scar the face in particularly nasty encounters. There's even a song about it, I believe. I think most of my students, even though it is demeaning to go through the metal detectors, realize there's a benefit to it. There are very small things that can be weapons.
One of my students really-- he felt that school was the safe place. I used to have students I would say to, "You have to meet me after school." "I can't do that," he said. "Look, you have to," I said. "No, I don't have to. I'm going home. I'm going home with a group of friends. I always go home with them, so we don't get jumped."
Brian: Yes. You're saying that metal detectors did more to make your students feel safe at school than to make them feel intruded upon? Is that your memory of it?
Speaker 2: I'm saying there are two sides to it. I'm not going to say that definitely-- Certainly, I watched my students taking off their belts, doing all of this, but we do this when we go on an airplane. Of course, we don't go on an airplane every day. It is a safety measure, and it's not an unreasonable safety measure. I don't necessarily-- maybe the new technology could see box cutters.
Brian: That's a good question. Brian, I'm going to leave it there. I really appreciate your call. We are almost out of time, Albert, and I don't know if this aspect of the blueprint is in your bailiwick or just the facial recognition technology. Do you have a thought? Because I don't know what the new technology is in this case, but Eric Adams does not like metal detectors. He thinks metal detectors create a prison-type atmosphere in school, which he does think discriminates against students of color and creates an atmosphere that white students don't have to put up with in most places.
I don't know what this alternative technology is or what the policy or civil liberties arguments for and against would be, if there are any.
Albert: Yes, I think this gets to one of the real issues with a lot of the technologies being proposed, whether it's this or whether it's gun detection cameras: we don't have any of the details on how this technology would supposedly work. When we look at a lot of the research that's been done in the field, there are vendors out there that are willing to make so many claims. If we hand them millions of dollars, they promise to give us technology that'll solve all of this.
When you look at how these scanning devices fail at airports in large percentages of cases, and when you look at a lot of the ways these detection technologies can be circumvented, I'm really skeptical that there is any technology on the planet that can actually work as well as what Eric Adams described.
Brian: Albert Fox Cahn, founder of the group, known as STOP, the Surveillance Technology Oversight Project. He's also a practitioner-in-residence at NYU Law School's Information Law Institute, a fellow at Yale Law School's Information Society Project. Albert, we always appreciate when you come on. Thank you very much.
Albert: Thank you again, Brian.
Copyright © 2022 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.