MICHEL MARTIN, HOST:
When Apple announced that the new iPhone can use facial recognition technology to unlock the device, the response may not have been what Apple had hoped for. The feature immediately raised privacy and security concerns. To hear more about that, we're joined now by Clare Garvie. She's an associate at the Center on Privacy and Technology at Georgetown University Law Center and co-author of "The Perpetual Line-Up: Unregulated Police Face Recognition In America." She's with us now in our studios in Washington, D.C. Clare Garvie, thanks so much for joining us.
CLARE GARVIE: Thank you for having me on.
MARTIN: So lay out the privacy and security concerns for us. It sounds - I mean, the technology, first, if you think about it, sounds really cool. So what's the concern?
GARVIE: That's right. The technology is both convenient and really cool. And, frankly, I don't see too many privacy and security concerns with the way Apple has chosen to deploy face recognition. What I'm far more concerned about is that as face recognition becomes normalized - as it becomes something we use on an hour-to-hour basis to send an animated emoji, to check the weather, to send a text - we get very comfortable with it. And we forget that it's used by any number of other actors in ways we may not know about, ways that are both less accurate and more concerning for privacy than the way Apple has chosen to use it.
MARTIN: Well, give us the worst-case scenario. Give us some scenarios that would cause concern.
GARVIE: So right now in Russia, face recognition is being used to scan anti-government and anti-corruption protests, to identify and then publicly name the people at those protests. What this means is that these people will be subject to intimidation, if not arrest, for their political beliefs. Now, before someone says, well, wait, that's Russia - why should we in the U.S. care about that?
The fact remains that in the U.S., it's very much a rules-free zone when it comes to face recognition. Law enforcement agencies across the country use this technology in various ways without any laws governing its use. Evidence suggests that it was used on protesters after the death of Freddie Gray in police custody. It looks like face recognition was used on social media posts that protesters were sharing from demonstration sites.
So the law enforcement agents on the ground could, in almost real time, get the identities - the names - of the people at those protests. We're a country where we do not necessarily need to show our papers every time we walk down the street. If law enforcement demands our identity, we don't necessarily need to give it. And yet our faces - something we necessarily present in public - now do that work for us.
MARTIN: It sounds to me that your concern isn't so much this particular technology but - what? - that it opens the door to broader use? Is that really Apple's fault or responsibility?
GARVIE: I don't believe it's Apple's fault. And I think Apple has thought very, very carefully about a number of the security concerns. They have chosen to store the face template, if you will, locally on the phone, which means that it's a lot more secure against being hacked or stolen. The real concern is that, as face recognition becomes normalized, we may stop paying attention to the very real concerns we should be worrying about as we are increasingly subjected to face recognition that we can't opt out of.
MARTIN: That's Clare Garvie. She is an associate at the Center on Privacy and Technology at Georgetown University Law Center. She was kind enough to join us at our studios in Washington, D.C. Clare Garvie, thanks so much for speaking with us.
GARVIE: Thanks for having me on.

Transcript provided by NPR, Copyright NPR.