MICHEL MARTIN, HOST:
Do you ever find yourself groping for your keys or searching your house for your eyeglasses or wondering where your kid left her backpack? If so, you might have been thinking about Apple AirTags. Those are tiny tracking devices about the size of a quarter. They're being marketed as a way to help keep track of things like keys or kids' backpacks. But now there's growing concern that they're being used to track people without their knowledge. This past Wednesday, New York Attorney General Letitia James issued a consumer alert about these devices, warning New Yorkers to be aware of potentially malicious uses like stalking.
We wanted to learn more about this technology and the privacy concerns surrounding its use, so we've called Eva Galperin. She is the director of Cybersecurity at the Electronic Frontier Foundation. That's a nonprofit that works to defend civil liberties in the digital age. And she is with us now. Eva Galperin, welcome. Thank you so much for joining us.
EVA GALPERIN: Hi. Thanks for having me.
MARTIN: So before we jump in, could you just explain how these Apple AirTags work for people who might never have seen them? As I said, they're shaped like a coin, but what exactly do they do, and how do they work?
GALPERIN: An AirTag pairs over Bluetooth with your phone, and then you attach it to whatever item it is that you don't want to lose. When you have misplaced the item, you can go to your phone, and it will tell you where that item is located using Find My. The way AirTags differ from other physical trackers is that those trackers usually depend on a network of other phones that have the same app installed. What Apple did was, essentially, decide to use the entire network of devices with Find My installed on them, which is nearly every iPhone that exists.
MARTIN: So the idea is that this would be your device that you would use for yourself. And what I think I hear you saying is that because of the way this product is designed, you could attach an AirTag to somebody who is not you, and they would never know.
GALPERIN: You can. And this was a concern the moment the product came out. And in response to these concerns, Apple did include some anti-stalking mitigations. For example, when the AirTag first came out, if it was out of range of the phone it's paired to for 36 hours, it would start to emit a beep. That beep is about 60 decibels, which is about as loud as your dishwasher. And you still get, you know, 36 hours of free stalking, which seems like a little much. That's pretty invasive.
MARTIN: So Apple recently released a statement about AirTag and unwanted tracking. In that statement, they said that they have been, quote, "actively working with law enforcement on all AirTag-related requests," unquote. You've shared with us that there have been some improvements, but they're not - in your opinion, they're not enough. What else should they be doing, and can they do those things?
GALPERIN: Well, in December, Apple came out with an app that you can install on your Android phone that will tell you whether or not you're being tracked by an AirTag. But that app does not work the same way as the iPhone's built-in capabilities. You have to proactively download the app, and you have to proactively run a scan. And that is a much higher barrier to entry than just having everything running automatically in the background on your phone.
MARTIN: At its core, this is a privacy issue. And this certainly isn't the first time, as you just said, that privacy concerns were raised with a new technology. The conflict often seems to boil down to the fact that lawmakers are slow to regulate fast-developing technologies. Is there a way that you think policymakers should be thinking about addressing privacy before something bad happens? Because what I'm hearing you say is that this could have been anticipated - that all technologies have positive benefits, and they all have malicious uses. So is there a way that they could, or should, be thinking about these strategies before something terrible happens?
GALPERIN: Oh, absolutely. And I think those are decisions that need to be made not necessarily at the legislative and policy level, but inside the company, and that really need to come as a result of a change in the culture. I think that part of the reason the AirTag came out the way that it did was a blind spot among Apple developers, who struggled to imagine a person who doesn't own Apple products. In the case of, you know, what should we be doing...
MARTIN: Can I just ask you one more thing, Eva? Excuse me. Could it also be that there's - that gender plays a role here...
GALPERIN: Oh, absolutely.
MARTIN: ...That perhaps it did not occur to developers that this would be a particular concern for women?
GALPERIN: I think that it did occur to them to include some anti-stalking mitigations, but I think that if there had been more women involved in this process, the anti-stalking mitigations would have been more robust, and concerns about stalking would have been front and center rather than a tacked-on afterthought to the initial product.
MARTIN: In the consumer alert, Attorney General Letitia James recommended that consumers listen for unfamiliar beeping and watch for the Item Detected Near You notification on their iPhones. Are there any other steps you would recommend that people take to protect themselves and their things from unwanted tracking?
GALPERIN: Yes. For one thing, I wouldn't count on the beep. The beep is really easy to muffle or disable. But if I didn't own an iPhone, I would download Apple's detection app for Android, and I would proactively run scans regularly if I were concerned about being followed by an AirTag.
MARTIN: Is there something that law enforcement could be doing about this?
GALPERIN: One of the big problems that we have now, not just with AirTags, but with software which is covertly installed on people's devices and then used for tracking, is that sometimes the police simply don't have the training. They don't know what they're looking at. They don't understand how the stalking works. And they will tell people, well, this requires a full forensic analysis that will require us to, you know, seize all of your devices. Or even worse, they will simply say, you're not being tracked. You're imagining things. They will gaslight the victim.
And so one of the things that I've been working on, with Senator (ph) Barbara Lee, is a police training bill in the state of Maryland, and it's in the state Senate right now. It proposes that police at the police academy should receive training on how tech-enabled stalking works and how to recognize it.
MARTIN: Oftentimes when privacy advocates raise these things, a lot of regular users think, oh, they're just being extra, and then everybody else catches up. Are there some things that you routinely do that you could recommend to us?
GALPERIN: The advice that works for me is not necessarily the advice that works for most ordinary people. I don't run around telling everybody that they need to be worried about everything all the time because that's a really good way to get everybody to just ignore your advice or to drive themselves crazy. I think that people need to have a clear-eyed view of what they're trying to protect and who they're trying to protect it from and to do only the steps that get them that protection because trying to protect everything from everyone all the time is just unfeasible and exhausting.
MARTIN: That's Eva Galperin, director of cybersecurity for the Electronic Frontier Foundation. Eva Galperin, thanks so much for being here and sharing this expertise with us.
GALPERIN: It's my pleasure. Transcript provided by NPR, Copyright NPR.