Facebook: Firm Working With Trump Campaign Stole 50 Million Users' Data

MICHEL MARTIN, HOST:

Now to another explosive story that has potential implications for both politics and the law. Members of Congress are calling for an investigation into the data firm Cambridge Analytica after new reports from the New York Times and The Observer of London. According to these reports, the personal information belonging to some 50 million people was stolen and used to target them in ways they would not have seen and could not have known about. This according to a former insider named Christopher Wylie.

He says this information was used to influence voters in the 2016 U.S. election. Cambridge Analytica has ties to the Trump campaign: in 2016, the firm was hired to run the campaign's data operations, and Steve Bannon is a former vice president of the firm. Facebook says the data was obtained legally but misused afterward. Even so, it has suspended Cambridge Analytica from its platform.

We wanted to understand how all this could have happened and what it might mean, so we've called Antonio Garcia Martinez. He is a Silicon Valley veteran and early Facebook employee who headed the company's targeting efforts, and he is now a contributor at Wired.

Antonio Garcia Martinez, thank you so much for speaking to us.

ANTONIO GARCIA MARTINEZ: Thank you for having me.

MARTIN: Well, as a person who has worked closely with big data, could you just tell us your reaction to these reports?

MARTINEZ: Yeah. It's not totally surprising. I mean, the thing to keep clear here is that the data that leaked out isn't really from the ad system but rather from what's internally called "platform." And what that means is there are apps on Facebook, and, you know, you opt into these as a user, and that's how this data breach happened. And it's really this weird thing where there is data coming out of one side of Facebook being used on the other side, which is on the outside. But, you know, frankly, I wasn't hugely surprised. I mean, this is one of those things that's been part of the platform for years now, and the hard truth is there's really not much Facebook can actually do about it.

MARTIN: You know, I was going to ask you about this because this is being described as a data breach, which implies that the information was stolen - and that is the word the Observer report uses. It says that this information was stolen, or that Facebook basically left the door open, allowed this to happen and didn't put enough controls in place to keep this from happening. Now, Facebook says that they are investigating but that people did consent to give out their information, so this was not a data breach. How do you respond to that?

MARTINEZ: Yeah, no. I mean, as that term is typically used, this is not a data breach. That really gives the impression that there was some sort of security flaw, that Facebook somehow got hacked and literally everything from, you know, your messages to your partner to your, you know, childhood photos was potentially stolen. But that's not the case at all. You know, most of your listeners, probably, at some point, have opted into a Facebook app, and you do that even when, for example, you're using Facebook to log into a website or a mobile app. When you consent to a certain app accessing your Facebook data, that's what you're opting into.

So this wasn't that somebody broke into Facebook and stole data. This is users who opted into an app, and then the data that that app got via this opt-in was misused in ways that are not in accordance with Facebook's data policy. So to the extent that we can distinguish between what a breach implies - a break in Facebook's security - and data exiting in a way the user consented to and then being misused, it's definitely the latter situation here.
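
To make the distinction concrete, here is a minimal sketch in Python of the consent model Martinez contrasts with a breach: an app can read only the data categories a user has granted it. The scope names are invented for illustration and are not Facebook's actual API:

    # Sketch of scope-based consent: an app reads only what the user granted.
    # Scope names here are invented, not Facebook's real ones.
    granted_scopes = {"public_profile", "user_likes"}  # what the user opted into

    def app_can_read(category: str) -> bool:
        """Return True only if the user consented to this data category."""
        return category in granted_scopes

    print(app_can_read("user_likes"))        # True - consented, so no "breach"
    print(app_can_read("private_messages"))  # False - never granted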

MARTIN: So let me take this from another direction. One way to look at this is that the information may have allowed the Trump campaign to microtarget specific groups of voters, so how is this different from what all campaigns do these days? Or businesses for that matter? I mean, one of the key sources for this reporting, Christopher Wylie, was studying fashion trend forecasting, and he just applied the same kind of logic to this other enterprise. So is this different from what users could reasonably assume was being done with their data?

MARTINEZ: You know, that's a great question. I mean, the short answer is no. This is a somewhat obscure field of advertising targeting called psychographics - that's the buzzword for it. And what it basically means is it tries to capture a person's psychological state, you know, down to things like openness or neuroticism - and these are words quoted from some of the leaked emails that The Guardian published, right? So they try to profile a person along certain psychological dimensions and then understand how likely that person is to vote for or against some issue or candidate, right? And what they were trying to get at was using Facebook to understand how, you know, neurotic, say, you were. And if that neuroticism, in their minds, would incline you to vote for or against somebody, they would try to target you as a result.
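
A minimal sketch of the profile-then-target selection Martinez describes, in Python. The trait names, scores, and threshold are all invented for illustration, not anyone's actual model:

    # Each user is scored along psychological dimensions (e.g. openness,
    # neuroticism) from whatever data the modeler has. Scores are invented.
    profiles = {
        "user_a": {"openness": 0.82, "neuroticism": 0.31},
        "user_b": {"openness": 0.40, "neuroticism": 0.77},
        "user_c": {"openness": 0.55, "neuroticism": 0.64},
    }

    def target_audience(profiles, trait, threshold):
        """Select users whose score on one trait crosses a cutoff."""
        return [u for u, t in profiles.items() if t[trait] >= threshold]

    # E.g., aim a fear-themed message at users modeled as highly neurotic.
    print(target_audience(profiles, "neuroticism", 0.6))  # ['user_b', 'user_c']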

This isn't new. I mean, this sort of psychographic stuff has been around since before the internet - it's been around for decades. It's also been done on Facebook before; in the past, there have been other outside companies that have tried to psychologically profile users. What's strange is that the data used to actually build these psychological models was unethically arrived at. That, to me, is the critical issue. As for how powerful this data was - to be honest, I think most advertising professionals don't buy much into this whole psychographic thing.

MARTIN: So could you just drill down on something you just said - that there was something unethical, or potentially unethical, about the way this information was arrived at, or was taken, by the firm that eventually used it. What was unethical about that?

MARTINEZ: Well, if the allegations are to be believed, what happened was that Cambridge Analytica or, you know, their confederates and consultants paid test subjects to take a political poll. And in the course of taking that political poll, they had to opt in to a Facebook app, as we often do when we log into other sites using Facebook. The difference being that, in this case, they took the results of those political polls - right? - their inclinations, their psychological profiles - and they joined that with all the Facebook data that they managed to suck out of Facebook via this Facebook app that the users had opted in to - not just those Facebook users' data but also their friends'.
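
A minimal sketch of that join, in Python, with invented field names and toy records - an illustration of the technique Martinez describes, not Cambridge Analytica's actual pipeline:

    # Toy illustration: poll answers keyed by user ID, merged with profile
    # data harvested via the app opt-in. All fields are invented.
    poll_responses = {
        "user_a": {"issue_stance": "undecided", "neuroticism": 0.7},
    }
    harvested_profiles = {
        "user_a": {"page_likes": ["..."], "friends": ["user_b", "user_c"]},
    }

    # Join the two datasets on user ID to build rows for a targeting model.
    training_rows = [
        {"user": uid, **poll_responses[uid], **harvested_profiles[uid]}
        for uid in poll_responses.keys() & harvested_profiles.keys()
    ]
    print(training_rows[0]["friends"])  # ['user_b', 'user_c']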

On Facebook's platform - not anymore, but until not too long ago - if you opted into an app, that potentially gave the app access to your friends' data as well. And that's why the number of users whose information has been compromised is as high as it is - 50 million people. Cambridge Analytica didn't pay 50 million people to take a poll. It paid on the order of tens of thousands, but each of those individuals had hundreds or thousands of friends, which is why the number is so large. And that's what's unethical about it. They basically paid people to take a poll, but then they hoovered out all of their Facebook data and joined it with that political data. That's the illegal bit.
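
A back-of-the-envelope check of that fan-out - the seed count and friends-per-user below are illustrative assumptions of the right order, not figures from the reporting:

    # Friend-graph fan-out: a modest number of opted-in poll-takers exposes
    # a vastly larger population. Numbers are assumptions for illustration.
    seeds = 50_000        # poll-takers who opted into the app (tens of thousands)
    avg_friends = 1_000   # "hundreds or thousands" of friends each; take ~1,000

    reach = seeds * avg_friends
    print(f"{reach:,}")   # 50,000,000 - the order of the reported 50 million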

MARTIN: That's Antonio Garcia Martinez. He's the former head of Facebook's effort to develop ad-targeting strategies. He's now a contributor to Wired.

Antonio Garcia Martinez, thank you so much for speaking with us.

MARTINEZ: It was a great time. Thanks.

MARTIN: We should also mention that we reached out to Facebook for comment. They said they are, quote, "conducting an internal and external review," unquote, to make sure the data no longer exists. Transcript provided by NPR, Copyright NPR.