LULU GARCIA-NAVARRO, HOST:
And you may have started it like me, by scrolling on your phone, checking your social media feeds - Twitter, Instagram, Facebook. We're going to start this hour talking about Big Tech. A series of damning reports based on internal documents has come out from The Wall Street Journal about Facebook and its other platform, Instagram, from Instagram causing low self-esteem in teens to anti-vaccine misinformation flooding Facebook comments. The journal writes, quote, "Facebook is acutely aware that the products and systems central to its business success routinely fail and do harm," and that the company, quote, "has often made minimal or ineffectual efforts to address the issues." And we should note here, Facebook is an NPR sponsor.
We're going to take a wider view now. Joining us are three men who've been thinking a lot about the impact of Big Tech in our lives. They are three Stanford professors. Mehran Sahami is one of the inventors of email spam-filtering technology - thank you to him - which he did while he was at Google. Rob Reich helped create the global movement, Giving Tuesday, and is a philosopher who is the associate director of the Institute for Human-Centered Artificial Intelligence. And Jeremy Weinstein first served in the White House under President Obama and now is a political scientist. Welcome to you all.
JEREMY WEINSTEIN: Thanks so much for having us.
ROB REICH: Thank you.
MEHRAN SAHAMI: Great to be here.
GARCIA-NAVARRO: They have written a book called "System Error: Where Big Tech Went Wrong And How We Can Reboot." And, Rob, I'm going to start with you and those Wall Street Journal reports - your reaction to what they show.
REICH: Yeah, these Wall Street Journal reports are just the latest blockbuster in investigative journalism about Facebook, showing that the company has known for quite some time through its own internal research that its very products - the Facebook newsfeed, Instagram's infinite scroll - are causing significant harms in a variety of different ways. And it just goes to show how the companies, in the wake of 2016 and the election of Trump and the clear Russian interference on the platforms, have tried to staff up with genuinely capable people. But these people evidently have very little power within the companies.
And the overriding concern at Facebook, in particular, is that there is exactly one decision-maker at the company, and that person's name is Mark Zuckerberg. He is the unelected governor - or a dictator, if you will - of the speech environment for nearly 4 billion people. And for me, this raises the question, at the end of the day, is Facebook ultimately something like Big Tobacco, that it can have lots of people who are working very hard to try to surface concerns and make various tweaks or tuning of the algorithm to make improvements to the platform, but fundamentally, the product itself is just basically broken? And absent external pressure, it won't be fixed.
GARCIA-NAVARRO: I mean, Mehran, the Wall Street Journal report shows that, actually, the flaws that we're seeing now is the platform working as it is designed. That's the case you make in the book. Take us back to what you view as the sort of original sin, if you will, of Big Tech - I mean, how we got here.
SAHAMI: You know, there is decisions going on in these companies that we don't get to see, and these decisions are driven by a lot of data-based metrics that the companies want to optimize. And that includes things like the amount of time people spend on platforms, the amount of engagement by clicking on particular pieces of content, the number of friend connections they make. And even, you know, as noted in some of these articles, when the harms come to the light of the executives, they continue to optimize these metrics because they're the things that are generating revenue. They're the things that are generating people's engagement with the platform. And so it becomes hard for them to break out of this mold because they're so driven by the metrics.
GARCIA-NAVARRO: I want to just jump in here because I need to mention that you three are not here together by chance. I mean, one's a political scientist, one's a philosopher and one is - deals with computers. You wrote this book based on a class you teach together at Stanford. And I want your origin story here, Jeremy, about why you three felt the need to talk about this as a group.
WEINSTEIN: So you can't tell the story of Stanford without telling the story of Silicon Valley, and vice versa, that Stanford has experienced a sort of ascension to the top echelons of the university system in the United States and the world as part - a function of its relationship with Silicon Valley. And the three of us came together in 2016 as the harms of Big Tech were increasingly becoming visible. We needed to create an environment in which our students were not just accepting at first blush the mission statements of Big Tech companies, that Big Tech is an unmitigated good for society, but thinking hard about disruption and what it means, thinking hard about the design of new technologies and what values are embedded in those technologies as they're designed.
And so we began to teach together, to create a class that was available not only to our CS students, which is the dominant major on campus, but also to other students in social sciences and the humanities because we believe that addressing the issues of Big Tech isn't only the responsibility of technologists. It's the responsibility of all of our students and, indeed, all of us as citizens. And with this book, "System Error," we're attempting to bring this framework to a much broader audience because our view is that democracy has got to get in this game, that ultimately we're not going to address the issues that The Wall Street Journal has covered this week without engaging in our politics.
GARCIA-NAVARRO: Mehran, I mean, following on, on what Jeremy and Rob said, I mean, basically, you're talking about outcomes that are divorced from programming. People are just thinking about the product and not necessarily what the impact will be. Can you give me an example of that? Because you have several in the book about people thinking up what could be great ideas that have unintended consequences.
SAHAMI: One example of that - amazon.com tried to build a system to screen resumes to try to determine who to give job interviews to a few years ago. What they found in building the system was that it actually had significant gender bias in it against women.
GARCIA-NAVARRO: Color me shocked.
SAHAMI: Yeah, exactly, right? And so the issue is that if such a system's just deployed out in the wild, it's going to be making these kinds of sexist decisions without people knowing. Luckily, at least in this situation, there was a team that went and audited the algorithm and determined that it was making these kinds of decisions. But then the really surprising part is they went and tried to fix it, and they couldn't actually eliminate all of the bias that was in the system.
Now, they ended up scrapping the system, which is the right thing to do in that case. But it really raises the point, first of all, that if such a technologically sophisticated company can't actually fix the problems, what systems do we not know about, right? And the fact that these computing, you know, algorithms are making much more important decisions in our lives - who gets credit, who gets released on bail in the criminal justice system, our dating lives through dating apps - and we don't have much transparency into what's going on.
GARCIA-NAVARRO: And meanwhile, these companies are amassing sort of huge amounts of information on us. I mean, you ask in the book why there's such an uproar over government surveillance and so little when it comes to the private sector. I mean, these companies know everything we do online, which means they know almost everything we do, full stop.
WEINSTEIN: This is Jeremy. You know, one of the things that is sort of unavoidable about the present moment is that we still operate in a Wild West of data collection on our personal lives with respect to Big Tech and social media platforms. And, you know, you have to contrast that with the approach that we've taken in the United States to protecting privacy when it comes to our personal health circumstances or our educational records for ourselves or our kids. What we've had in the case of Big Tech is no explicit effort to balance the value of privacy that we care about against the potential that Big Tech companies have to learn from our data in ways that enable them to dominate the advertising business and personalize their services. And we're at a critical pivot point where concerns about privacy because of the misuse of that power by the large tech companies have reached such a fever pitch that we're beginning to see regulatory momentum - obviously in Europe, more recently in California - that reasserts this right to privacy and puts the power back in the hands of users to make judgments about just what data that they want to share and how they want it to be used.
GARCIA-NAVARRO: I mean, Rob, the worry over Big Tech seems to be the one unifying thing in this very partisan world. But on the left and the right, there are very different concerns. So do you see both sides of this partisan divide unifying on this issue of Big Tech and on what exactly needs to be regulated?
REICH: This is, I think, one of the most important things to communicate about the problem as a whole. Silicon Valley is famously ahistorical, completely disinterested in learning from the past. And we are convinced, from having been present in Silicon Valley now for 20-plus years, that we are exiting a long era of principled regulatory indifference from Washington, D.C., and from other states and governance bodies across the world. We've lived through what basically is a 25-year, 30-year period of initial techno-optimism and then the past five, six, seven years of a backlash against Big Tech.
And so in all of that time, we've just relied upon the people - the small number of people and the unrepresentative people - basically, white guys working in Silicon Valley companies - to make all of these decisions for us. And the future, we think, is one in which we're going to have an array of different forces that make decisions about the digital revolution that aren't just in Silicon Valley tech companies. And that's the essential next step in this grand story.
GARCIA-NAVARRO: Mehran, I'm going to give you the last word on this. I mean, the idea that technology is in tension with democracy itself is something that is deeply frightening. As someone who is steeped in this culture, I mean, what is your prescription for what needs to happen next?
SAHAMI: Well, I think most of the tech sector would like you to believe that your only choices are, as an individual, that you can, you know, choose to use a particular app or not choose to use a particular app. There's actually things - more things that an individual can do themselves. I mean, you can set the privacy settings in the applications you use to control what information people get about you. You can select what kind of web browser you use or what its privacy settings are to control what information goes to websites about you. There's things like that you can do, but there's really this bigger issue that these are systematic problems, and they require systematic solutions. So we need democracy here. It's people getting involved in the system to get the societal outcomes we want.
GARCIA-NAVARRO: That's Mehran Sahami, Rob Reich and Jeremy Weinstein, the authors of "System Error: Where Big Tech Went Wrong And How We Can Reboot." Thank you all very much.
SAHAMI: Thanks so much for having us.
WEINSTEIN: It was great to be with you.
REICH: Thanks so much, Lulu.
(SOUNDBITE OF DEVIL BANDIT SONG, "HEAT WAVES") Transcript provided by NPR, Copyright NPR.