SCOTT DETROW, HOST:
Facebook CEO Mark Zuckerberg will be facing tough questions when he appears before Congress in the coming days. At the top of the list, the scandal involving Cambridge Analytica. That's the company that's been accused of improperly obtaining data from millions of Facebook users, then using that information for its work on political campaigns, reportedly including the Trump campaign.
Let's hear now from someone with a long history in Silicon Valley. Yonatan Zunger, a former Google engineer, recently wrote in The Boston Globe that this scandal is just more evidence that the entire tech industry faces an ethical crisis.
YONATAN ZUNGER: The method by which Cambridge Analytica got the data from Facebook was a system Facebook built almost specifically for the purpose of making it easy for companies to harvest information about networks of individuals.
DETROW: Right. And you write that over and over again throughout history, and also in recent years in the tech field, companies work on something with a specific intention, and then the product is used in a way that's only a slight degree off from that intention - in a way nobody thought about - and causes a lot of harm.
ZUNGER: Absolutely. Something I always tell people is that any idiot can build a system. Any amateur can make it perform. Professionals think about how a system will fail. It's very common for people to think about how a system will work if it's used the way they imagine it, but they don't think about how that system might work if it were used by a bad actor, or how it could be used by a perfectly ordinary person who's just a little different from the person designing it.
DETROW: How do companies have those conversations you mentioned about the downsides of the services they're coming up with?
ZUNGER: This is the single most important thing that most companies can be doing right now. First and foremost, companies need to pay attention. And, in fact, individuals working at these companies need to be thinking about how each product could actually be used in the real world.
If you build a product that works great for men and is going to lead to harassment of women, you have a problem. If you build a product that makes everyone's address books 5 percent more efficient and then gets three people killed because it happened to leak their personal information to their stalkers, that's a problem.
What you need is a very diverse working group that can recognize a wide range of problems, that knows which questions to ask and has the support both inside the company and in the broader community to surface these issues and make sure that they're taken seriously and considered as genuine safety issues before a product is released to the public, as well as after.
This is different from a traditional compliance function, where they come in at the very end and say, no, I'm sorry, you can't launch this - at which point a business leader is just going to say, well, we need to launch it, and it's too late to change it. Because they were in the room from day one, it makes a huge difference.
DETROW: A lot of people would say, especially here in Washington, where we are, that the answer could be federal regulation. Do you think that's the right way?
ZUNGER: I think regulation has a place, but it's important to handle it very carefully. In particular, everyone agrees that building codes are a great idea, and I think most people also agree that our elected representatives are not the right people to decide what kind of insulation is appropriate for use in the garage.
What you want is regulation and other mandatory mechanisms - ethics standards, review boards, whatever processes you have - that specify goals and objectives, which we can discuss as a society. And then the actual translation of that into implementation is something that should be done by people who deeply understand the field.
DETROW: Well, Yonatan Zunger, formerly of Google and now with the tech company Humu. Thank you so much for joining us.
ZUNGER: Thank you very much.