AILSA CHANG, HOST:
With everyone hunkered down, it is boom times for Facebook, which, we should note, is among NPR's financial supporters. More people are relying on Facebook for updates on the coronavirus. And the company has made some changes so that when people go to Facebook or Instagram, they first see information from the World Health Organization or other credible sources. The harder challenge, though, is stopping the spread of misleading and harmful posts. I spoke about that with Facebook's vice president of global affairs and communications, Nick Clegg.
NICK CLEGG: We do not allow misinformation that can lead to real-world harm to circulate on Facebook. So if people say drinking bleach is going to help you vaccinate yourself against coronavirus, that is dangerous. It leads to real-world harm. We will not allow that to happen. We won't even allow folks to say social distancing makes no difference in dealing with this pandemic. The thing...
CHANG: But what's the line? I mean, what is real-world harm? How would you define that? Isn't anything that's false considered harmful to some extent?
CLEGG: No. I think things that make people more likely to leave themselves vulnerable and susceptible to catching the virus - that is the real-world harm.
CHANG: OK, so it's, like, a physical harm that you're talking about.
CLEGG: Yeah, it's a physical harm. But to be absolutely clear, for other forms of information which are misleading or dubious but don't have that knock-on effect of physical harm, we still act. We will refer conspiracy theories, for instance, about the origin of the virus to a network of 55 fact-checking partners that we have. If they find that the claim is false, we will downgrade it massively. So, firstly, it's much, much harder to find in your news feed. Secondly, we'll put a sort of filter on the screen if you do find it, saying this has been shown by an independent fact-checker to be of dubious accuracy. And we will link people to more authoritative information about it. And that will apply to people who are trying to share that post or have shared it in the past.
CHANG: But it does seem like this could lead to a lot of murky, gray-area decisions. I mean, I'm imagining, if your criterion for removal is simply real-world harm - that is, physical harm - what do you do about xenophobic posts that, you know, encourage racism against Chinese people, for example? Because when people say that this is the Chinese virus, that it's a Chinese disease, that racism could eventually lead to real physical harm against Chinese people.
CLEGG: We have long-established policies that any incitement of violence and hatred is not allowed on our platform. On the other hand...
CHANG: That would be taken down?
CLEGG: I'm afraid I'm going to be very reluctant to get into hypothetical examples because, as your question rightly identified, there is always a fine line - always - between the liberty to say often deeply objectionable things - that's one of the defining qualities of living in a free society - and material that should be removed. And you're quite right to push on this. The line, of course, is a...
CHANG: Let me ask you - it sounds like Facebook is doing so much to stop the spread of fake news regarding the coronavirus, but your company took a lot of criticism after the 2016 election for not doing enough to stop misleading political information. So why can't Facebook be just as proactive with political posts?
CLEGG: First, no one - and it wasn't just Facebook, of course, but the world as a whole - really anticipated in that moment that there would be such a concerted attempt by external forces...
CHANG: Right, that was then. But given the lessons we have learned since, will Facebook be more proactive about political posts during the 2020 election, using the same rigor it is using to weed out misleading information about the coronavirus?
CLEGG: We do, of course, have some limits. You cannot use your freedom as a politician in the United States, for instance, to say things that will lead to real-world harm. But beyond that, in a democracy with an independent press, and with the claims and counterclaims that politicians make about each other, we think it's very important that private companies allow voters to make their own judgments about what politicians are saying about the future of their country. When it comes to a medical pandemic, by contrast, which is underpinned by science and by authoritative institutions such as the CDC and the WHO, it's much easier for us to act under the strict expertise and guidance of those institutions themselves.
CHANG: Facebook vice president of global affairs and communications Nick Clegg joined us via Skype.
Thank you very much.
CLEGG: Thank you.

Transcript provided by NPR, Copyright NPR.