ARI SHAPIRO, HOST:
Twitter users may start noticing blue exclamation marks on posts this week. It's part of an attempt to stop the spread of misleading and possibly harmful content. So how exactly does Twitter define what qualifies? I put that question to Twitter's Yoel Roth.
YOEL ROTH: We're looking for evidence that the video or image or audio has been significantly altered in a way that changes its meaning, and we don't really care about the technology that was used to accomplish that. Now, in the event that we find evidence that the media was significantly modified, the next question we ask ourselves is, is it being shared on Twitter in a way that is deceptive or misleading?
SHAPIRO: So let's imagine there's a video out there of me endorsing a presidential candidate, which I haven't done, and this is clearly manipulated or synthesized. What happens when somebody goes to retweet or reply to that tweet?
ROTH: So there are two possible scenarios here. The first one is, in situations where the media has been modified and we believe that it is actively harmful - so it could put somebody in harm's way, or it could interfere with somebody's ability to exercise their fundamental rights, like participating in an election or speaking freely - in those situations, we would require the content to be removed from Twitter. We think that's generally unlikely to happen. In the majority of circumstances, like the one that you suggested, we would apply a label to that tweet. The label would indicate clearly that the video, wherever it appears, has been manipulated. And then if you tap on that label, it'll take you into an experience that provides additional expert context about that tweet.
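The two-step test Roth describes can be summarized as a simple decision flow: first, was the media significantly altered; second, is it being shared deceptively; and if so, is it likely to cause harm. The Python below is a minimal illustrative sketch of that flow only; the class, field, and function names are hypothetical and do not represent Twitter's actual systems.

```python
# Illustrative sketch of the enforcement flow described in the interview.
# All names here are hypothetical, not Twitter's implementation.

from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    NO_ACTION = "no action"
    LABEL = "label with link to expert context"
    REMOVE = "remove from platform"


@dataclass
class MediaAssessment:
    significantly_altered: bool   # meaning changed by editing or synthesis, regardless of technology
    shared_deceptively: bool      # presented in a way likely to confuse or deceive
    likely_to_cause_harm: bool    # e.g., risk to safety or to exercising fundamental rights


def moderation_action(assessment: MediaAssessment) -> Action:
    """Return the action implied by the two-step test in the interview."""
    if not assessment.significantly_altered:
        # Ordinary, expected editing (like tightening a broadcast interview)
        # does not count as manipulated media.
        return Action.NO_ACTION
    if not assessment.shared_deceptively:
        return Action.NO_ACTION
    if assessment.likely_to_cause_harm:
        return Action.REMOVE   # expected to be the rare case
    return Action.LABEL        # the common case: a label plus expert context


if __name__ == "__main__":
    fake_endorsement = MediaAssessment(
        significantly_altered=True,
        shared_deceptively=True,
        likely_to_cause_harm=False,
    )
    print(moderation_action(fake_endorsement))  # Action.LABEL
```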
SHAPIRO: I'm curious who draws the line about what gets flagged as manipulated media. I mean, the audio that NPR listeners are hearing right now is manipulated. You and I have pretaped this interview. A very talented producer has taken out our stumbles and umms (ph) and uhs (ph) and dropped the Q&As that didn't seem as important. Would a tweet with this interview get flagged as manipulated audio, which, by definition, it is?
ROTH: We definitely don't want to end up in a circumstance where ordinary and expected practices of editing get flagged as manipulated media. We're instead looking for manipulation that changes the meaning of content and that people are likely to be confused or deceived by.
SHAPIRO: You know, in January, NPR did a poll with "PBS NewsHour" and Marist. And it found that 82% of Americans think they will read misleading information on social media this election, and three-quarters said they don't trust tech companies to prevent their platforms from being misused for election interference. What do you say to those people, a majority of people in the United States?
ROTH: We've made really significant investments since 2018 in protecting the integrity of the Twitter platform and building product experiences that help people find credible information. But I think the challenge there is there's a lot of people engaged in a lot of different conversations who may be real, authentic people sharing things that aren't true. And the challenge for us as a technology company is how to create a product that helps you find accurate, credible information while also ensuring that people do have the ability to express themselves freely on issues that they care about.
SHAPIRO: Sounds like there is an implicit admission there that this is not a problem that will ever go away - just maybe a problem you can give people tools to help fight back against a little bit.
ROTH: The ability for people to dispute and talk about issues of the day in public is a really important function for a platform like Twitter to serve. I think the responsibility of a technology company is to create a product that is free from attempts by people to put their thumb on the scale and shift conversations in ways that give false or misleading information unearned reach, while ensuring that people still have the ability to participate authentically in conversations about the issues they care about.
SHAPIRO: Yoel Roth is Twitter's head of site integrity.
Thank you for the conversation.
ROTH: Thank you.
Transcript provided by NPR, Copyright NPR.