Big Tech Companies Are Struggling With How To Best Police Their Platforms

AUDIE CORNISH, HOST:

We're going to take a closer look now at how those companies left out of the White House summit - namely Facebook, Google and Twitter - treat far-right content. Joan Donovan is a leading researcher in this area. She is with the Harvard Kennedy School's Shorenstein Center. Welcome to the program.

JOAN DONOVAN: Thank you for having me.

CORNISH: So what are some of the criteria these tech platforms use in deciding what content stays up, what content they'll take down, or even whether to ban a user?

DONOVAN: So one of the ways in which they assess content has a lot to do with explicit calls for violence or explicit condemnation of known marginalized groups. They will do this either by watching the content and making an assessment themselves, or people who view the content may flag it, which puts it into the content moderation system. Then it's up to a content moderator to decide whether the content stays up or gets pulled down.
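
Donovan's description amounts to a review queue: flags route content into the moderation system, and a human moderator decides each item. Below is a minimal sketch of that flow in Python; the function names, queue structure, and decision rule are hypothetical assumptions for illustration, not any platform's actual API.

```python
# Minimal sketch of the flag -> review queue -> moderator decision flow
# Donovan describes. All names here are illustrative assumptions.
from collections import deque

review_queue = deque()  # flagged items awaiting a human moderator

def flag(content_id: str, reason: str) -> None:
    """A viewer flags content, routing it into the moderation system."""
    review_queue.append({"content_id": content_id, "reason": reason})

def moderate(decide) -> list:
    """A moderator works the queue; `decide` returns True to keep an item."""
    removed = []
    while review_queue:
        item = review_queue.popleft()
        if not decide(item):
            removed.append(item["content_id"])
    return removed

# Usage: flag two items; pull down anything with an explicit call for violence.
flag("video-123", "explicit call for violence")
flag("post-456", "borderline")
print(moderate(lambda item: item["reason"] != "explicit call for violence"))
# ['video-123']
```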

In terms of banning users, that usually takes a whole set of different processes, where a number of pieces of content or video have been flagged at different times. Users are usually given three strikes before they are banned, and each strike comes with consequences: the platform pulls back some of the account's features. They'll stop the user from being able to monetize or use advertising, and they'll stop them from being able to livestream. And then, in the final instance, they will remove the account entirely.
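
That escalation reads like a simple state machine. Here is a minimal sketch of a hypothetical three-strike ladder; the specific feature names and the order in which they are removed are assumptions drawn from Donovan's description, not any platform's documented policy.

```python
# Hypothetical three-strike enforcement ladder, per Donovan's description.
# Feature names and ordering are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    strikes: int = 0
    features: set = field(
        default_factory=lambda: {"monetization", "livestream", "posting"}
    )
    banned: bool = False

def apply_strike(account: Account) -> Account:
    """Each strike pulls back a feature; the third removes the account."""
    if account.banned:
        return account
    account.strikes += 1
    if account.strikes == 1:
        account.features.discard("monetization")  # first strike: demonetize
    elif account.strikes == 2:
        account.features.discard("livestream")    # second strike: no livestreaming
    else:
        account.features.clear()                  # third strike: remove the account
        account.banned = True
    return account

# Usage: three flagged violations walk the account down the ladder.
acct = Account("example_user")
for _ in range(3):
    apply_strike(acct)
print(acct.banned)  # True
```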

CORNISH: Is it too early to give a report card, say, for Facebook, Twitter or Google in terms of how these strategies are working?

DONOVAN: I've been researching this for several years now, and it's only been in the last year that these corporations have acknowledged that there are significant problems, especially with white supremacist content, on their platforms.

What happened at the Unite the Right rally was a watershed moment in Internet history, as well as in American history, because it was very easy to see how much each platform had been leveraged to bring people out for that event. As a result, platforms knew very concretely the role their technology had played in organizing that hateful event, which eventually led to the deaths of two police officers and Heather Heyer.

CORNISH: Conservatives have argued that the strategies these tech companies are using mean that conservatives are disproportionately targeted. President Trump has complained directly to the head of Twitter about this. Is there any truth to that?

DONOVAN: So when we talk about conservatives, we're not always talking explicitly about white supremacists, but we don't actually know the extent to which these companies are targeting any specific subset of people. We know they're taking down millions of pieces of content over the course of a year, but we don't know whether there's any particular political bias in it.

What we do know is that the statistics show conservative media does very well on social media. Conservative media outlets are among the most-shared. So we have a particular subset of people complaining about a bias that we have no evidence exists.

CORNISH: In the meantime, what is the challenge for these social media companies going forward?

DONOVAN: These companies do need to come together on a set of rules, potentially enshrined in a kind of regulatory body, that ensures we get consistent content moderation across these platforms. What we know is that when a platform does moderate a group - especially when Twitter started to moderate white supremacists on its platform - those white supremacists moved to other platforms. Ultimately, these corporations have to think about not only what the content moderation strategy is, but also how to enforce it consistently across all of the platforms so that you don't get this blowback effect.

CORNISH: That's Joan Donovan, director of the Technology and Social Change Research Project at the Harvard Kennedy School's Shorenstein Center. Thank you for explaining it.

DONOVAN: Thank you.

(SOUNDBITE OF PINBACK'S "MICROTONIC WAVE")

Transcript provided by NPR, Copyright NPR.