Assessing Social Media's Growing Clout In The News Landscape

MICHEL MARTIN, HOST:

If you are on Facebook, as close to three-quarters of Americans are, according to the Pew Research Center, it probably wasn't news to you that a lot of users get news from the site. So that's why it was such a big deal this past week when Facebook was accused of editorial bias, specifically squashing conservative opinion in selecting stories for its trending topics section. Facebook denied that accusation, but also said it is investigating and will address anything that is, quote, "against our principles," unquote. All this prompted us to want to think a bit more deeply about what this means about news in the age of social media, so we've gathered a few people together who've been thinking about this. David Folkenflik is NPR's media correspondent, Issie Lapowsky is a staff writer at Wired magazine and Ethan Zuckerman is the director of the Center for Civic Media at MIT. Welcome to everybody. Thank you all so much for joining us.

DAVID FOLKENFLIK, BYLINE: Great to join you.

ETHAN ZUCKERMAN: Thanks for having us.

ISSIE LAPOWSKY: Thanks for having us.

MARTIN: So, David, I'm going to start with you just to give us the bottom line here. And as I mentioned - just based on what we know now, was there an effort to suppress certain opinions and to elevate others? And did that fall along ideological lines?

FOLKENFLIK: Well, there's a couple of things to unpack real quickly. If you think about Facebook, the magic of it is it says we are not actually a news organization. But we are a source for news because we're a social media platform through which so much information comes coursing to you through the newsfeed, based on a magic algorithm that helps determine what you and people you like are interested in. They have these trending topics, and that's where this week's controversy has really fallen, that the algorithm tells you stories that are happening in the world, developments you may not yet know about. Well, instead of it being simply an algorithm - that is, a computer formula - it turns out that journalists, often hired on a contract basis, had guidelines about which sources they were supposed to trust more than others and when they could determine a story was credible enough. But there's no sense that from above there came an edict that conservative news had to be suppressed. What does seem to have been alleged was that individual editors made their own idiosyncratic choices and they tended to suppress news that might have been of interest to conservatives and elevate news that might not have been a part of conservative discourse.

MARTIN: Interesting.

FOLKENFLIK: And that's where the distrust comes. And I think it's worth pointing out that for decades, the liberal media has been a rallying cry for conservatives. Well, this week you heard that erupt in full force about social media as well.

MARTIN: All right, let's talk a little bit more about that. Hold that thought for a minute. Issie, I want to go with you because other social media companies, like Twitter and Snapchat, curate content, too - like, Twitter has Moments and While You Were Away. So is curating a growing trend in social media, and why is that? What do the companies say they're trying to accomplish with this?

LAPOWSKY: Yeah, curating has definitely become a trend because these companies are news sources, especially for younger Americans and people around the world. In order to do that, you can't just trust the algorithm. The algorithm is going to spit out any number of viral news stories that might not necessarily be the important news of the day. So you have to have some sense of news judgment. The important thing that Twitter did when it started Moments was that it really announced this as a human-led initiative, that they were hiring a whole team who was going to take the best of Twitter...

MARTIN: ...But best is the question, right? What's best...

LAPOWSKY: Of course.

MARTIN: ...Is the question, right?

LAPOWSKY: Right, of course. But I think in telling people that this was going to be selected by a team of editors and by a team of journalists, they sort of covered their tracks there. Facebook didn't really do that with the trending topics. It led people to believe that this was neutrally surfaced information direct from its algorithms when in fact, that wasn't the case.

MARTIN: Ethan, can I just get your take on this?

ZUCKERMAN: Sure. I think what's going on is that we assume that trending topics on Facebook or locally trending on Twitter is simply automatic and that there's no politics, there's no decision-making that goes into it. There's really two levels of decision-making that goes into it. First, Facebook has, explicitly, an editorial team. But even when we just have an algorithm at work, there's still human judgment going into it. We like to believe that the machine takes care of all the work, but of course, those algorithms are written by human beings. They're making their own judgments. What I want to suggest is that not only is that a transparency problem, but we don't understand these algorithms very well, and these algorithms tend to have commercial biases. Those commercial biases tend to lead us towards news we want to share, news that makes us feel good, news that confirms our view of the world. There's a sense in which these algorithms may be doing things that aren't particularly good for us, though they are good for the bottom lines of these businesses. So I don't think it's just transparency. I think we even have to interrogate these algorithms and ask questions about what news they're prioritizing for us.

MARTIN: David?

FOLKENFLIK: Let me amplify a couple of points there. I do think that transparency is important. I think that part of the transparency is figuring out what Facebook is. Is it a useful site where we socialize? Is it an honest broker that is convening all points of view? You know, part of what Facebook appears to have been doing was in response to claims that the newsfeed was not surfacing news that was important. If you think back to 2014, that was a time when there were a lot of videos about the ice bucket challenge, and Black Lives Matter and the protests in Ferguson, Mo. were being swamped by all these millions of people across the country doing the ALS challenge. Facebook decided it needed a way to draw attention to developments and it needed to do so with a certain kind of behind-the-scenes journalistic sensibility. I think transparency is going to be important, and that is not - let me just be clear, that is not a value that Silicon Valley embraces.

MARTIN: Ethan, maybe you want to jump in on this next? Because I'm guessing that some people are listening to this conversation and are thinking, you people are - give me a break. I mean, who didn't know that people have editorial judgments and that they bring their own point of view to bear on this and that a lot of people think that the legacy media is so polarized - like, what's the difference? For some people, this is kind of precious, right?

ZUCKERMAN: So as soon as we bring human judgment into the mix, suddenly Facebook is playing something much closer to a journalistic role. It's taking responsibility for agenda-setting. It's taking that same responsibility that a front-page editor in a newspaper takes in saying here's the news that's important, that we have to pay attention to. I think people think these platforms are neutral, and the truth is they're very far from neutral.

MARTIN: Issie, I'm going to ask you to engage on this because you cover the tech industry. Do the people that you talk to - have they even entertained these questions that Ethan is raising here? I mean, do they see that they are stepping into a role that perhaps other people don't see them as playing?

LAPOWSKY: I think at this point, Facebook is very aware that it needs to take more responsibility. They started by publishing their editorial guidelines, just like any other news outlet does. So I think that was a big step. And I think it's important to note that despite the fact that there is editorial judgment, they are still working with substantially more data about the most important issues of the day or the week or the month or the year around the world than any other journalistic institution. So I think we have to give them a little bit of credit for that, that they have a certain bar for when a story can be part of trending topics. The fact that there's an editorial judgment exerted on it at the end - yes, we need to discuss that. Yes, they need to be more transparent about it. But they're still working with a huge, huge amount of information about what those issues should be.

MARTIN: David, give us a final thought here about where you think this is going...

FOLKENFLIK: ...Well, look, there's that famous saying - right? - that says freedom of the press is available to him or her who owns it. What I think is important is that we are starting to see a conversation over how we're going to consider these social media platforms. In a sense, they've been offered as public utilities. When you see the rhetoric that Mark Zuckerberg offers, he really wants this to be a convening place, a place where people can talk across ideologies, across borders. And if it's going to be a public utility, who gets to set the rules of the road? If people begin to think that there's a hidden hand rather than an honest broker determining what information they get, I think you're going to see more calls for regulation, even though that cuts against American notions of freedom of expression and freedom of speech, even for social media platforms.

MARTIN: You've all given us quite a bit to think about here. That's NPR media correspondent David Folkenflik. Also with us - Ethan Zuckerman of MIT's Center for Civic Media and Wired's Issie Lapowsky. Thank you all so much for speaking with us.

FOLKENFLIK: You bet.

LAPOWSKY: Thank you.

ZUCKERMAN: Thanks for having us.

Transcript provided by NPR, Copyright NPR.