
YouTube To Stop Promoting Videos That Spread Misinformation

AUDIE CORNISH, HOST:

Watch a video on YouTube, and you'll see a list of recommendations for what to watch next based on what you're watching at the moment and your search history. But it doesn't take much to go from a video that's fairly innocuous to one that promotes conspiracy theories.

That happens frequently enough that YouTube has come under pressure to change its algorithm. It says it will now promote fewer videos of what it calls borderline content. NPR's Andrew Limbong has more.

ANDREW LIMBONG, BYLINE: In a blog post, YouTube defines borderline content as things that, quote, "misinform users in harmful ways" but don't quite violate its community guidelines. The company specifically cites flat Earth conspiracies...

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #1: We do not believe that we're flying in space whatsoever. We don't believe the Earth moves at all.

LIMBONG: ...Phony miracle cures and 9/11 truth videos.

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #2: Western civilization is doomed unless we face the unanswered questions of 9/11.

LIMBONG: There are plenty of other misinformation videos on YouTube, from anti-vaccine rants to conspiracies of school shootings being faked.

(SOUNDBITE OF ARCHIVED RECORDING)

JOHN BOUCHELL: In my utterly qualified, expert opinion, there are several troubling facts being dispensed that I refuse to accept.

LIMBONG: These misinformation and conspiracy videos will all still exist on YouTube. They just won't be recommended to you. You'll have to look for them.

Google, which owns YouTube, declined to offer anyone up for an interview, but the company says it will, quote, "work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations."
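
The blog post doesn't detail how those evaluators' judgments feed into the model, but the workflow it describes, human raters labeling examples and a model learning from those labels, is ordinary supervised learning. Here is a minimal sketch in Python using scikit-learn; the sample titles, labels, and classifier choice are illustrative assumptions, not YouTube's actual system:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical human-evaluator labels: 1 = borderline, 0 = fine.
titles = [
    "proof the earth is flat and nasa lies",
    "miracle cure doctors don't want you to know",
    "how vaccines are tested for safety",
    "apollo 11 landing restored footage",
]
labels = [1, 1, 0, 0]

# Train a text classifier on the rater labels; a recommendation
# system could then demote videos the model flags as borderline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(titles, labels)

print(model.predict(["the moon landing never happened"]))  # likely [1]
```

In a setup like this, flagged videos aren't deleted; they're simply excluded or down-weighted when recommendations are generated, which matches the company's description of demoting rather than removing borderline content.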

Zeynep Tufekci is an associate professor at the University of North Carolina studying the social impacts of digital technology and artificial intelligence. She says the big problem with the YouTube recommendation machine is that it's designed to get you to spend as much time on the platform as possible so the company can sell more ads. The accuracy of the content doesn't matter.

ZEYNEP TUFEKCI: Just like a cafeteria, you're going to get people to eat more if you serve unhealthy food again and again and again before they even have a chance to finish their plate.
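
Tufekci's cafeteria analogy maps onto a simple objective: if the ranker's only goal is predicted watch time, accuracy never enters the sort. A toy sketch in Python, with hypothetical names throughout (the Video fields, scores, and rank_for_engagement function are illustrations, not YouTube's code):

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical engagement score
    is_accurate: bool               # known to us, invisible to the ranker

def rank_for_engagement(candidates: list[Video], k: int = 3) -> list[Video]:
    # The objective is watch time alone; is_accurate never enters the sort key.
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes,
                  reverse=True)[:k]

candidates = [
    Video("NASA moon landing footage", 8.0, True),
    Video("The moon landing was staged", 14.0, False),
    Video("How rockets work", 6.5, True),
]

for v in rank_for_engagement(candidates):
    print(v.title)  # the false-but-gripping video tops the list
```

Under an objective like this, the false video wins simply because it holds attention longer, which is the dynamic Tufekci describes.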

LIMBONG: And the effect, she adds, is that it's increasingly schoolchildren who turn to YouTube for information and get fed these types of videos. She wrote about the issue a while back in The New York Times.

TUFEKCI: And I got flooded with examples and comments. Like, parents would put their kid in front of YouTube with a video from NASA - right? - some very innocuous, interesting content, which YouTube is full of. And 45 minutes later, the kid would come back and say, Mom, the moon landing never happened.

LIMBONG: YouTube is rolling out these changes to its recommendation machine gradually in the United States first, affecting less than 1 percent of all YouTube content. But Tufekci says it's around the rest of the world - Brazil, Indonesia, Sri Lanka - where misinformation on YouTube truly has the power to destabilize societies.

Andrew Limbong, NPR News. Transcript provided by NPR, Copyright NPR.

Andrew Limbong is a reporter for NPR's Arts Desk, where he does pieces on anything remotely related to arts or culture, from streamers looking for mental health on Twitch to Britney Spears' fight over her conservatorship. He's also covered the near collapse of the live music industry during the coronavirus pandemic. He's the host of NPR's Book of the Day podcast and a frequent host on Life Kit.