'The Cleaners' Looks At Who Cleans Up The Internet's Toxic Content

Content moderators are responsible for determining what we see and what we don't on social media.
Courtesy of gebrueder beetz filmproduktion

Thousands of content moderators work around the clock to ensure that Facebook, YouTube, Google and other online platforms remain free of toxic content. That can include trolling, sexually explicit photos or videos, violent threats and more.

Those efforts — run by both humans and algorithms — have been hotly contested in recent years. In April, Mark Zuckerberg spoke before a congressional committee on how Facebook would work to cut down the prevalence of propaganda, hate speech and other harmful content on the platform.

"By the end of this year we're gonna have more than 20,000 people working on security and content review," Zuckerberg said.

The Cleaners, a documentary by filmmakers Hans Block and Moritz Riesewieck, seeks to get to the bottom of how, exactly, that work is done. The film follows five content moderators and uncovers what their jobs actually entail.

"I have seen hundreds of beheadings. Sometimes they're lucky that it's just a very sharp blade that's being used to them," one content moderator says in a clip from the film.

Block and Riesewieck explored more of the harsh realities that come with being a content moderator in an interview with All Things Considered.


Interview Highlights

On a Facebook content moderator's typical day

They see all these things which we don't want to see online, on social media. That could be terror, that could be beheading videos like the ones the voice was talking about before. It could be pornography, it can be sexual abuse, it could be necrophilia, on one hand.

And on the other hand it could be content which could be useful for political debates, or to make awareness about war crimes and so on. So they have to moderate thousands of pictures every day, and they need to be quick in order to reach the score for the day. ... It's sometimes up to so many pictures a day. And then they need to decide whether to delete it or to let it stay up.

On Facebook's decision to remove the Pulitzer Prize-winning "napalm girl" photo

This content moderator, he decides that he would rather delete it because it depicts a young, naked child. So he applies this rule against child nudity, which is strictly prohibited.

So it is always necessary to distinguish between so many different cases. ... There are so many gray areas which remain, in which the content moderators sometimes told us they have to decide by their gut feelings.

On the weight of distinguishing harmful content from news images or art

It's an overwhelming job — it's so complex to distinguish between all these different kinds of rules. ... These young Filipino workers there get three to five days of training, which is not enough to do a job like this.

On the impact of content moderators being exposed to toxic content daily

Many of the young people are highly traumatized because of the work.

The symptoms are very different. Sometimes people told us they are afraid to go into public places because they're reviewing terror attacks every day. Or they're afraid to have an intimate relationship with their boyfriend or girlfriend because they're seeing sexual abuse videos every day. So this is kind of the effect this work has. ...

Manila [capital of the Philippines] is the place where analog toxic waste from the Western world has been sent for years on container ships. And today the digital garbage is brought there. Now thousands of young content moderators in air-conditioned office towers are clicking through an infinite toxic sea of images and tons of intellectual junk.

Emily Kopp and Art Silverman edited and produced this story for broadcast. Cameron Jenkins produced this story for digital.

Copyright 2020 NPR. To see more, visit https://www.npr.org.

Ari Shapiro has been one of the hosts of All Things Considered, NPR's award-winning afternoon newsmagazine, since 2015. During his first two years on the program, listenership to All Things Considered grew at an unprecedented rate, with more people tuning in during a typical quarter-hour than any other program on the radio.