Technologies To Create Fake Audio And Video Are Quickly Evolving

MARY LOUISE KELLY, HOST:

Amid all the talk of fake news, the technologies to create fake audio and video are quickly evolving. NPR's Tim Mak has been looking into this, and he brings us this report of how these technologies could impact our politics.

TIM MAK, BYLINE: This is not a real audio clip of President Trump.

(SOUNDBITE OF ARCHIVED RECORDING)

COMPUTER-GENERATED VOICE: South Korea's finding, as I have told them, that their talk of appeasement with North Korea will not work. They only understand one thing.

MAK: Trump did write that on Twitter, but he never said it aloud. A Montreal startup called Lyrebird has released a product that allows users to create an audio clip of anyone saying anything. Here's the company using a fake clip of President Obama to market its technology.

(SOUNDBITE OF ARCHIVED RECORDING)

COMPUTER-GENERATED VOICE: They want to use this technology to change the life of everyone that lost their voice to a disease by helping them recover this part of their identities. Let's help them achieve this goal.

MAK: Again, Obama never actually said that. These technologies analyze the limited number of distinct sounds in the human voice and, through a process called machine learning, learn to imitate them.

YOSHUA BENGIO: We can record a few minutes of somebody's voice and then be able to generate speech of that person speaking, saying things that have been typed in the computer.

MAK: Professor Yoshua Bengio is an adviser to Lyrebird. He touts such positive uses as restoring voices to those who have lost them to illness.

BENGIO: I think it's better if companies, which work to, you know, try to do it in a way that's going to be beneficial for society, actually build those products and try to put as much as possible of the safeguards that I think are necessary and also raise the awareness rather than doing these things in secret.
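
The pipeline Bengio describes - a few minutes of reference audio condensed into a compact voice profile that a generative model then uses to speak arbitrary typed text - can be sketched in rough outline. The Python below is an illustrative stand-in, not Lyrebird's actual system: the feature extraction is deliberately simplified, and the synthesis step is a placeholder where a trained neural model would go.

```python
# Illustrative sketch of a voice-cloning pipeline (not Lyrebird's method):
# turn reference recordings into a compact "voice profile", then condition
# a synthesizer on that profile plus new text. All names here
# (voice_profile, synthesize) are hypothetical placeholders.
import numpy as np

SAMPLE_RATE = 16_000
FRAME = 512   # samples per analysis frame
HOP = 256     # hop between frames

def spectrogram(audio: np.ndarray) -> np.ndarray:
    """Short-time magnitude spectrogram: rows are frames, columns are frequency bins."""
    frames = [audio[i:i + FRAME] * np.hanning(FRAME)
              for i in range(0, len(audio) - FRAME, HOP)]
    return np.abs(np.fft.rfft(np.stack(frames), axis=1))

def voice_profile(reference_clips: list[np.ndarray]) -> np.ndarray:
    """Average spectral statistics over a few minutes of someone's speech.
    Real systems learn this embedding with a neural network; averaging is a stand-in."""
    specs = [spectrogram(clip) for clip in reference_clips]
    return np.concatenate(specs).mean(axis=0)

def synthesize(text: str, profile: np.ndarray) -> np.ndarray:
    """Placeholder for the generative step: a trained model would map
    (text, voice profile) to a waveform. Here we just return silence."""
    duration_s = 0.06 * len(text)   # rough speaking-rate guess
    return np.zeros(int(SAMPLE_RATE * duration_s))

# A few seconds of random noise stands in for real reference recordings.
clips = [np.random.randn(SAMPLE_RATE * 5) for _ in range(3)]
profile = voice_profile(clips)
audio = synthesize("They only understand one thing.", profile)
print(profile.shape, audio.shape)
```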

MAK: And so how much does this matter?

HANY FARID: I don't think it's an overstatement to say that it is a potential threat to democracy.

MAK: Hany Farid is the chair of computer science at Dartmouth College. An example that illustrates Farid's concern comes from the 2008 election, when rumors circulated that there was a tape of Michelle Obama using a derogatory term for white people. There's no evidence such a tape ever existed, but with these technologies, a convincing fake could be made.

The technology can also give public figures a way to dismiss real audio as a forgery. Farid recalls the "Access Hollywood" tape from the 2016 campaign.

FARID: Eighteen months ago, when that audio recording of President Trump came out on the bus, if that was today, you can guarantee it, he would've said it's fake. And he would've had some reasonable credibility in saying that as well 'cause there was no video associated with it.

MAK: The threat of falsified audio, video and photos is a national security issue that has drawn the interest of the Defense Advanced Research Projects Agency, or DARPA, part of the Department of Defense.

David Doermann runs the media forensics program at DARPA. His disaster scenario is a mass misinformation campaign - creating an event that never even occurred.

DAVID DOERMANN: And that might lead to political unrest or riots or, at worst, some nations acting all based on this bad information.

MAK: Doermann's team is putting together a platform that automatically determines whether images, video or audio has been manipulated. Here's Mark Kozak, an engineer who works for PAR Government Systems. He helps create falsified audio that Doermann and his team then use to develop their platform.

MARK KOZAK: It wasn't that long ago that you could easily assume that if you have photographic evidence of something, that can be used as evidence and no one's going to question it. I think people have to learn to be questioning everything that you hear and see.
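
Doermann's platform is not public, but one of the simpler cues an automated forensic tool might check can be sketched in a few lines: abrupt jumps in frame-to-frame energy, which can betray a crude audio splice. The Python below is a hypothetical illustration of that single cue; real forensic systems combine many signals, such as noise-floor consistency and compression traces.

```python
# Illustrative sketch of one simple forensic cue (not DARPA's platform):
# flag abrupt jumps in per-frame energy as possible audio splice points.
import numpy as np

SAMPLE_RATE = 16_000
FRAME = 1024

def frame_energy_db(audio: np.ndarray) -> np.ndarray:
    """Per-frame energy in decibels."""
    n = len(audio) // FRAME
    frames = audio[: n * FRAME].reshape(n, FRAME)
    energy = np.mean(frames ** 2, axis=1) + 1e-12
    return 10 * np.log10(energy)

def suspicious_jumps(audio: np.ndarray, threshold_db: float = 20.0) -> np.ndarray:
    """Frame indices where energy jumps sharply -- a possible splice.
    Real forensics would weigh this against many other cues."""
    energies = frame_energy_db(audio)
    return np.where(np.abs(np.diff(energies)) > threshold_db)[0]

# Build a toy "spliced" clip: quiet room tone with a loud segment dropped in.
rng = np.random.default_rng(0)
quiet = 0.01 * rng.standard_normal(SAMPLE_RATE * 2)
loud = 0.5 * rng.standard_normal(SAMPLE_RATE)
clip = np.concatenate([quiet, loud, quiet])
print("possible splice points at frames:", suspicious_jumps(clip))
```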

MAK: And we may be headed for an endless back-and-forth between those who create fake media and those who want to catch them. Tim Mak, NPR News, Washington.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

Tim Mak is NPR's Washington Investigative Correspondent, focused on political enterprise journalism.