© 2024 Texas Public Radio

UT Professor Talks Propaganda, Election Interference

Photo by Soumil Kumar from Pexels. CC0: https://bit.ly/3gugBJn

Advances in technology have allowed for an ever-increasing barrage of propaganda and fake information. Although social media companies have made an attempt to tamp down falsehoods, the problem is only getting worse.

University of Texas Professor Dr. Samuel Woolley is the author of the new book "The Reality Game: How The Next Wave Of Technology Will Break The Truth." TPR's Jerry Clayton spoke with Dr. Woolley to talk about the future of propaganda and disinformation.

Highlights of the interview with Dr. Samuel Woolley

On the 2016 election interference
I think we've always known that propaganda and disinformation exist. It's just that we hadn't really thought a whole lot as a society about the ways in which they've evolved. You know, society has gotten really used to this notion of propaganda being something that comes from the top down, from states to citizens over broadcast media via TV or newspapers and things like that. There hadn't been a whole lot of thinking, broadly speaking, about the ways in which propaganda had transferred or translated to social media. And so I think we were a little blindsided in 2016, because we realized really quickly that social media tools had been co-opted by powerful states and governments and intelligence agencies to try to manipulate public opinion, but also that there had been kind of an opening up of who could do propaganda and how.

On foreign involvement in U.S. elections
There's kind of a big three when it comes to thinking about the countries that are involved in trying to manipulate public opinion in the United States during elections using digital tools and other means. Those big three are Iran, China and Russia. One of the trickiest things about online propaganda is that it's hard to figure out who is spreading it and how they're spreading it. So detecting provenance is a challenge... Almost every time we've looked into it, we've been able to find some evidence of different governments using this stuff. I used to be over at the University of Oxford at the Computational Propaganda Project, and at last count, our analysis there of the countries that were doing this was up to around 80 different countries and governments that were making use of these kinds of tools and tactics.

Who is involved in computational propaganda
There are PR firms, what they call dark digital PR firms, and other manipulative actors like them, like marketing firms, that are kind of leading the charge in this kind of manipulation. Most often those kinds of groups are working as contractors or consultants for political groups, whether it's campaigns, politicians or militaries. And so they are sort of the innovators in the space, but there are a lot of other people who are working to manipulate public opinion using social media. Extremist political groups do this stuff as well. ISIS and other terror groups have done this for quite a long time, and you've seen right-wing extremists in the United States really begin to use these kinds of tools to try to spread their ideology. But suffice to say that computational propaganda, as we call it, which is the use of automation and algorithms in the manipulation of public opinion, is a tool that could be used by almost any group now.

On what's being done by social media to combat disinformation
In the last year or two, firms, especially Facebook and Twitter, have taken a very proactive approach to trying to combat propaganda on their platforms. Up until 2016, the firms largely ignored this stuff, and that was a really big problem. My research team and other researchers who had discovered that disinformation and propaganda were flowing on these sites had let them know well prior to 2016 that this was happening. But the social media companies just didn't really take it seriously; they almost willfully ignored it, I would say. Now they've taken some time to think through what it takes to get rid of the most problematic forms of disinformation, political harassment and other forms of manipulation there. There's an elephant in the room here, which is that in the United States we still don't have any real meaningful federal government regulation to stop the flow of really harmful disinformation or manipulation on these sites. And that's got to happen, because right now the social media companies are making these moves on their own. But who's to say that the next social media company that comes along, or that's already coming to prominence, aka Parler, won't just completely disregard these kinds of fail-safes and checks?

On whether the U.S. Government would use this type of propaganda on its own citizens
It's been difficult to track or ascertain whether or not the government itself, people within the U.S. government, have actually used this against the American public. But what we can say for certain is that political campaigns and certain politicians around the United States have certainly made use of the tactics of computational propaganda and digital disinformation. It's also true that political groups in the United States have already used this against the U.S. citizenry. And so that brings us to a larger point, which is that we don't really have a whole lot of law that prevents social media from being turned around and weaponized against the citizenry to try to misinform them. And we need more laws to do that.

Dr. Samuel Woolley is an assistant professor in the School of Journalism at the Moody College of Communication at the University of Texas-Austin. He is a Knight Faculty Fellow and the Program Director of propaganda research at the Center for Media Engagement (CME) at UT.

TPR was founded by and is supported by our community. If you value our commitment to the highest standards of responsible journalism and are able to do so, please consider making your gift of support today.

Jerry Clayton can be reached at jerry@tpr.org or on Twitter at @jerryclayton.