Big-Data Policing Threatens To Compromise Civil Liberties, Further Social Inequalities
Law enforcement agencies are increasingly able to follow people's digital trails in the name of public safety, using advanced technologies to identify and investigate potential criminal activity. But at what cost?
Proponents of big-data policing say it has the potential to reduce bias and improve efficiency, but research indicates that data-intensive surveillance and predictive policing can widen the scope of the criminal justice system, imperil privacy and civil liberties, undermine public trust and further entrench existing social inequalities.
Police across the country are already under heavy scrutiny for tactics that disproportionately affect people of color. How does law enforcement's use of big data and algorithms further affect the citizens officers are sworn to serve and protect?
In what ways do police, data brokers and technology companies collaborate to monitor individuals in the purported pursuit of justice?
Are the problems with big-data policing inherent to the technology itself, or do they stem from how law enforcement agencies choose to deploy it? Can algorithms be objective, or do they conceal existing biases?
What are some possible reforms to maximize public safety without infringing on individual civil liberties?
- Sarah Brayne, Ph.D., assistant professor of sociology at the University of Texas at Austin and author of "Predict and Surveil: Data, Discretion, and the Future of Policing"
- Nicol Turner Lee, Ph.D., senior fellow in Governance Studies, director of the Center for Technology Innovation and co-editor-in-chief of TechTank at The Brookings Institution
"The Source" is a live call-in program airing Mondays through Thursdays from 12-1 p.m. Leave a message before the program at (210) 615-8982. During the live show, call 833-877-8255, email email@example.com or tweet @TPRSource.
*This interview was recorded on Monday, November 30.