
Experts Call For CSI Reform At San Antonio Forensics Event

Paul Flahive
Texas Public Radio
FoxFury Lighting demonstrates its crime scene kits at an International Association for Identification event.

Updated Aug. 8


While 1,400 crime scene investigators, fingerprint examiners, and forensic pathologists learned about the latest in forensic technology last week in San Antonio at the International Association for Identification's annual International Educational Conference, questions remain about the current science that can lead to convictions.

A 2009 report from the National Academy of Sciences hammered the forensic science community, saying many of its assumptions and processes lacked sufficient scientific support. Other highly critical reports followed.

“There’s a danger that if we aren’t successful, there will be more reports of falsely convicted individuals and guilty perpetrators are allowed to go free because the quality of the forensic science was not convincing to a judge or jury,” said Mark Stolorow, a 40-year veteran of forensic science who now works for the National Institute of Standards and Technology, which started establishing higher standards in 2014.

He said over the last 100 years, policemen were asked to explain things like bullet and fingerprint matches. Officers with science backgrounds developed methods to establish those matches they thought were trustworthy, “and as long as the judge accepted their testimony as expert testimony, rather than going through rigorous peer-reviewed scientific development, they were effectively utilizing techniques they were testing by trial and error,” he added.

Stolorow helped NIST set up the Organization of Scientific Area Committees for Forensic Science to help change that.

Five hundred and fifty practitioners, representing 300 of the nation’s more than 400 crime labs, are creating experiment-based standards for everything from bite marks to trace fabric analysis. In four years, they have developed 11 standards, with 200 more in development.

“It’s not fast,” Stolorow said. “It may be 20, 30 years before all the research — out of the 103 research gaps that we’ve identified and posted online on the OSAC website — can actually be finished.”

Meanwhile, some practitioners aren’t waiting for OSAC to create evidence-based solutions.

“We just woke up one day and we just got fed up,” said Henry Swofford, chief of latent prints for the Department of Defense’s Defense Forensic Science Center. “Why has nothing changed in the past 30 years? Why do we still not have a tool?”

He and his team set out to make that tool, and FRstat is the result. It’s software that gives fingerprint examiners a statistical analysis of their match.

Credit: Contributed Photo / National Institute of Standards and Technology
FRstat is software that gives fingerprint examiners a statistical analysis of their match.

Based on an algorithm and a database of 2,000 known matches, it lets them walk into a court and testify that a print is 10 times or 90,000 times more likely to be a match than not a match.

Swofford said it took three years to develop the program. They started using it last year and published a peer-reviewed article in April.

Jessica LeCroy implemented the process for the DOD, and it’s been used in 400 cases so far.

“There is a shock value of ‘I thought my evidence was stronger than the result I’m getting,’ ” LeCroy said.

But ultimately, it’s resulting in better fingerprint analysis, she said.

Anthony Koertner, a fingerprint examiner for the DOD, helped develop FRstat and said it’s helping reform the culture. When he was training 10 years ago, the culture was very “fire and brimstone,” he said.

“This is how it is in latent prints: You cannot make a mistake. You have zero error rate,” he said. “We hated all that stuff. We didn’t believe it.”

Credit: Henry Swofford / Contributed Photo
FRstat is software that gives fingerprint examiners a statistical analysis of their match.

Fear of losing your job over a mistake stifled self-reporting, which is dangerous because juries still accept fingerprint testimony largely without question, Swofford said.

“If they’re almost too trusting,” he said, “is it our obligation to clarify the limitations of how much weight these juries should place on the reliability of the evidence?”

Swofford said it was, and now with FRstat, they can tell them with a greater degree of certainty.

Thirty-five federal, state and local crime labs are evaluating the software for possible use.

Meanwhile, OSAC’s higher standards are starting to gain traction, Stolorow said.

“We’re doing everything we can to push on every door, to try and gain influence and gain a partnership with as many stakeholder groups as possible in order to reach the tipping point,” he said.

Two state laboratories have adopted the higher standards, but more importantly, courtrooms are starting to use them.

That’s a big deal, Stolorow said, but added that for others, he can see how progress might be “like watching the grass grow.”

Paul Flahive can be reached at paul@tpr.org or on Twitter @paulflahive.

CORRECTION: NIST was incorrectly identified in the story. It is the National Institute of Standards and Technology.
