A funding crunch for scientific research is creating incentives for scientists to cut corners and even occasionally to cheat.
This is one of the findings in a new report about scientific integrity from the National Academies of Sciences, Engineering, and Medicine.
Sometimes scientists adopt sloppy practices that can lead to false conclusions. This can hamper progress in science. And taxpayer dollars are on the line.
Consider the story of a genetics lab at the University of Wisconsin. Mary Allen was a graduate student in that lab in 2005. One postdoctoral researcher had been laid off because of a funding shortage, and the professor in charge of the lab was scrambling to keep the laboratory afloat by seeking more grants.
But Allen and her five fellow graduate students noticed that a grant document didn't accurately describe work that had previously been done in the lab.
"We weren't certain it was falsification," Allen says. "It could have been a mistake. The results sounded slightly better than they really were."
Allen and her fellow graduate students faced a difficult decision.
"If it's really falsification, we may not have the ability to keep going in grad school, or they may ask us to go to another lab and start new projects," she says. "Either way, that would be a huge hit to everybody's career."
The graduate students decided to talk to the department chairman about the issues they'd found. Ultimately the professor quit, and later pleaded guilty to scientific misconduct. This story was first reported by Science magazine in 2010.
The students in the lab liked their professor and thought she was doing good work. So why did the professor tinker with the grant report in the first place?
"I think one of the reasons she did it was she was under so much stress about getting funding for the students," Allen says. So, "she decided tweaking the data a little to make it look better would allow her to get a grant and therefore fund us."
The former professor did not respond to NPR's request for comment.
Stories of outright misconduct like this are rare in science. But the pressures on scientists show up in many subtler ways.
If people are already working as hard and as smart as they can, they may look for other ways to gain an edge in their careers, says social scientist Brian Martinson at the HealthPartners Institute in Minneapolis.
"Some proportion of people might find themselves making bad decisions and cutting corners," he says.
Martinson has surveyed university scientists and asked them about behaviors that he calls undesirable. These can include sloppy data handling, failing to safeguard patient privacy and bending other rules.
"Almost half of the scientists who responded to our survey said they had engaged in at least one of those activities in the prior three years," Martinson says. And many said they had violated multiple standards.
"When you get people engaging in that many kinds of consistent, undesirable practices, this can certainly undermine the quality of the work," he says, "and therefore the ability to reproduce it."
Many studies that get published in the biomedical literature can't be reproduced in other labs. This slows progress in medical research, because scientists spend a lot of time chasing down false leads. That hampers efforts to understand disease and develop treatments.
Some of this is unavoidable, simply because scientists are exploring the edges of knowledge. But there's plenty of room for improvement.
"If you've got people who are cutting corners, if you've got people who are doing things to undermine the quality of research, you've got to ask why," Martinson says.
Sometimes scientists simply don't know better. Occasionally scientists deliberately cheat. But often these behaviors are driven by bad incentives in the system.
"I think what we're really talking about here is human nature," says C.K. Gunsalus, director of the National Center for Professional and Research Ethics at the University of Illinois. She and Martinson both served on the National Academies' committee on research integrity.
"If you're in an environment that has very high stakes and very low chance of success, those are two of the predictors of environments in which people are going to cheat," Gunsalus says.
That's exactly the environment where many scientists find themselves today. There are strong career incentives to bend the rules, by exaggerating accomplishments in a grant proposal, for example.
"One of the rules in life is if you reward bad conduct you're going to get a lot more bad conduct," Gunsalus says, "because even people with Ph.D.s can figure out what you're rewarding and say, 'Ooh! If that's what it takes to get ahead, I can do that.' "
But if scientists see everyone else playing by the rules, they are more likely to do so as well. That's why Gunsalus, who swoops into troubled academic departments to fix dysfunction, looks to see whether the leaders are setting a good example. If they are, others are likely to follow.
"People do fundamentally care about the rigor and integrity of research because that's how progress happens," she says. "I mean, you can't scam the facts or nature, right?"
And in addition to scientific progress and tax dollars, careers are at stake here.
Mary Allen says only three of the six grad students in her uprooted lab ended up getting Ph.D.s, despite the many years all of them had put in. It took her 8 1/2 years to complete hers.
Allen recently got a job as a research assistant professor at the University of Colorado, Boulder. So now she finds herself in the same position as her former professor — on the quest for scarce grant funding.
In 2005, when the misconduct took place, "it was the worst funding NIH had seen," Allen says, "and we've only seen it go downhill. So it's even worse than it was before."