
Solutions to scientific fraud: Should misconduct in research be illegal?

You've probably never heard of cardiologist Don Poldermans, but experts who study scientific misconduct believe thousands of people could die because of him.

Poldermans was a prolific medical researcher at the Erasmus Medical Center in the Netherlands, where he analyzed standards of care for cardiac surgery and published a number of authoritative studies from 1999 to the early 2010s.

One of the key questions he investigated was: Should patients be given a blood pressure-lowering beta-blocker before certain heart operations? Poldermans' research found the answer was yes. European medical guidelines (and to a lesser extent US guidelines) recommend this accordingly.

The problem? Poldermans' data was allegedly falsified. A 2012 investigation by Erasmus Medical Center, his employer, into the misconduct allegations found that he “used patient data without written permission, used fictitious data and … submitted to conferences [reports] which knowingly contained unreliable data.” Poldermans acknowledged the findings and apologized, but stressed that the use of fictitious data was unintentional.


Following these revelations, a new meta-analysis was published in 2014 on whether beta-blockers should be used before heart surgery. It found that a course of beta-blockers increased the likelihood of a patient dying within 30 days of heart surgery by 27 percent. In other words, the practice Poldermans' falsified data supported, and that European guidelines adopted on the strength of his research, actually made it substantially more likely that patients would die after heart surgery.

During the years from 2009 to 2013, when these misguided guidelines were in place, tens of millions of heart surgeries were performed in the United States and Europe. A provocative analysis by cardiologists Graham Cole and Darrel Francis estimated that the guidelines caused 800,000 excess deaths compared to what would have occurred had best practices been adopted five years earlier. That exact figure is hotly contested, but a 27 percent increase in mortality, sustained for years across a common procedure, can add up to an extraordinary death toll.
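To make the arithmetic behind that claim concrete, here is a rough back-of-the-envelope sketch. The surgery count and baseline mortality rate below are purely hypothetical assumptions for illustration; the only figure taken from the reporting above is the 27 percent relative increase.

```python
# Back-of-the-envelope sketch. The surgery count and baseline mortality are
# hypothetical assumptions, NOT figures from the Cole and Francis analysis.
surgeries = 10_000_000        # assumed number of surgeries performed under the guideline
baseline_mortality = 0.01     # assumed baseline 30-day mortality without beta-blockers
relative_increase = 0.27      # the 27 percent increase reported by the 2014 meta-analysis

excess_deaths = surgeries * baseline_mortality * relative_increase
print(f"Excess deaths under these assumptions: {excess_deaths:,.0f}")  # -> 27,000
```

Even with a deliberately modest assumed baseline, the excess deaths run into the tens of thousands, which is why the contested headline estimate is plausible in order of magnitude.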

I learned about Poldermans' case when I contacted researchers who work on scientific misconduct and asked them a provocative question: Should scientific fraud be prosecuted?

Unfortunately, fraud and misconduct in the scientific community are not nearly as rare as one might think. We also know that the consequences of getting caught are often underwhelming. It can take years to get a bad paper retracted, even when the flaws are obvious. Sometimes scientists accused of falsifying their data file frivolous lawsuits against their colleagues who bring it to their attention, silencing anyone who speaks out about bad data. And we know that this behavior has high stakes and can dramatically affect treatment options for patients.

In cases where scientific dishonesty literally costs human lives, shouldn't the criminal justice system get involved?

The question of whether research fraud should be a crime

In some cases it can be difficult to distinguish scientific misconduct from negligence.

If a researcher fails to apply the appropriate statistical correction for testing multiple hypotheses, they will likely obtain some false positive results. In some cases, researchers are strongly incentivized toward this kind of negligence by an academic culture that values non-null results above all else (that is, researchers are rewarded for finding an effect even when the methodology is flawed, while sound research that finds no effect often goes unpublished).
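As a minimal sketch of why skipping that correction produces false results, consider the following simulation. It runs many comparisons between groups drawn from the same distribution, so any "significant" finding is a false positive by construction; the number of tests, sample sizes, and the Bonferroni correction used here are illustrative choices, not drawn from any study discussed in this piece.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_hypotheses = 20   # illustrative: 20 independent comparisons with no real effect
n_per_group = 50
alpha = 0.05

# Both groups come from the SAME distribution, so every "significant"
# result below is a false positive by construction.
p_values = []
for _ in range(n_hypotheses):
    a = rng.normal(0.0, 1.0, n_per_group)
    b = rng.normal(0.0, 1.0, n_per_group)
    p_values.append(stats.ttest_ind(a, b).pvalue)

uncorrected = sum(p < alpha for p in p_values)
bonferroni = sum(p < alpha / n_hypotheses for p in p_values)  # Bonferroni-adjusted threshold

print(f"False positives without correction: {uncorrected}")  # often 1 or more
print(f"False positives with Bonferroni:    {bonferroni}")   # usually 0
```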

However, I would argue that criminalizing such behavior is not a good idea. It would have a serious chilling effect on research and would likely make the scientific process slower and more legalistic, which would itself lead to deaths that freer, faster science could have prevented.

The debate about whether research fraud should be criminalized therefore tends to focus on the most clear-cut cases: the deliberate falsification of data. Elisabeth Bik, a scientist who studies fraud, made a name for herself by showing that images of experimental results in many medical journal articles had been clearly manipulated. That is not the kind of error that happens innocently, and it offers a rough baseline for how often manipulated data makes it into print.

While scientific fraud is technically covered by existing laws (those that prohibit lying in grant applications, for example), in practice it is almost never prosecuted. Poldermans eventually lost his job in 2011, but most of his papers were never retracted, and he faced no further consequences.

But in response to growing awareness of the prevalence and harm of fraud, some scientists and science fraud watchdog organizations have proposed changing this. A new law narrowly focused on scientific falsification could more clearly draw the line between negligence and fraud.

The question is whether legal consequences would actually help with our fraud problem. I asked Bik what she thought about proposals to criminalize the misconduct she is investigating.

Her response was that while it's not clear whether criminalization is the right approach, people should understand that there are currently almost no consequences for wrongdoers. “It's maddening to see people cheat,” she told me. “And even when it comes to NIH grants, the penalties are very low. Even for people caught cheating, the penalty is very lenient. They're not eligible to apply for new grants for the next year, or sometimes the next three years. It's very rare that people lose their jobs because of this.”

Why is that? Basically, it's a problem of incentives. Institutions are embarrassed when one of their researchers commits misconduct, so they'd rather impose a quiet, lenient punishment and stop investigating. There's little incentive for anyone to get to the bottom of the misconduct. “If the worst consequence for speeding was a cop saying 'Don't do that again,' everyone would speed,” Bik told me. “That's the situation we have in academia. Do whatever you want. If you get caught, it takes years for the investigation to start.”

In some ways, a criminal statute is not the ideal solution. Courts, too, take years to deliver justice in complex cases. They are not well suited to answering detailed scientific questions, and they would almost certainly rely on scientific institutions to carry out the actual investigations; so what really matters is those institutions, not whether they answer to a court, a nonprofit, or the NIH.

But in sufficiently serious cases of misconduct, it seems to me that there is great value in having an institution outside academia work to get to the bottom of things. A well-designed law allowing the prosecution of scientific fraud could counteract the overwhelming incentives to let misconduct slide and quietly continue.

If investigations were handled by an external authority (such as a prosecutor's office), it would no longer be so easy for institutions to protect their reputations by sweeping fraud cases under the rug. But that external authority would not necessarily have to be a prosecutor's office; an independent scientific review board would probably suffice, Bik said.

Ultimately, law enforcement is a blunt tool. It might help create accountability where no one else has an incentive to provide it, and in cases of misconduct that have claimed thousands of lives, I think prosecution would be a matter of justice. But it is neither the only nor necessarily the best way to solve our fraud problem.

So far, though, efforts to build institutions within academia to police misconduct have had only limited success. At this point, I would count it as progress if external institutions were also entrusted with that job.