Intentional or Accidental, Bad Data Can Kill Trials and Careers

Two kinds of data integrity problems cast doubt on research results and erode trust in scientific inquiry: questionable research practices and outright misconduct, a data integrity expert says. And while intentional misdeeds can result in regulatory, or even criminal, action, both categories can cause irreparable damage to clinical trials, companies and individuals.


Most data integrity problems in research fall into the questionable practices category and are not due to intentional acts, said Donna Kessler, research integrity officer at Duke University, during a recent CenterWatch webinar. They result from professionals who take shortcuts, rush research or just make bad decisions.

Questionable research practices (QRPs) are generally understood across the research community and include failure to properly cite sources and methods, sloppy or incomplete documentation of results and, most commonly, failure to retain data and results.

Intentional misconduct, however, is defined by regulators. Researchers who fabricate data, falsify existing data or plagiarize the work of others can be subject to legal penalties and/or debarment from participating in government-funded research.

Misconduct and questionable practices across the spectrum are more prevalent than some think, Kessler said, citing research that found 2 percent to 3 percent of researchers admit to fabricating or falsifying data, 14 percent have observed a colleague falsifying or fabricating data, and 33.7 percent admit to using QRPs. More than 70 percent observed others committing QRPs, Kessler added.

Research misconduct can be a result of bad judgment, lack of training or poor oversight, she said. In other cases, misconduct stems from competition for funding and recognition, or a sense of entitlement.

Researcher Eric T. Poehlman, for example, served a year in prison and was barred from receiving any additional federal grants after he falsified data in 17 NIH grant applications and 10 of his papers. He later said he was motivated by his desire to advance as a respected scientist and felt his area of study was important enough to justify what he considered “minor” misconduct.

The repercussions of QRPs and misconduct are many, Kessler said. They not only skew results and erode public trust, they can also cause serious clinical harm.

In one of the most notorious and harmful cases of research misconduct, British surgeon and medical researcher Andrew Wakefield published a paper in 1998 claiming a link between childhood vaccines and autism. Not only did he falsify his data, but his sample size was only 12 children and he did not have ethics board approval for the study.

Wakefield was ultimately barred from practicing medicine and the study was retracted, but the ripple effects and the debate over childhood vaccines continue. Vaccination rates dropped globally, due in part to intense press coverage of the Wakefield study. Further studies have shown there is no link between vaccines and autism.

The Wakefield and Poehlman cases, among others, show the potential long-term effects of misconduct on public health and the harm it can do to the credibility of researchers and scientific knowledge, Kessler said.

Not all data integrity issues are covered in federal regulations, and Kessler stressed that researchers and institutions must go further to protect the reliability and validity of research results. “The regulations are really the floor; they’re the starting point. To have integrity in research, you have to go beyond the regulations,” she said.

You can’t really prevent wrongdoing if a researcher is bent on misconduct, Kessler said. But researchers and institutions can step in when problems are caused by lack of training, sloppy methods or bad judgment, especially if the problems are caught early in the research process.

Institutions also can help deter intentional misconduct by providing integrity education and training, reviewing and improving research and data practices, and encouraging the reporting and investigation of possible misconduct.

To encourage reporting, Kessler said, make it easy with anonymous or online reporting options, stand firm against retaliation and take credible allegations seriously. Organizations also can give mentors and trainees tools such as plagiarism detection software.

“Raising awareness of these types of wrongdoing is helpful, but you have to have some more active programs and changes in order to create a culture and a climate where integrity is at the forefront,” Kessler said.

To listen to the webinar, go here: https://bit.ly/2V2m8uR.
