We’ve all heard the stories of deliberate data fraud in clinical trials: the investigator who created fictional subjects, the lab technician who duplicated test data, the coordinator who altered records. What makes people think they can get away with it?
In most cases, data errors are the result of ignorance of procedures or sloppiness in practices that can be corrected with improved training and oversight. But left unattended, an error-tolerant environment can open the door to more serious problems.
Two experts shared their personal horror stories with attendees at the MAGI Clinical Research Conference West last week, urging sites to catch problems while they are still minor, before they snowball into flagrant research misconduct.
“These things start out small, but they can grow,” said Paula Brust, a quality assurance auditor at QA Partners. “They can easily become something more significant.”
“Deal with things when they start going wrong,” agreed Sara Meeder, director of human research participant protections at Maimonides Medical Center. “Otherwise, you will be in a terrible mess.”
The evolution of misconduct goes through three stages, Brust said. In the “innocent ignorance” stage, staff may make mistakes like backdating consent signatures or discarding source documents after transcription. The “surprising sloppiness” stage features problems like forgetting to obtain consent, estimating data and taking shortcuts.
If these behaviors aren’t nipped in the bud, they become common practice and may tempt some individuals to take advantage, ushering in the “malicious malfeasance” stage, in which legitimately obtained data may be changed, undesirable data may be omitted, or data may be fabricated outright.
Fraud often is not detectable without checking multiple sources for the data, particularly sources for pathology reports and endoscopy procedures, Brust said.
Brust told the story of a 45-site study for which she was project manager. The coordinator at one site was found to be manipulating and fabricating data, with the investigator signing off on it. The errors started out simply, Brust said: monitoring visits cancelled at the last minute, patients refusing to schedule lab tests and histology tests not performed per the protocol.
This seeming lack of attention to detail quickly turned to misconduct when the coordinator took a biopsy sample from one subject and split it to create an additional record for a fictional subject. Ultimately, it was discovered that four of the site’s seven patients did not actually exist.
But even less elaborate misconduct can damage a trial’s integrity, such as changing entry data to qualify subjects, stretching entrance criteria or coaching patients on their diary entries.
All the warning signs will be there if the trial coordinator or site manager knows where to look. Brust cited the following red flags:
- Missing documents;
- Incomplete case report forms;
- Frequent appointment changes;
- Notes entered out of chronological order;
- Informed consent documents with similar patient signatures;
- Photocopied source documents with no original;
- Data that is too consistent from patient to patient;
- No mistakes made or corrected;
- Drug containers returned in pristine condition; and
- Perfect protocol compliance.
Brust advised monitors to be vigilant for these practices and to watch for attempts by the PI or trial staff to limit access to study charts and records.
Difficulties with principal investigators also can signal problems. If the PI has only minimal involvement in the study or a poor working relationship with site staff, Brust said, it pays to look into the reasons why and the resulting consequences.
“Observe the relationship,” Brust advised monitors, “because it could be bad, or if it’s too good that is also a sign something is wrong.”
Meeder told of one study in which a Spanish-speaking patient was enrolled without being provided with a Spanish-language informed consent form. When questioned, the coordinator said she had provided the “gist” of the document for the patient’s family. Meeder reported the problem to the PI, only to be told she was “overreacting.”
Ultimately, it is up to trial personnel to use their common sense to pick out and escalate innocent ignorance issues before they turn into flat-out fabrication.
“Be brave in clinical research,” said Meeder. “Because we protect research participants, you have to stand up and say something when things go wrong.”
By Colin Stoecker