Data quality within the clinical research enterprise can be defined as the absence of errors that matter, together with assurance that the data are fit for their intended purpose.
Source document verification (SDV) is a quality control method commonly used in clinical research. It is disproportionately expensive, delivers questionable benefits, and has been found in several studies to be of low efficiency.
Mitchel et al. analyzed the impact of SDV and queries in a study utilizing direct data entry at the time of the clinic visit. The results showed that only 3.9% of forms were queried by clinical research associates and only 1.4% of forms had database changes as a result of queries.
One of the most recent retrospective multistudy analyses revealed that only 3.7% of eCRF (electronic case report form) data are corrected after initial entry by site personnel. Another publication, analyzing the magnitude of data modifications across 10,000 clinical trials, found that less than 3% of data were modified from the original entry.
Additional results demonstrate that errors and error corrections have minimal impact on study results and conclusions, with the impact diminishing as study size increases. Furthermore, these analyses suggest that, on average, SDV of less than 8% of data is adequate to ensure data quality.
These results have led to a growing consensus that risk-based approaches to monitoring, focused on the risks to the most critical data elements and processes necessary to achieve study objectives, are more likely to ensure subject protection and overall study quality than routine visits to all clinical sites and 100% data verification.
Bioforum, in alliance with The Center for Academic Studies in Or Yehuda, analyzed data from four studies spanning different clinical development phases and therapeutic areas to evaluate whether the conclusions about site monitoring also apply to data management processes. The evaluation focused on data validation achieved through programmed data checks (edit checks) and manual data review.
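To make the distinction concrete, a programmed edit check is a rule applied automatically to each submitted eCRF record, firing a query when a value is out of range, inconsistent, or missing. The following minimal sketch illustrates the idea; the field names, record layout, and plausibility limits are illustrative assumptions, not taken from the studies cited.

```python
# Minimal sketch of programmed edit checks on a single eCRF record.
# Field names and limits below are illustrative assumptions.

def edit_checks(record):
    """Return a list of query texts for one eCRF record (a dict)."""
    queries = []

    # Range check: systolic blood pressure plausibility window (assumed limits).
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append(f"Systolic BP {sbp} outside plausible range 60-250 mmHg")

    # Consistency check: visit date must not precede informed consent.
    # ISO-format date strings compare correctly as plain strings.
    consent, visit = record.get("consent_date"), record.get("visit_date")
    if consent and visit and visit < consent:
        queries.append("Visit date precedes informed consent date")

    # Missing-value check on a required field.
    if record.get("subject_id") in (None, ""):
        queries.append("Subject ID is missing")

    return queries


record = {"subject_id": "001", "systolic_bp": 300,
          "consent_date": "2023-01-10", "visit_date": "2023-01-05"}
print(edit_checks(record))  # two queries fire for this record
```

Manual data review, by contrast, covers discrepancies such rules cannot express, which is why the two sources of queries are examined separately below.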
Examination of 17,975 queries (from both programmed and manual checks) raised to sites during data management to clarify data discrepancies revealed a rate of data changes (almost 70%) significantly higher than the rates reported in studies of site monitoring processes.
Examination of query origin revealed that queries raised by programmed checks (edit checks, 79.2%) led to a significantly higher rate of changes than queries raised during manual data review.
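The tabulation behind such an analysis reduces to simple proportions: group queries by origin and compute the share that resulted in a database change. A sketch follows; the counts are invented placeholders standing in for the study data.

```python
# Sketch of a change-rate-by-origin tabulation over query records.
# The query list below uses invented placeholder data, not the study figures.
from collections import Counter

queries = [
    {"origin": "programmed", "changed": True},
    {"origin": "programmed", "changed": True},
    {"origin": "programmed", "changed": False},
    {"origin": "manual", "changed": True},
    {"origin": "manual", "changed": False},
]

totals, changed = Counter(), Counter()
for q in queries:
    totals[q["origin"]] += 1
    changed[q["origin"]] += q["changed"]  # True counts as 1

for origin in sorted(totals):
    rate = 100 * changed[origin] / totals[origin]
    print(f"{origin}: {changed[origin]}/{totals[origin]} changed ({rate:.1f}%)")
```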
Results of this analysis reflect a significant impact of data management processes on study data points and, consequently, on data quality and possibly on study results. Bioforum's Data Management department continually evolves its processes toward greater effectiveness without compromising data quality, in accordance with regulatory requirements and best practices, including Good Clinical Data Management Practices (GCDMP), ICH E6(R2), and 21 CFR Part 11. Risk-based approaches to data management processes, and specifically to programmed edit checks, are carefully examined to reduce possible negative effects.