Data Quality and Monitoring in Clinical Trials: Reducing Errors and Ensuring Compliance
Rich Wzorek, Director | New Products & Services | Almac Clinical Technologies
Mark Dickinson, Technical Manager | New Products & Services | Almac Clinical Technologies
Capturing usable source data is one of the most critical aspects of a clinical trial. With the global reach of many clinical trials, clinical sites can face unique challenges in ensuring the data is accurate, consistent, and fit for its intended use. Leveraging the right technology helps site users capture, validate, and aggregate trial data more effectively, offering clear advantages over manual processes.
This article illustrates how tailored reporting solutions address the unique requirements of different data sets, ensuring both regulatory compliance and operational efficiency.
Technology’s Role
Given the sheer volume of transactions that generate meaningful data over a clinical trial’s duration, validating, analysing, and aggregating that data by hand would require manual procedures executed at enormous scale. Keeping such high-risk manual processes compliant demands documented training and ongoing monitoring.
By entering transactions into validated systems that not only integrate but also interact with each other, the need for widespread manual processes is significantly reduced. Manual processes have their place but should be the exception, not the norm.
The data required for a clinical trial is often only a small subset of what the site needs to provide participants with the standard of care they deserve. As a result, site users frequently switch between multiple systems, many of which are not connected and require some degree of manual transcription between them. At best, this generates a high volume of queries and discrepancies; at worst, it causes data quality issues.
For example, if a site user transcribes lab values incorrectly, it could lead to erroneous dosing calculations and potentially even provide the incorrect dosage to a patient. In contrast, using technology to copy lab values from one system to another greatly reduces the likelihood of human error.
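To make the contrast concrete, the following is a minimal sketch of an automated, validated transfer of lab values between systems. The function names, analyte keys, and plausibility bounds are illustrative assumptions, not any specific system's API:

```python
# Sketch: automated lab-value transfer with plausibility validation,
# replacing error-prone manual transcription. All names and ranges
# here are hypothetical.

PLAUSIBILITY_RANGES = {
    "creatinine_mg_dl": (0.2, 15.0),   # broad plausibility bounds, not clinical limits
    "alt_u_l": (0.0, 5000.0),
}

def validate_lab_value(analyte: str, value: float) -> None:
    """Reject values outside plausibility bounds before they are transferred."""
    low, high = PLAUSIBILITY_RANGES[analyte]
    if not (low <= value <= high):
        raise ValueError(f"{analyte}={value} outside plausible range [{low}, {high}]")

def transfer_lab_values(source_values: dict) -> dict:
    """Copy validated values system-to-system; no manual re-keying involved."""
    validated = {}
    for analyte, value in source_values.items():
        validate_lab_value(analyte, float(value))
        validated[analyte] = float(value)
    return validated

result = transfer_lab_values({"creatinine_mg_dl": 1.1, "alt_u_l": 42.0})
```

Because the copy is programmatic and every value passes a validation gate, a transcription slip that could alter a dosing calculation is caught or prevented rather than silently propagated.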
Data Ownership
Given the need for multiple systems and different types of data, it is important to analyse not only the data itself but also how other systems utilise the information. Take, for example, a subject being screened at a site. In isolation, it’s a seemingly straightforward activity to record, but this single event can have a myriad of downstream impacts across the clinical trial. Here are some considerations:
- Who is responsible for data stewardship to ensure that the data is attributable, timely, and accurate?
- Which system “owns” that screening data and can be considered the “source of truth”?
- How and where in the workflow will the data be validated?
- How should that newly screened subject be reflected in dashboards, reports, and webpages used by site users?
- Which other systems care about the subject being screened?
The answers to these questions can help guide how data should be stored and how other systems can expect to receive that data.
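One common answer to these questions is to designate a single system of record that stores the screening event once and publishes it to downstream consumers. The sketch below illustrates that pattern under assumed names; the event shape and system roles are hypothetical:

```python
# Sketch: one "owning" system records the screening event (source of
# truth) and pushes it to subscribers (dashboards, downstream systems),
# which receive read-only copies. All names are illustrative.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass(frozen=True)  # immutable: consumers cannot modify source data
class ScreeningEvent:
    subject_id: str
    site_id: str
    screened_at: str  # ISO-8601 timestamp, supporting attributability

@dataclass
class ScreeningSource:
    """The system of record for screening data."""
    subscribers: List[Callable[[ScreeningEvent], None]] = field(default_factory=list)
    events: List[ScreeningEvent] = field(default_factory=list)

    def subscribe(self, handler: Callable[[ScreeningEvent], None]) -> None:
        self.subscribers.append(handler)

    def record_screening(self, event: ScreeningEvent) -> None:
        self.events.append(event)          # stored once, at the source
        for notify in self.subscribers:    # propagated to downstream consumers
            notify(event)

dashboard_feed: list = []
source = ScreeningSource()
source.subscribe(dashboard_feed.append)
source.record_screening(ScreeningEvent("SUBJ-001", "SITE-10", "2024-05-01T09:30:00Z"))
```

The frozen event type captures the principle that non-source systems consume screening data but never alter it, which keeps reconciliation at interim analysis or database lock straightforward.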
When designing a system, well-defined data validation requirements are essential for maintaining consistency and quality. These validations also enable sites to conduct more effective analysis of datasets, which can encompass multiple trials or even entire clinical trial portfolios. Furthermore, restricting non-source systems from modifying source data helps maintain data integrity and simplifies the process of reconciling data during key milestones such as interim analysis and database lock.
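Field-level validation can be expressed as declarative rules applied at data entry. The rules and field names below are assumptions for illustration; a production system would use its platform's edit-check framework:

```python
# Sketch: declarative field-level validation rules applied to a record
# at entry time. Field names and rule definitions are hypothetical.

import re

RULES = {
    "subject_id": lambda v: isinstance(v, str) and bool(re.fullmatch(r"SUBJ-\d{3}", v)),
    "visit_number": lambda v: isinstance(v, int) and 1 <= v <= 20,
}

def validate_record(record: dict) -> list:
    """Return the fields that fail validation (empty list = clean record)."""
    return [fld for fld, rule in RULES.items()
            if fld in record and not rule(record[fld])]

issues = validate_record({"subject_id": "SUBJ-001", "visit_number": 99})
# visit_number 99 exceeds the allowed range, so that field is flagged
```

Applying such checks at the point of entry keeps discrepancies from reaching downstream systems, where they are far more expensive to reconcile.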
Reporting
To achieve optimal outcomes, it’s crucial to design solutions that address specific reporting needs. Purpose-built tools streamline workflows and deliver precise answers, minimising manual intervention and reducing errors. Conversely, generic solutions often demand more manual effort and additional validation steps, increasing the risk of mistakes and inefficiencies.
Reporting on a trial’s data illustrates this principle in practice. Clinical sites often rely on exports or visualisations that pull information from multiple systems, and combining that data accurately is essential to producing reports that satisfy both regulatory requirements and site‑level needs. When reporting tools are designed specifically for these nuanced use cases, they can apply the right logic for each dataset and reduce the need for manual reconciliation. In contrast, presenting all data in a uniform, generalised format may appear simpler at first glance but often leads to more exceptions, more back‑and‑forth validation, and a greater burden on site users to resolve inconsistencies.
Best Practices for Data Utilisation in Clinical Trials
| Intended use of report | Sample dataset size | Utilisation of data |
| --- | --- | --- |
| Upcoming patient visits | 1,000 rows | Viewable/actionable on web app |
| Current depot inventory | 100,000 rows | Exportable to Excel for further analysis |
| Quarterly audit trail of all site-related events | 1M+ rows | Full-scale data transfer |
As shown above, the intended use, dataset size, and other problem-specific requirements can vary significantly across different types of site-level information. For instance, a quarterly audit trail export contains far more rows than an export of upcoming patient visits. A system not explicitly designed around the unique needs of each dataset can easily devolve into an improperly sized, inefficient generalised solution. Purpose-built reporting tools tailored to each scenario are more effective at delivering the desired business outcomes.
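One way to right-size reporting is to route each report to a delivery mechanism based on its expected volume and interactivity, mirroring the table above. The thresholds and channel names in this sketch are illustrative assumptions:

```python
# Sketch: selecting a right-sized delivery channel per report type.
# Thresholds and channel names are hypothetical, chosen to mirror the
# example table (web view, Excel export, bulk transfer).

def delivery_mechanism(expected_rows: int, interactive: bool) -> str:
    """Pick a delivery channel appropriate to the report's size and use."""
    if interactive and expected_rows <= 10_000:
        return "web_app_view"        # e.g. upcoming patient visits
    if expected_rows <= 500_000:
        return "excel_export"        # e.g. current depot inventory
    return "bulk_data_transfer"      # e.g. quarterly audit trail

channel = delivery_mechanism(1_000, interactive=True)
```

Encoding the sizing decision once, rather than forcing every report through one generic pipeline, avoids both overloaded web views and needlessly heavyweight transfers for small datasets.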
Conclusion
In summary, effective use of technology delivers very tangible benefits toward the all-important goal of capturing accurate and usable data across a clinical trial’s lifespan. Strong data ownership, stewardship, and validation practices are essential to maintaining data quality, and right-sizing solutions to process well-defined categories of information ensures that the data remains fit for its intended purposes, supporting both regulatory compliance and operational efficiency.