What EMA’s New AI and Data Integrity Guidelines Signal for Clinical Research
In July 2025, the European Commission opened a public consultation on updates to EudraLex Volume 4, with revisions developed by the EMA's GMP/GDP Inspectors Working Group (GMDP IWG), in cooperation with PIC/S.
The consultation covers three documents: a revised Chapter 4 on documentation, an updated Annex 11 on computerized systems, and a new Annex 22 on artificial intelligence.
While written for Good Manufacturing Practice (GMP), these updates reflect a broader regulatory direction that will inevitably influence Good Clinical Practice (GCP). Together, they emphasize that digital innovation must be governed by transparency, validation, and accountable human oversight.
Documentation as a Living Element of Quality
The revised Chapter 4 reframes documentation as a cornerstone of quality — not a static archive, but a living, evolving element of operational integrity. Records, whether paper, digital, image, or audio, must remain complete, legible, and retrievable throughout their lifecycle. In addition, the text introduces formal expectations for data-governance systems within the quality framework and calls for risk-based controls to safeguard accuracy and authenticity.
For clinical environments, this thinking is familiar. ICH E6 (R3) carries the same expectation: data must be both reliable and reconstructable, and integrity should be built into every process rather than proven after the fact. Together, these frameworks reflect a shift toward proactive documentation design — where trust is built through clarity, consistency, and control.
Computerized Systems: Continuous Validation and Accountability
The updated Annex 11 places Quality Risk Management at the heart of computerized-system oversight. Systems must remain validated and under control across their entire lifecycle, with defined processes for configuration, access, and data protection. Furthermore, the annex clarifies that responsibility for data integrity extends to — but cannot be transferred to — external providers.
These expectations align closely with ICH E6 (R3), which calls for proportionate oversight of computerized systems that impact data quality. Validation is no longer a one-time event; it is a continuous demonstration of control ensuring systems remain fit for purpose, secure, and capable of supporting reliable decisions.
Artificial Intelligence and Digital Decision-Making
The new Annex 22 marks the first formal regulatory framework for Artificial Intelligence in GMP-regulated operations. It defines expectations for intended-use documentation, performance validation, independent testing, explainability, and qualified human oversight. Importantly, dynamic or generative AI models are excluded from critical applications, reinforcing that consistency, reliability, and accountability remain essential.
Although developed for manufacturing, these principles are highly relevant to clinical innovation. Any AI that influences regulated decisions — such as document review, data classification, or quality oversight — should operate within a transparent and governed framework. These expectations align with Just in Time GCP’s AI Credibility Charter, which holds that AI supports decision-making rather than replaces it, and that every output must be explainable, traceable, and subject to qualified human review.
What This Means for Clinical Quality
Across all three documents, several themes resonate strongly within the clinical-research landscape:
Data integrity is technology-independent.
Every record, regardless of format, must remain complete, legible, and protected from alteration or loss.
Validation is continuous.
Oversight and documentation must demonstrate control throughout each system’s lifecycle.
Risk drives proportionate control.
Oversight should align with impact on subject safety and data reliability.
Human judgment remains central.
Automation can enhance efficiency, but qualified professionals remain accountable for interpretation and outcomes.
These principles mirror ICH E6 (R3) — emphasizing critical-to-quality factors, risk-informed oversight, and integrated controls across people, process, and technology. Together, they point to a future where innovation and integrity advance together.
Looking Ahead
The EMA consultation represents more than an administrative revision. It signals a shared regulatory understanding: that credibility in digital systems depends not on complexity, but on transparency, validation, and human accountability.
For clinical research, this creates both clarity and opportunity. By embedding these principles today, sponsors can modernize digital oversight while maintaining alignment with emerging expectations. Ultimately, the future of clinical compliance will favor organizations that demonstrate not only what technology can achieve, but how consistently and responsibly it performs under scrutiny.
Because true innovation doesn’t replace expertise — it amplifies it.