Agencies tasked with regulating and monitoring medical therapies have an enormous responsibility to the citizens of the world. They must create and enforce consumer-centric regulations without imposing a regulatory burden on manufacturers that could threaten product availability or medical progress.
The challenges they face include technology that increases in diversity and complexity at a blinding pace, supply and distribution chains that cross borders and climates, and a continually evolving global marketplace.
Given those common objectives and challenges, the benefits of decades spent undertaking initiatives to align standards and requirements, harmonizing language and methods, and establishing shared inspection programs are easy to see. However, aligning perspectives across such a wide array of technologies, cultures and concerns is not an easy task. There is some level of divergence on almost every topic.
But the more successful their efforts are, the less the burden is on the public and the private sectors, resulting in medicines getting to every market sooner, at the lowest possible cost.
But there are always many ways to view a single reality and many useful models of quality. Reaching agreement on the best possible practice is often difficult, and when viewpoints diverge, industry is not surprised.
What does get the attention of industry are the very rare times that global regulatory bodies are in complete and utter agreement. And if the initiatives that result are geared toward improvement as opposed to alignment, and the regulators reach unanimous agreement on the guidance that should be given, it can really only mean one thing: everyone, everywhere is doing something poorly.
That is where we find ourselves now: the regulators of the world are having a conversation with manufacturers about an area in which we are all underperforming. That area is establishing systems that protect and ensure the integrity of the data we produce and use, data that regulators and the public rely upon to reflect the reality of the state of our control over manufacturing and product quality.
The U.S. Food and Drug Administration (FDA) defines data integrity as “the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate.”
The FDA has been actively defining detailed expectations for data integrity since 1997, when it supplemented 21 CFR Part 211’s predicate rules on records and record keeping by publishing Title 21’s first set of regulations directly defining the rules for electronic records and electronic signatures (21 CFR Part 11). However, more than a decade passed before the agency’s first guidance document on this topic was released.
In addition to the FDA’s recently published guidance on data integrity, industry has also received formal guidance documents from The Pharmaceutical Inspection Co-operation Scheme (PIC/S), the Medicines and Healthcare products Regulatory Agency (MHRA), the European Medicines Agency (EMA) and the World Health Organization (WHO).
It’s clear that everyone is publishing on the same topic, at the same time, and they are saying the same thing: our data lacks integrity, and the regulators want that to change.
The quality and completeness of data generated by drug and medical device manufacturers is critical to regulators, manufacturers and consumers alike. Modern regulatory models aren’t based on direct sampling of final product. They are based on the assumption that a manufacturer’s claims of product quality are accurate because the systems that generate the data indicating quality are robust and in control. That data allows inspectors to render opinions on the success of the processes that produced it. These assumptions mean regulators don’t measure our products directly, and they allow us to sample in a representative manner.
This inspection model treats data as a mirror; a tool that reflects reality.
What happens to an inspection model of that kind, and the consumer protection the inspection model is expected to deliver, when the mirror is broken?
The recent flurry of guidance coming from the world’s regulators is a response to alarming trends, seen in every country, indicating that even if our mirrors are not broken, they are clearly warped. The integrity of the data is less than it should be, and regulators don’t believe it can be relied on to accurately reflect reality.
The cause of this trend isn’t clear yet. Perhaps the integrity of everyone’s data fell off a cliff at the same time, or more likely, perhaps it has taken 20 years for regulators to adapt their organizational skill sets and inspection tools from those geared toward paper records to those geared toward electronic data and records.
And if the latter is the cause, it means that for the first time since the late ‘90s, inspectors are capable of evaluating the impact of our software development, configuration, validation and implementation choices. And they don’t like what they are seeing.
The content of the guidance documents is a direct function of the deficiencies that regulators have seen in the field, and they align on most of the fundamental topics addressed.
The guidance documents provide definitions and specific answers to common questions. But for the purpose of this article, let’s focus on the following fundamental points of agreement:
Prevention Is Key
System Access and Security Must Be Controllable and Controlled
Raw Data, True Copies and Reproduction – Know the Difference
The Presence of Modified Data Must Be Impossible to Overlook, and the History Must Be Clear
System Validation Is Non-Negotiable
Predicate Rules for Data Retention and Reconciliation Matter
E-Signatures Are Optional, But If Used, Context Must Be Clear
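To make the audit-trail expectation above concrete, here is a minimal sketch in Python of a record whose original value is never overwritten and whose change history is always visible, echoing the ALCOA attributes (attributable, contemporaneously recorded, original). The class and field names are hypothetical illustrations, not drawn from any regulation or any specific system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass(frozen=True)
class AuditEntry:
    """One immutable change record: who, when, old/new value, and why."""
    user: str               # attributable
    timestamp: str          # contemporaneously recorded
    old_value: Optional[str]
    new_value: str
    reason: str

class DataPoint:
    """A value whose original entry is preserved; every change is
    appended to an audit trail, so history is impossible to overlook."""

    def __init__(self, name: str, value: str, user: str):
        self.name = name
        self._trail: List[AuditEntry] = [
            AuditEntry(user, self._now(), None, value, "original entry")
        ]

    @staticmethod
    def _now() -> str:
        return datetime.now(timezone.utc).isoformat()

    @property
    def value(self) -> str:
        return self._trail[-1].new_value   # current value

    @property
    def original(self) -> str:
        return self._trail[0].new_value    # original remains retrievable

    def amend(self, new_value: str, user: str, reason: str) -> None:
        # A documented reason is required for any modification.
        if not reason:
            raise ValueError("a documented reason is required for any change")
        self._trail.append(
            AuditEntry(user, self._now(), self.value, new_value, reason)
        )

    def history(self) -> List[AuditEntry]:
        return list(self._trail)           # complete, legible change history

# Usage: correct a transcription error without losing the original value.
dp = DataPoint("assay_result", "98.2", user="analyst1")
dp.amend("98.4", user="analyst2", reason="transcription error corrected")
print(dp.value)           # current value: 98.4
print(dp.original)        # original entry still retrievable: 98.2
print(len(dp.history()))  # two entries: original plus one amendment
```

The design choice worth noting is that the trail is append-only: a reviewer can always see both the original entry and who changed it, when, and why, which is exactly what the guidance documents ask systems to make unavoidable.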
When reviewing global regulatory guidance, the following truths become self-evident:
Data integrity is critical to the support of product quality and patient safety during the product life cycle. We must identify and understand the current gaps in our practices and implement procedural and technological controls that will not only improve the integrity of the data, but ensure that data is a true reflection of reality.
Gina Guido-Redden is a quality and regulatory professional with over 25 years of domestic and international industry experience. She is the co-founder and chief operations officer of Coda Corp USA, which provides consultancy services to pharmaceutical, biologics and medical device firms.
Guido-Redden specializes in the areas of facility start-up, regulatory compliance and remediation, quality system development, mentorship and training, quality system design, and implementation and management.
She is also a quality systems subject matter expert (SME), frequent seminar presenter, and content contributor to industry publications, including GAMP’s White Paper on Part 11, The Journal of Validation Technology, New Generation Pharmaceuticals, Computer Validation Digest, and MasterControl’s GxP Lifeline. Coda Corp USA is an enterprise partner of MasterControl.