I recently attended the Data Integrity Validation conference, Aug. 14-15, in Waltham, Massachusetts, hosted by the Institute of Validation Technology (IVT). Having done a little homework and being able to recite the components of ALCOA (attributable, legible, contemporaneous, original, accurate) without looking, I felt like I had a pretty good understanding of data integrity — cue derisive coughs and snickers. After the first 10 minutes, it became apparent how sorely mistaken I was. Overall, the conference was extremely valuable and featured a lineup of highly knowledgeable presenters. Here are my top five highlights of the conference.
A funny thing about people is they like to purchase products that work. Backtrack through a product’s development life cycle and you’ll find that adherence to data integrity best practices is a big part of making products that function as promised. With health care-related products, data integrity is particularly critical because it contributes to product trustworthiness and patient safety.
To that end, the following are some of the key Good Practice (GxP) guidelines surrounding data integrity:
Data integrity regulations are documented in 21 CFR Parts 211 and 212. Noncompliance with any of the regulations, intentional or otherwise, leads to things like Form 483s, warning letters and, of course, headlines:
During a workshop facilitated by Chinmoy Roy, industry consultant, ValGenesis Inc., the concept of data integrity was discussed, dissected and debated in an effort to identify where processes break down and incidents occur. A detailed whiteboard exercise revealed some of the root causes of data integrity issues:
The facilitator concluded the session by discussing some starting points for improving data integrity processes. Topping the list of action items was establishing a strong culture of quality and compliance with full buy-in and participation from every person in the company. This includes defining clear expectations of all organizational staff, empowering employees to speak up, establishing stricter risk assessment controls, and keeping abreast of regulatory trends and guidelines.
“Of course we trust you, so we don’t need to see all your data records,” said no regulatory agency, ever. Here are just a few of the less-than-adequate data management practices observed by inspectors that become the stuff warning letters are made of:
A few of the presenters addressed audit trails as a way to identify and fill data integrity gaps. By definition, an audit trail is a secure, computer-generated, time-stamped electronic record of the application processes that run within a computerized system and of the activity of the system’s users. In simpler, less verbose terms, total transparency in data management is what we’re going for.
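To make that definition a little more concrete, here is a minimal Python sketch of the kind of who/what/when/why information an audit trail entry typically captures, stored in an append-only log. The field names and classes are my own illustrative assumptions, not any particular system’s schema or anything prescribed by the regulations.

```python
# Hypothetical sketch of an append-only audit trail entry (illustrative only).
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional
import json


@dataclass(frozen=True)
class AuditEntry:
    user_id: str             # who performed the action (attributable)
    action: str              # e.g. "create", "modify", "delete"
    record_id: str           # which record was touched
    old_value: Optional[str] # value before the change, if any
    new_value: Optional[str] # value after the change
    reason: str              # why the change was made
    timestamp: str = field(  # contemporaneous, system-generated
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


class AuditTrail:
    """Append-only log: entries can be added and read, never edited in place."""

    def __init__(self) -> None:
        self._entries: list[AuditEntry] = []

    def log(self, entry: AuditEntry) -> None:
        self._entries.append(entry)

    def export(self) -> str:
        return json.dumps([asdict(e) for e in self._entries], indent=2)


trail = AuditTrail()
trail.log(AuditEntry("jdoe", "modify", "batch-042",
                     old_value="pH 6.8", new_value="pH 7.0",
                     reason="transcription error corrected"))
print(trail.export())
```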
Audit trails are a good way to avoid running afoul of data integrity compliance. However, a full audit trail review can be time-consuming and costly. One tactic to accelerate the process and make it more economical is implementing a risk-based approach. While this practice streamlines the audit trail review, it still works to ensure data integrity compliance. Some of the risk considerations you can focus on include:
With this method you can evaluate and act on issues based on the level of risk: high, medium and low.
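As a rough illustration of what that triage might look like, here is a small Python sketch that sorts audit trail events into high, medium and low risk so reviewers can focus their effort accordingly. The event types and risk rules are illustrative assumptions on my part, not drawn from any regulation or from the conference material.

```python
# Hypothetical risk-based triage of audit trail events (illustrative rules only).
HIGH_RISK_ACTIONS = {"delete", "overwrite", "disable_audit_trail"}
MEDIUM_RISK_ACTIONS = {"modify_result", "reprocess"}


def risk_level(action: str, gmp_relevant: bool) -> str:
    """Assign a high/medium/low risk level to one audit trail event."""
    if not gmp_relevant:
        return "low"
    if action in HIGH_RISK_ACTIONS:
        return "high"
    if action in MEDIUM_RISK_ACTIONS:
        return "medium"
    return "low"


events = [
    {"action": "delete", "gmp_relevant": True},
    {"action": "view", "gmp_relevant": True},
    {"action": "modify_result", "gmp_relevant": False},
]

# Review high-risk events individually; sample or summarize the rest.
for event in events:
    event["risk"] = risk_level(event["action"], event["gmp_relevant"])
    print(event)
```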
Effective implementation of audit trails begins by developing and documenting policies that give clear direction about compliance to current good manufacturing practices (CGMP). These policies are accompanied by employee training, corrective and preventive action (CAPA) and prompt remediation activities.
It didn’t really take attending a conference on data integrity to learn that computer system validation (CSV) is not a favorite pastime of life sciences companies. For most manufacturers of health care products, CSV is the very definition of a pain point. Nevertheless, it is necessary for complying with another regulatory acronym: computer software assurance (CSA) for manufacturing and quality system software.
In addition to protecting public health, the FDA is also responsible for advancing public health. The agency pursues this objective by helping speed innovations that make medicines and therapies more effective, safer and more affordable. However, this endeavor tends to be difficult when life sciences companies are hesitant to adopt digital technologies.
According to the FDA’s Francisco (Cisco) Vicenty, program manager, Case for Quality at FDA’s Center for Devices and Radiological Health (CDRH), “The Medtech industry’s high focus on meeting regulatory requirements versus adopting best quality practices has the potential to increase risk to patients. This compliance-centric approach has resulted in quality issues and has hampered innovation in manufacturing and product development practices.”(2)
In one session, a panel of quality and technology experts offered attendees an incentive to not hide under the bed when the notion of a technology upgrade comes up. The discussion centered around a new FDA guidance that is intended to resolve the common pain points with CSV.
Spoiler alert: some of the changes manufacturers can look forward to include:
A few of the approaches to software assurance pulled from the actual guidance include:
Consider the relatively simple and routine task of driving. For many people, this involves an occasional glance at the road while most of their attention is given to a cell phone. Recognizing an increase in unsafe driving habits, automobile manufacturers are developing technology to protect us from ourselves. More cars are rolling off the line equipped with features designed to keep us in our lane and automatically apply the brakes when we’re about to hit another car, pedestrian or storefront. The possibilities for human error are systematically being eliminated by artificial intelligence technology.
During her session titled “Cognitive Disruption: Holistic IT Solutions,” Karen Ginsbury, consultant with PCI Pharma Services, passionately advocated disruption and digitization. From atop her soapbox, she pointed out some of the reasons why data integrity remains one of the most common inspectional findings:
On the subject of automating out opportunities for human error and the temptation to “massage” data, Ginsbury also advised attendees to make way for blockchain. This technology is a distributed ledger for maintaining a permanent, tamper-proof record of transactional data. By design, blockchain is resistant to the deletion or modification of data, which can be useful for many purposes — namely audit trails. Bottom line, Ginsbury stressed that GxP-compliant automation and digital technologies will eliminate data integrity issues, and we should be using them.
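To see why a hash-chained ledger makes after-the-fact edits detectable, here is a toy Python sketch of the core idea. This is a conceptual illustration only, assuming a simple chain of SHA-256-linked blocks; it is not any particular blockchain product, nor a GxP-ready system.

```python
# Toy sketch of a tamper-evident, hash-chained ledger (conceptual only).
import hashlib
import json
from datetime import datetime, timezone


def _hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


class Ledger:
    def __init__(self) -> None:
        self.blocks: list[dict] = []

    def append(self, data: dict) -> None:
        prev_hash = _hash(self.blocks[-1]) if self.blocks else "0" * 64
        self.blocks.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data": data,
            "prev_hash": prev_hash,   # links each block to its predecessor
        })

    def verify(self) -> bool:
        """Return False if any earlier block was altered after the fact."""
        for i in range(1, len(self.blocks)):
            if self.blocks[i]["prev_hash"] != _hash(self.blocks[i - 1]):
                return False
        return True


ledger = Ledger()
ledger.append({"sample": "lot-17", "result": "pass"})
ledger.append({"sample": "lot-18", "result": "pass"})
print(ledger.verify())                        # True

ledger.blocks[0]["data"]["result"] = "fail"   # tampering with history...
print(ledger.verify())                        # ...breaks the chain: False
```

Because each block carries the hash of the one before it, quietly rewriting an old record changes its hash and breaks every link that follows, which is exactly the property that makes this design attractive for audit trails.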
It appears that the FDA is finding it necessary to lead the charge for advancing technology in the life sciences arena. In a document outlining its Software Precertification Program, the agency acknowledged that evaluating the safety and effectiveness of software as a medical device (SaMD) requires a more in-depth review process than what the current regulations prescribe. “The application of FDA’s longstanding regulatory framework to software can impede access to new and improved software-based medical products. An agile regulatory paradigm is necessary to accommodate the faster rate of development and potential for innovation in software-based products. It’s important for public health to address the distinctive aspects of digital health technology.”(3)
Data integrity is clearly a major contributor to developing quality products. However, as I learned early in the conference, ensuring the integrity of enormous amounts of data can be complex and difficult. Fortunately, the development of new technologies and best practices to simplify the processes is ongoing. In summary, here are a few tips for achieving and maintaining compliance with data integrity requirements: