GxP Lifeline

Top 5 Takeaways From IVT’s Data Integrity Validation Conference


I recently attended the Data Integrity Validation conference, Aug. 14-15, in Waltham, Massachusetts, hosted by the Institute of Validation Technology (IVT). Having done a little homework and being able to recite the components of ALCOA (attributable, legible, contemporaneous, original, accurate) without looking, I felt like I had a pretty good understanding of data integrity — cue derisive coughs and snickers. After the first 10 minutes, it became apparent how sorely mistaken I was. Overall, the conference was extremely valuable and featured a lineup of highly knowledgeable presenters. Here are my top five highlights of the conference.

#1 Quality Data Helps Build Quality Products

A funny thing about people is they like to purchase products that work. Backtrack through a product’s development life cycle and you’ll find that adherence to data integrity best practices is a big part of making products that function as promised. With health care-related products, data integrity is particularly critical because it contributes to product trustworthiness and patient safety.

To that end, the following are some of the key Good Practice (GxP) guidelines surrounding data integrity:

  • Data records are expected to be accurate, complete, intact and maintained within their original context.
  • The data companies generate and collect is used to make decisions about product quality, safety and efficacy.
  • Companies are obligated to implement strategies to detect when data is not accurate, reliable and, for that matter, truthful, and effectively mitigate potential risks to product quality and patient safety.

Data integrity regulations are documented in 21 CFR Parts 211 and 212. Noncompliance with any of these regulations, intentional or otherwise, leads to things like Form 483s, warning letters and, of course, headlines, such as the FDA threatening criminal action against Novartis over faulty data in a gene therapy application.(1)

#2 Data Governance Lacks Understanding and ... Governance

During a workshop facilitated by Chimnoy Roy, industry consultant at ValGenesis Inc., the concept of data integrity was discussed, dissected and debated in an effort to identify where processes break down and incidents occur. A detailed whiteboard exercise revealed some of the root causes of data integrity issues:

  • Company culture of fear, blame, lack of awareness and overcomplicated processes.
  • Poorly communicated and misunderstood policies and standards.
  • Inadequate internal audits or disabled audit trail functionality.
  • Data integrity not part of the design process.
  • Pressure to avoid incidents such as out-of-specification (OOS) situations that delay production.

The facilitator concluded the session by discussing some starting points for improving data integrity processes. Topping the list of action items was establishing a strong culture of quality and compliance with full buy-in and participation from every person in the company. This includes defining clear expectations of all organizational staff, empowering employees to speak up, establishing stricter risk assessment controls, and keeping abreast of regulatory trends and guidelines.

#3 Audit Trails Bridge Gaps in Data Integrity

“Of course we trust you, so we don’t need to see all your data records,” said no regulatory agency, ever. Here are just a few of the less-than-adequate data management practices observed by inspectors that become the stuff warning letters are made of:

  • Falsifying data.
  • Omitting, editing or discarding data.
  • Re-running tests (i.e., testing into compliance).
  • Not recording data in real time.
  • Back-dating data.

A few of the presenters addressed audit trails as a way to identify and fill data integrity gaps. By definition, an audit trail is a secure, computer-generated, time-stamped electronic record of the application processes that run within a computerized system and of the activity of the system’s users. In simpler, less verbose terms, total transparency in data management is what we’re going for.
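To make that definition concrete, here is a minimal sketch of what a single audit trail entry might capture. The field names are illustrative assumptions rather than a prescribed schema; a real system would also enforce access controls and make entries append-only.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)  # frozen: an entry cannot be edited after creation
class AuditTrailEntry:
    """One attributable, time-stamped record of a system action (illustrative fields)."""
    user_id: str              # who performed the action (attributable)
    action: str               # e.g., "CREATE", "UPDATE", "DELETE", "SIGN"
    record_id: str            # which data record was touched
    old_value: Optional[str]  # value before the change, if any
    new_value: Optional[str]  # value after the change, if any
    reason: str               # documented justification for the change
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # contemporaneous
    )

# Example: logging a correction to a recorded assay result
entry = AuditTrailEntry(
    user_id="analyst_042",
    action="UPDATE",
    record_id="LOT-0815/assay-result",
    old_value="98.7",
    new_value="99.1",
    reason="Transcription error corrected per SOP",
)
```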

Audit trails are a good way to avoid running afoul of data integrity requirements. However, a full audit trail review can be time-consuming and costly. One tactic for accelerating the process and making it more economical is a risk-based approach, which streamlines the review while still ensuring data integrity compliance. Some of the risk considerations you can focus on include:

  • Impact on product quality, patient safety and data integrity.
  • Critical process parameters (CPPs) that are monitored or controlled to ensure critical quality attributes (CQAs).
  • Regulatory requirements.
  • GxP relevance.
  • System functions.
  • Probability of incident.

With this method, you can evaluate and act on issues based on their level of risk: high, medium or low. A simple, illustrative triage along these lines is sketched below.
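As a rough illustration of how such a triage might work in code, the sketch below scores audit trail entries against a handful of yes/no risk flags and sorts them so the high-risk items are reviewed first. The flags, weights and thresholds are assumptions made up for the example; in practice they would come from a documented risk assessment.

```python
from typing import Dict, Iterable, List

# Illustrative risk weights; real values would come from a documented risk assessment.
RISK_WEIGHTS: Dict[str, int] = {
    "impacts_product_quality_or_patient_safety": 3,
    "touches_cqa_or_cpp": 3,
    "required_by_regulation": 2,
    "gxp_relevant": 2,
    "high_probability_of_incident": 1,
}

def risk_level(flags: Dict[str, bool]) -> str:
    """Classify an audit trail entry as high, medium or low review priority."""
    score = sum(weight for flag, weight in RISK_WEIGHTS.items() if flags.get(flag))
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

def prioritize_for_review(entries: Iterable[Dict[str, bool]]) -> List[Dict[str, bool]]:
    """Order entries so the highest-risk items are reviewed first."""
    order = {"high": 0, "medium": 1, "low": 2}
    return sorted(entries, key=lambda flags: order[risk_level(flags)])

# Example: one low-probability administrative change, one entry touching a CQA/CPP
batch = [
    {"high_probability_of_incident": True},
    {"touches_cqa_or_cpp": True, "impacts_product_quality_or_patient_safety": True},
]
print([risk_level(e) for e in prioritize_for_review(batch)])  # ['high', 'low']
```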

Effective implementation of audit trails begins with developing and documenting policies that give clear direction about compliance with current good manufacturing practices (CGMP). These policies are accompanied by employee training, corrective and preventive action (CAPA) and prompt remediation activities.

#4 Software Validation Can Actually Be Less Painful

It didn’t really take attending a conference on data integrity to learn that computer system validation (CSV) is not a favorite pastime of life sciences companies. For most manufacturers of health care products, CSV is the very definition of a pain point. Relief, however, is arriving under another regulatory acronym: computer software assurance (CSA) for manufacturing and quality system software.

In addition to protecting public health, the FDA is also responsible for advancing public health. The agency pursues this objective by helping speed innovations that make medicines and therapies more effective, safer and more affordable. However, this endeavor tends to be difficult when life sciences companies are hesitant to adopt digital technologies.

According to the FDA’s Francisco (Cisco) Vicenty, program manager, Case for Quality at FDA’s Center for Devices and Radiological Health (CDRH), “The Medtech industry’s high focus on meeting regulatory requirements versus adopting best quality practices has the potential to increase risk to patients. This compliance‐centric approach has resulted in quality issues and has hampered innovation in manufacturing and product development practices.”(2)

In one session, a panel of quality and technology experts offered attendees an incentive to not hide under the bed when the notion of a technology upgrade comes up. The discussion centered around a new FDA guidance that is intended to resolve the common pain points with CSV.

As a spoiler alert, some of the changes manufacturers can look forward to include:

  • More testing and considerably less documentation.
  • More critical thinking and risk-based, agile approaches to testing.
  • Reduced software validation testing cycle times.
  • Significant reduction in test script and tester errors.

A few of the approaches to software assurance pulled from the actual guidance include:

  • Focus on value for the organization and end users, not the auditor.
  • Leverage existing activities and supplier data. Don’t reinvent the wheel — take credit for work already done.
  • Leverage the use of process controls to mitigate risk.
  • Use computer system validation (CSV) tools to automate validation assurance tasks.
  • Use electronic data capture and record creation, as opposed to paper documentation.

#5 Digitization Eliminates Opportunities for Human Error

Consider the relatively simple and routine task of driving. For many people, this involves an occasional glance at the road while most of their attention is given to a cell phone. Recognizing an increase in unsafe driving habits, automobile manufacturers are developing technology to protect us from ourselves. More cars are rolling off the line equipped with features designed to keep us in our lane and automatically apply the brakes when we’re about to hit another car, pedestrian or storefront. The possibilities for human error are systematically being eliminated by artificial intelligence technology.

During her session titled “Cognitive Disruption: Holistic IT Solutions,” Karen Ginsbury, consultant with PCI Pharma Services, passionately advocated disruption and digitization. From atop her soapbox, she pointed out some of the reasons why data integrity remains one of the most common inspectional findings:

  • Data integrity noncompliance is rampant because companies don’t implement systems that detect and prevent unauthorized access and data manipulation.
  • We are collecting colossal amounts of data, but it’s scattered across disparate databases, and we don’t have the tools to analyze it and make it work for us.
  • Big data analytics technology enables us to uncover critical information that we can apply as knowledge for process improvement, but we’re not using anywhere near as much of that data as we could.
  • Digitized data management processes are capable of recalculating entire risk portfolios, determining root causes and detecting fraudulent behavior in minutes — long before issues arise that can negatively impact your organization.

On the subject of automating out opportunities for human error and the temptation to “massage” data, Ginsbury also advised attendees to make way for blockchain. This technology is a distributed ledger for maintaining a permanent, tamper-evident record of transactional data. By design, blockchain resists the deletion or modification of data, which makes it useful for many purposes, audit trails in particular. Bottom line, Ginsbury stressed that GxP-compliant automation and digital technologies will eliminate data integrity issues, and we should be using them.
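The tamper-evidence Ginsbury described comes from hash chaining: each record stores the hash of the record before it, so altering any historical entry breaks every link that follows. The sketch below is a deliberately simplified, single-node illustration of that idea, not any particular blockchain product, and it omits the distributed consensus a real deployment would add.

```python
import hashlib
import json
from datetime import datetime, timezone
from typing import Any, Dict, List

def _digest(block: Dict[str, Any]) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class HashChainedLedger:
    """Append-only record list; editing any past entry invalidates the chain."""

    def __init__(self) -> None:
        self.chain: List[Dict[str, Any]] = []

    def append(self, data: Dict[str, Any]) -> None:
        block = {
            "index": len(self.chain),
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data": data,
            "prev_hash": self.chain[-1]["hash"] if self.chain else "GENESIS",
        }
        block["hash"] = _digest(block)  # hash covers the fields above, incl. prev_hash
        self.chain.append(block)

    def verify(self) -> bool:
        """Return False if any block was altered or re-linked after being written."""
        for i, block in enumerate(self.chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            prev = self.chain[i - 1]["hash"] if i else "GENESIS"
            if block["hash"] != _digest(body) or block["prev_hash"] != prev:
                return False
        return True

# Tampering with an earlier record is detectable
ledger = HashChainedLedger()
ledger.append({"record": "assay-1", "result": "98.7"})
ledger.append({"record": "assay-2", "result": "101.2"})
ledger.chain[0]["data"]["result"] = "99.9"   # attempt to "massage" the data
print(ledger.verify())                        # False
```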

Technology Has Big Name Backers

It appears that the FDA is finding it necessary to lead the charge for advancing technology in the life sciences arena. In a document outlining its Software Precertification Program, the agency acknowledged that evaluating the safety and effectiveness of software as a medical device (SaMD) requires a more in-depth review process than what the current regulations prescribe. “The application of FDA’s longstanding regulatory framework to software can impede access to new and improved software-based medical products. An agile regulatory paradigm is necessary to accommodate the faster rate of development and potential for innovation in software-based products. It’s important for public health to address the distinctive aspects of digital health technology.”(3)

Ensuring Data Integrity

Data integrity is clearly a major contributor to developing quality products. However, as I learned early in the conference, ensuring the integrity of enormous amounts of data can be complex and difficult. Fortunately, the development of new technologies and best practices to simplify the processes is ongoing. In summary, here are a few tips for achieving and maintaining compliance with data integrity requirements:

  • Data integrity is a shared responsibility. It’s important to foster a company culture that reinforces quality data management.
  • Reviewing only summary reports and spreadsheets won’t catch every issue. You need to review both raw and analyzed data.
  • Implement methods to efficiently trace data and metadata.
  • Perform regular audit trail reviews.
  • Data stored for extended periods can deteriorate over time. Use storage, transfer and archive measures that keep data secure and prolong its usable life; a periodic checksum check like the one sketched below can help detect silent corruption.
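For that last archival point, one common, low-tech safeguard (an illustrative practice rather than anything prescribed at the conference) is to store a checksum for every archived file and re-verify the files on a schedule so deterioration is caught early:

```python
import hashlib
from pathlib import Path
from typing import Dict, List

def file_checksum(path: Path) -> str:
    """SHA-256 checksum of a file, read in chunks so large archives fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_archive(manifest: Dict[str, str], archive_dir: Path) -> List[str]:
    """Return the archived files whose current checksum no longer matches the manifest."""
    return [
        name
        for name, expected in manifest.items()
        if file_checksum(archive_dir / name) != expected
    ]
```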

References

  1. “FDA Threatens Criminal Action Against Novartis Over Faulty Data Used in Application for $2.1 Million Gene Therapy.” Retrieved from https://www.cnbc.com/2019/08/06/fda-novartis-knew-its-application-for-2point1-million-gene-therapy-included-errors.html
  2. “What Is FDA Thinking? We Asked, They Answered!” Retrieved from http://axendia.com/blog/2018/02/14/what-is-fda-thinking-we-asked-they-answered/
  3. U.S. Food and Drug Administration (FDA), “Developing Software Precertification Program: A Working Model.” Retrieved from https://www.fda.gov/media/113802/download


David Jensen is a content marketing specialist at MasterControl, where he is responsible for researching and writing content for web pages, white papers, brochures, emails, blog posts, presentation materials and social media. He has over 25 years of experience producing instructional, marketing and public relations content for various technology-related industries and audiences. Jensen writes extensively about cybersecurity, data integrity, cloud computing and medical device manufacturing. He has published articles in various industry publications such as Medical Product Outsourcing (MPO) and Bio Utah. Jensen holds a bachelor’s degree in communications from Weber State University and a master’s degree in professional communication from Westminster College.

