Data Integrity: Trends, Pitfalls, Solutions, and Why DI Is Still Making Headlines


Many working within the life sciences industry might describe themselves as more than familiar with the term data integrity. Companies have had years to ensure that their critical GxP data is complete, consistent, and accurate. So why does data integrity keep popping up? Why does this term continue to linger, and relentlessly consume more and more real estate on your favorite industry news feeds? Perhaps assuring data integrity in a world with an ever-increasing amount of data and data systems is a more formidable challenge than initially expected. Based on inspection results from the U.S. Food and Drug Administration (FDA), this appears to be the case.

As a brief refresher: in the GxP environment, data integrity is defined as the completeness, consistency, and accuracy of data. Data meets this definition when it, including its metadata, is Attributable, Legible, Contemporaneously recorded, Original (or a true copy), and Accurate (ALCOA). A more in-depth definition of data integrity can be found in the FDA's latest data integrity guidance, released in December 2018.
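
To make ALCOA concrete, here is a minimal sketch, in Python, of how a single GxP record might carry its ALCOA metadata. The field names are hypothetical illustrations, not taken from the FDA guidance:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class GxpRecord:
    """One GxP data point with its ALCOA metadata (hypothetical model)."""
    value: str              # Legible: stored in a readable, durable form
    recorded_by: str        # Attributable: who created the record
    recorded_at: datetime   # Contemporaneous: captured when the activity occurred
    source: str             # Original: the primary record, or a reference to a true copy
    verified: bool = False  # Accurate: set once the value has been checked

def new_record(value: str, user: str, source: str) -> GxpRecord:
    # Timestamp at creation so the record is contemporaneous by construction.
    return GxpRecord(value=value, recorded_by=user,
                     recorded_at=datetime.now(timezone.utc), source=source)
```

The point is not these particular fields, but that every ALCOA attribute maps to something a system must actually capture and retain.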

[Graphic 1: ALCOA]

To understand why data integrity is and will remain a central area of focus for the life sciences, let's briefly do what any good investigator would: look at the history of data integrity regulations and at enforcement trends to identify the common pitfalls companies are experiencing. From there, we can discuss how to correct these issues and assure an effective data integrity program.


I. Regulations

Data integrity regulations are not new; there has not been a new regulation pertaining to data integrity since 21 CFR Part 11 took effect in 1997. Since then, the FDA has released several guidance documents detailing its current thinking on data integrity. What does this tell us? Data integrity, or the lack thereof, remains a key concern of the FDA because it continues to present an unacceptable level of risk to the public.

II. Enforcement Trends and Pitfalls

You only have to look at the number of data integrity-related observations over the past few years to understand the FDA's concern. Over the past three years, data integrity observations have grown to account for 15 percent, then 18 percent, and finally 25 percent of warning letters, and the trend is still rising.

[Graphic 2: Warning letters bar chart]

Within these warning letters we see a recurring theme: incomplete production record data, incomplete laboratory data, and deficient computer access controls regularly top the list of the most common data integrity observations. As with many observations, deficiencies are commonly exemplified with evidence from a specific system. However, we need to recognize that the deficiency of a single system may in fact reflect a deficiency in the company's overall quality management system (QMS) and data integrity assurance strategy. For this reason, and given the sheer number of data integrity-related observations, it is reasonable to conclude that companies are struggling at the quality governance level with how to assure data integrity.

III. Solutions

The following are the most effective solutions I have found for establishing and maintaining a compliant data integrity program, based on years of consulting and remediation experience. I must add that the solutions offered here should be evaluated for your business and assessed using a risk-based approach. Because all businesses are different, how you implement these suggestions will depend on the unique characteristics of your business.

  1. Recognize that data has a life cycle

    The most easily adopted solutions are those that fit into a familiar model. Take new machine design requirements, for example (i.e., new information to process and a new physical design); these requirements fit easily into existing system development models and are therefore easy to adopt. But what model does data fit into? Even though data systems may be designed with data integrity controls, data commonly moves through multiple systems as it proceeds along its life cycle. The flow of data from system to system closely resembles a model we already know well: the manufacturing process (the product life cycle).

    Just like a product being manufactured, data can move through machines as it is processed and verified. For this reason, it makes sense to assure data integrity (data quality) as you would assure the quality of the product being manufactured.
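
If it helps to make the analogy concrete, here is a minimal sketch, in Python, of a data life cycle modeled as a sequence of controlled stages, much like the routing steps of a manufacturing process. The stage names are hypothetical; your documented life cycle will define its own:

```python
from enum import Enum, auto

class Stage(Enum):
    CREATION = auto()
    PROCESSING = auto()
    REVIEW = auto()
    REPORTING = auto()
    ARCHIVAL = auto()
    END_OF_USE = auto()

# Allowed transitions: like product on a line, data moves forward
# through defined, verifiable steps.
NEXT_STAGE = {
    Stage.CREATION: Stage.PROCESSING,
    Stage.PROCESSING: Stage.REVIEW,
    Stage.REVIEW: Stage.REPORTING,
    Stage.REPORTING: Stage.ARCHIVAL,
    Stage.ARCHIVAL: Stage.END_OF_USE,
}

def advance(current: Stage) -> Stage:
    """Move a record to its next life cycle stage, rejecting undefined jumps."""
    if current not in NEXT_STAGE:
        raise ValueError(f"{current.name} is a terminal stage")
    return NEXT_STAGE[current]
```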

    Recommended Strategy

    • Take time to consider how data moves through machine and paper systems, changing as it moves from creation to end-of-use.
    • Consider your current methods for assuring quality and how they might be adopted/modified to assure the integrity of data.

       

  2. Document your GxP data flows (the data life cycle)

    Now that we recognize that data has a life cycle, we need to make sure that this life cycle is documented. In a manufacturing facility, this activity typically starts with the product batch record and then extends to other GxP data areas, including but not limited to the laboratory, maintenance, and training departments. GxP data flow documentation ultimately provides a single reference where system designers and data users can identify which data is critical and which can be considered the primary record.

    Documenting data flows also has a significant benefit unrelated to data integrity: it commonly increases efficiency by enabling companies to recognize correlations in their data that lead to process improvements.

    Recommended Strategy

    • Map GxP data flows, including each life cycle step.
    • Maintain the data flow document as a reference for system designers and data users (a minimal sketch of such a map follows).
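
As a concrete (and deliberately simplified) illustration, a documented data flow map could be represented as a set of system-to-system edges, each recording what data moves and which copy is the primary record. The system and data names below are hypothetical:

```python
# Hypothetical GxP data flow map: (source system, destination system) -> edge details.
DATA_FLOWS = {
    ("HPLC", "LIMS"): {"data": "chromatogram results", "primary_record": "LIMS"},
    ("LIMS", "QMS"): {"data": "OOS investigations", "primary_record": "QMS"},
    ("MES", "batch record"): {"data": "process parameters", "primary_record": "MES"},
}

def primary_record_for(data_name: str) -> str:
    """Look up which system holds the primary record for a given data type."""
    for edge in DATA_FLOWS.values():
        if edge["data"] == data_name:
            return edge["primary_record"]
    raise KeyError(f"{data_name} is not in the documented data flow map")

print(primary_record_for("chromatogram results"))  # -> LIMS
```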

       

  3. Formally plan your data integrity effort

    An effort to implement a data quality system has more in common with making a cultural change than with completing a short project. For this reason, a plan must be created and approved to maintain the project's momentum and detail the steps you will take to assure data integrity. Put yourself in the regulator's position: would you have a more favorable impression of a company progressing toward the completion of a documented plan, or of one that simply says, "We are dedicated to data integrity" yet has no plan?

    When creating your plan, carefully consider your current state, the data integrity status of existing systems, and the scope of your data integrity effort. Lack of clarity in these areas commonly leads to problems.

    Recommended Strategy

    • Formally plan your data integrity project, detailing how all existing and newly installed systems will be controlled to assure data integrity.
    • Plan to update your QMS to assure that all new systems are installed and maintained such that data integrity is controlled. This governance requires:
      • Specification of which data is GxP data.
      • Specification of GxP data life cycles (typically by data type).
      • Assurance that the data integrity risks of any GxP system are sufficiently remediated using appropriate quality controls (e.g., testing and maintenance).
    • Plan to assess existing systems and remediate any found to have insufficient data integrity controls (a simple assessment sketch follows this list).
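
As a sketch of what a risk-based assessment of existing systems might check, the snippet below compares a system's controls against a simple checklist and reports the gaps. The control names are hypothetical examples; your QMS should define the actual requirements:

```python
# Hypothetical data integrity control checklist for assessing an existing system.
REQUIRED_CONTROLS = {"audit_trail", "unique_user_accounts", "access_controls",
                     "backup_and_restore", "time_synchronization"}

def assess(system_name: str, controls_in_place: set[str]) -> set[str]:
    """Return the controls a system is missing; an empty set means no gaps found."""
    gaps = REQUIRED_CONTROLS - controls_in_place
    if gaps:
        print(f"{system_name}: remediation needed for {', '.join(sorted(gaps))}")
    return gaps

# Example: a legacy chromatography workstation with shared logins.
assess("HPLC-03", {"audit_trail", "backup_and_restore"})
```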


Fitting data integrity controls into an existing GxP quality system is difficult. We are accustomed to implementing non-process requirements by incorporating them into a machine specification, but data is different: it must be assured in and across machines, as well as in and across procedures. This is why it is critical that we start to treat data as the product. Only by taking this step can we recognize that the process quality knowledge we already possess can also be used to assure data integrity. Until that happens, data integrity is unlikely to yield its position as a GxP hot topic any time soon.



 

Matt Brawner is a data integrity subject matter expert (SME) and Director of Sales Execution at Sequence in Morrisville, North Carolina. With a passion for helping others achieve success in their data integrity endeavors, he provides data integrity training and consulting when not working to improve Sequence's sales execution process. Brawner has over 18 years of experience working with drug and medical device companies within the biopharmaceutical industry. As a Lean/Six Sigma Black Belt who has held various roles in quality, engineering, automation, IT, manufacturing, and organizational excellence, he strives to deliver innovative and high-value quality solutions. Brawner was most recently recognized for his leadership in a successful regulatory action remediation, as well as for consolidating quality management systems following a merger between two global companies. He attributes his success to the capable teams with which he has worked. Brawner can be reached at mbrawner@sequenceqcs.com.

 


 
