Surge of Data Integrity Violations Irritating the FDA



If regulatory agencies had to review every syllable of data at every regulated manufacturing facility, few products would actually get to market. This is why agencies like the U.S. Food and Drug Administration (FDA) rely on manufacturers to provide complete and accurate information in their submissions. Naturally, the FDA tends to get agitated when companies try to pass off partial, fabricated or manipulated data. The agency even threatened to pursue criminal charges when Novartis submitted faulty data in the application for its $2.1 million gene therapy.(1)

The FDA’s patience is further tried when follow-up inspections or visits to a company’s other sites result in repeated warning letters for the same infractions. Despite data integrity sitting squarely in the regulatory spotlight, it remains the most common reason for warning letters. A Deloitte report found that data integrity violations account for over 70 percent of the warning letters issued globally.(2)

This is a sign that life sciences companies petitioning for a regulatory thumbs-up on their products should plan on devoting more time, effort and even technology to data integrity.

Nonconformances

Some of the observations noted in actual warning letters include:

  • Omitting, editing or deleting data and including only passing results in the data presented for batch review.
  • Sampling or retesting to achieve a specific result or to overcome an unacceptable result.
  • Failing to retain original raw data.
  • Not recording data at the time a procedure is completed.
  • Backdating record entries.
  • Disabling audit trails.

Common Misconceptions

Regulatory organizations provide various resources with guidelines and instructions for complying with data integrity requirements. However, there still seem to be some misconceptions about the concept, including:

  • There is no need to maintain original data or the data’s history because it won’t be reviewed.
  • Auditors won’t check recycle bins.
  • Out-of-specification (OOS) reporting is merely paperwork and does not impact patient safety if not completed.
  • Violations are limited to only fraud and misrepresentation.
  • Sharing login credentials is necessary for ensuring someone is always able to move processes forward.
  • The whole industry works this way.

The FDA’s Views on the Subject

Consumers don’t have the option of reviewing active pharmaceutical ingredient (API) data, Certificates of Analysis (CoA), clinical study information or any other information involved in a product’s development. They count on the FDA to cover those bases and ensure drug products meet the requirements for quality, safety and efficacy.

To be confident that no corners were cut or data was falsified in the development of a drug product, the FDA expects manufacturers to ensure all data meets the guidelines outlined in the ALCOA acronym: attributable, legible, contemporaneous, original and accurate. This includes all metadata (data about the data) and data history as spelled out in the current good manufacturing practice (CGMP) requirements.

In a statement addressing issues regarding the quality of generic drugs, former FDA Commissioner Scott Gottlieb highlighted the following points about the agency’s efforts to ensure product safety:(3)

  • Consumers must have confidence in the quality and safety of generic medicines.
  • The agency closely analyzes reams of data to ensure the quality and safety of manufacturing throughout a product’s life cycle. When investigating possible drug safety concerns, a multidisciplinary team reviews the data.
  • The FDA takes strong compliance and enforcement actions when issues are observed.
  • Warning letters to human drug manufacturers regulated by the FDA’s Center for Drug Evaluation and Research (CDER) have steadily increased over the past four years. In fact, in fiscal year (FY) 2018, CDER issued nearly five times as many warning letters to human drug manufacturers as it did in FY 2015.

Based on these assertions, companies that are unable to demonstrate good data integrity practices can count on experiencing delays when seeking regulatory approval.

Where Data Integrity Breaks Down

Pharmaceutical manufacturing environments have a lot of moving parts, numerous personnel and a considerable amount of data throughout an extensive supply chain. In these environments, employees are expected to multitask and produce high levels of output in a short amount of time with a very narrow margin of error.

There are several components that either individually or collectively undermine an organization’s ability to effectively manage data. Human error, improperly calibrated or maintained equipment, and the lack of clear procedures for detecting and reporting issues are just a few of the culprits. Ensuring data integrity under these circumstances is a tall order.

Still, because of the high stakes involved with drug products, all data must be recorded, stored and remain traceable throughout a product’s life cycle. However, data is commonly gathered or created by multiple people using different processes. In these cases, the data is often in varying formats and spread out across several locations such as spreadsheets, paper documents and department-specific databases.

Compiling data for reporting or audit purposes often involves tracking down logbooks, sifting through stacks of bins and file folders, or clarifying data that was written on scratch paper. If there are any gaps along the way, retracing steps to identify and resolve issues can cause major delays. This scenario is a recipe for lost files, inaccurate or incomplete data sets and a plethora of data integrity violations.

Data management is already painstaking, and it gets more complicated as products become more complex. In today’s pharmaceutical manufacturing landscape, a lot more data is available, but it’s useless without the right tools to organize and analyze it.

Preserving the Integrity of Data

While there are many factors involved in an organization’s data integrity issues, a major contributor is human error. For example, many laboratory processes for determining product quality rely on a significant amount of human input and subjective assays to produce quality control data. It’s not that analysts lack knowledge or skill; it’s mostly because many processes simply allow for errors to occur.

The intent of advancing technology for CGMP purposes is to simplify processes, improve productivity and reduce opportunities for mistakes. Production timelines are always tight, and delays are extremely costly. In pharmaceutical manufacturing, automating as many tasks as possible alleviates the bottlenecks and setbacks caused by data management errors.

With so many areas of the supply chain to oversee and so many opportunities for data integrity violations, companies that automate manufacturing and data management processes are able to gain tighter control of operations. The following are examples of how organizations can use technology solutions to confidently ensure data integrity compliance:

  • Flag data entry errors, enforce out-of-specification (OOS) nonconformance thresholds and launch corrective and preventive actions (CAPA) to resolve issues. (21 CFR 211.71(c))
  • Set up role-based authentications to prevent unauthorized access and potential data manipulation. Also, track user access and changes for audit trail purposes. (21 CFR 211.68(b))
  • Store data in a central repository that is secure, forward compatible, immediately accessible and prevents deterioration. (21 CFR 212.110(b))
  • Integrate critical business software systems such as enterprise resource planning (ERP), manufacturing execution systems (MES) and laboratory information management systems (LIMS) to establish a single source of truth for all data. This helps ensure the integrity of data as it moves between systems. (21 CFR 211.68(c))
  • Organize and analyze large amounts of data to effectively identify trends, make operational decisions, eliminate duplicated efforts and waste, plan equipment usage and maintenance, etc.(4)
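To make the controls above concrete, here is a minimal, illustrative sketch in Python. It is not a production system or any vendor’s actual implementation; the specification limits, role names and record fields are hypothetical. It combines three of the listed controls in one place: role-based write access, an append-only audit trail that attributes each entry to a user and a timestamp, and automatic OOS flagging that retains failing results rather than deleting them.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical acceptance range and roles -- illustrative values only,
# not taken from any regulation or product specification.
ASSAY_LIMITS = (95.0, 105.0)          # acceptable % potency range
AUTHORIZED_ROLES = {"analyst", "qa_reviewer"}


@dataclass
class AuditEntry:
    """One attributable, contemporaneous audit-trail entry."""
    user: str
    action: str
    timestamp: str


@dataclass
class BatchRecord:
    batch_id: str
    results: List[float] = field(default_factory=list)
    audit_trail: List[AuditEntry] = field(default_factory=list)  # append-only
    oos_flags: List[float] = field(default_factory=list)

    def record_result(self, user: str, role: str, value: float) -> None:
        # Role-based access control: reject writes from unauthorized roles.
        if role not in AUTHORIZED_ROLES:
            raise PermissionError(f"role '{role}' may not record results")

        # Attributable, contemporaneous entry: who did what, and when.
        self.results.append(value)
        self.audit_trail.append(AuditEntry(
            user, f"recorded result {value}",
            datetime.now(timezone.utc).isoformat()))

        # Automatic OOS flagging: out-of-range values are flagged and
        # retained, never edited or deleted.
        low, high = ASSAY_LIMITS
        if not (low <= value <= high):
            self.oos_flags.append(value)
            self.audit_trail.append(AuditEntry(
                "system", f"OOS flagged: {value}",
                datetime.now(timezone.utc).isoformat()))
```

In use, recording a passing and a failing result leaves both values in the record, adds an OOS flag for the failing one, and writes an audit entry for every action, while a write attempted under an unauthorized role is refused before any data changes.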

Digitization is critical for any type of organization. It augments data management best practices, enables stakeholders to have more visibility and control of the entire supply chain and allows companies to make faster and more confident strategic decisions. Most of all, it effectively unifies people, production equipment and systems.

References

  1. “FDA Threatens Criminal Action Against Novartis Over Faulty Data Used in Application for $2.1 Million Gene Therapy,” CNBC. Retrieved from https://www.cnbc.com/2019/08/06/fda-novartis-knew-its-application-for-2point1-million-gene-therapy-included-errors.html
  2. “Under the Spotlight: Data Integrity in Life Sciences,” Deloitte. Retrieved from https://www2.deloitte.com/content/dam/Deloitte/uk/Documents/life-sciences-health-care/deloitte-uk-data-integrity-report.pdf
  3. “Statement from FDA Commissioner Scott Gottlieb, M.D., and Director of FDA’s Center for Drug Evaluation and Research Janet Woodcock, M.D., on the FDA’s continuing efforts to maintain its strong oversight of generic drug quality issues domestically and abroad,” U.S. Food and Drug Administration. Retrieved from https://www.fda.gov/news-events/press-announcements/statement-fda-commissioner-scott-gottlieb-md-and-director-fdas-center-drug-evaluation-and-research-0
  4. “Guidance for Industry: Computerized Systems Used in Clinical Investigations,” U.S. Department of Health and Human Services (HHS). Retrieved from https://www.fda.gov/media/70970/download



David Jensen is a content marketing specialist at MasterControl, where he is responsible for researching and writing content for web pages, white papers, brochures, emails, blog posts, presentation materials and social media. He has over 25 years of experience producing instructional, marketing and public relations content for various technology-related industries and audiences. Jensen writes extensively about cybersecurity, data integrity, cloud computing and medical device manufacturing. He has published articles in various industry publications such as Medical Product Outsourcing (MPO) and Bio Utah. Jensen holds a bachelor’s degree in communications from Weber State University and a master’s degree in professional communication from Westminster College.