
3 Key Elements to Data Security and Quality Control for Pharma Labs



Data integrity in the drug industry is vital to ensuring the safety, efficacy, and quality of drugs. Data integrity has remained an industry buzzword for many years, but the experience the industry gained from the shortened timelines of COVID-19 vaccine development and manufacturing has further strengthened its importance.

Improvements in manufacturing efficiency can be directly correlated with advances in the technology that supports those processes. Manufacturers who were previously considering partial digitalization of processes are now going full speed ahead with full-scale digitalization efforts in the areas of manufacturing execution systems (MES), quality management systems (QMS), laboratory information management systems (LIMS), and electronic batch record (EBR) management. The 2020 International Society for Pharmaceutical Engineering (ISPE) Good Practice Guide: Data Integrity by Design highlights that building data integrity into the core of any process ensures that quality and efficiency follow as outputs. This occurs naturally because more critical thinking is required during the planning and requirements phases of any digitalization effort.

An abundance of laboratory and manufacturing instrumentation is available to the industry, but no single data management and security solution accommodates all applications. Whatever solution is selected, key elements of data integrity must be prioritized.

Various regulatory agencies and industry consortiums have established guidelines for data management and security. Below is a list of data integrity-relevant agency and industry-related references:

  • World Health Organization (WHO) TRS 996, Annex 5: Guidance on Good Data and Record Management Practices, 2016.
  • ISPE GAMP Records and Data Integrity, 2017.
  • Medicines and Healthcare products Regulatory Agency (MHRA) ‘GXP’ Data Integrity Guidance and Definitions, March 2018.
  • U.S. Food and Drug Administration (FDA) Guidance: Data Integrity and Compliance With Drug CGMP, 2018.
  • ISPE GAMP RDI Good Practice Guide: Data Integrity by Design, November 2020.
  • Pharmaceutical Inspection Co-operation Scheme (PIC/S) Guidance: Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments, July 2021.

Although regulations and guidelines are in place, the formula for ensuring data integrity remains people, processes, and technology. These key tenets must be aligned through training and validation. The behavioral controls that create a culture of quality can be implemented in many fashions that coincide with other motivational workplace programs. This article highlights the data management and technology aspects of ensuring data integrity.

1. Data Integrity Defined

The most recent PIC/S Good Practices for Data Management guidance expands on the definition of data integrity from the 2018 MHRA definition:

Data Integrity is defined as “the degree to which data are complete, consistent, accurate, trustworthy, and reliable and that these characteristics of the data are maintained throughout the data life cycle.”

The data should be collected and maintained in a secure manner, so that they are attributable, legible, contemporaneously recorded, original (or a true copy), and accurate (ALCOA). Assuring data integrity requires appropriate quality and risk management systems, including adherence to sound scientific principles and good documentation practices.

Additionally, data management was expanded to bolster the importance of ALCOA+ principles as shown below:

Data management refers to all those activities performed during the handling of data including but not limited to data policy, documentation, quality and security. Good data management practices influence the quality of all data generated and recorded by a manufacturer. These practices should ensure that data is attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available. While the main focus of this document is in relation to GMP/GDP expectations, the principles herein should also be considered in the wider context of good data management such as data included in the registration dossier based on which API and drug product control strategies and specifications are set.

2. Data Management Policies

The MHRA stated in its guidance that, “The data governance should be integral to the pharmaceutical quality system.” The approach to managing data should be commensurate with risks to patient safety and product quality in the event of a data integrity lapse. In order to manage data effectively, companies will need to develop and implement policies, procedures, training curricula, and validation guidelines surrounding data management and security.

  • Policies should guide companies to consider instruments with applications that have basic data integrity features such as unique user accounts and passwords, protection from overwriting/modifying/deleting records, user level security groups, inactivity timeouts, password expiration intervals, user lockout mechanisms, electronic signatures (as applicable), and audit trails. Management of these features should be included in the policies and standard operating procedures. In systems that lack the security features mentioned above, policies and procedures are essential to maintain a compliant state. Policies should also ensure the separation of duties between users and administrators.
  • Record retention policies must be in place to govern the life of both paper and electronic data. The procedures for each instrument system should dictate whether data can be retained in the original system or in an appropriate archive, ensuring that records are protected from deliberate or inadvertent modification or loss. Backup systems for electronic data should make it possible to create a true copy of data, including relevant metadata and audit trail for review. It is important to keep in mind that “paper-based and electronic data record-keeping systems are subject to the same requirements” according to the FDA’s recommendations.
  • Procedures should govern the collection and reporting of electronic data by data management systems. Well-designed automated reports can reduce the effort required to review data for integrity. Data collection and reporting systems must be configured to allow for the reconstruction of the data generated.
  • Procedures must exist to review data periodically in order to prevent and detect data integrity lapses and to verify the accuracy, completeness, and truthfulness of data and metadata. Routine data review should include a documented audit trail review (see the sketch following this list). The frequency of audit trail review should be determined by data criticality and risk assessment.
  • Policies should be in place to train employees on the importance of data integrity principles and identification of data integrity lapses. Companies should create a working environment that encourages reporting errors or aberrations through the quality system.
  • Validation procedures and guidelines should exist in order to ensure that data generated from the instrument systems are secure and maintained in accordance with company and industry guidelines. If the same system is used to perform both CGMP and non-CGMP functions, procedures should guide mitigation of any associated risks.
  • Company policies should address data management throughout the life cycle and data process flowcharts should be established to track the flow of data in order to comply with the principles of data integrity.
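
As a concrete illustration of the audit trail review bullet above, here is a minimal Python sketch that scans a hypothetical CSV export of an instrument audit trail and flags events that typically warrant follow-up. The column names ("timestamp", "user", "event") and the event vocabulary are assumptions for illustration; real instrument exports will differ.

```python
import csv
from collections import Counter

# Event types that typically warrant follow-up during a documented review.
# The set below is an assumption; tailor it to the instrument's export.
FLAGGED_EVENTS = {"delete", "modify", "overwrite"}

def review_audit_trail(csv_path):
    """Summarize an exported audit trail and collect events needing review."""
    totals = Counter()
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            event = row["event"].strip().lower()
            totals[event] += 1
            if event in FLAGGED_EVENTS:
                flagged.append((row["timestamp"], row["user"], event))
    return totals, flagged

if __name__ == "__main__":
    totals, flagged = review_audit_trail("audit_trail_export.csv")
    print("Event counts:", dict(totals))
    for timestamp, user, event in flagged:
        print(f"REVIEW: {timestamp} {user} performed '{event}'")
```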

3. Data Security Implementation

Data storage locations must be secured to prevent data from being saved to unauthorized file storage locations, including removable devices. One way of securing data is to apply local security groups or Active Directory groups to the data storage folder so that only users in certain security groups have permission to access the folder. It is good data integrity practice for system administrators to remove delete and modification permissions from folders containing original CGMP/GLP raw data. In certain software applications, removing delete and modification permissions through the operating system will prevent the application from saving data to the folder. In such cases, third-party software applications or Windows scripting can provide solutions for restricting access to the folder without modifying Windows New Technology File System (NTFS) permissions.
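
As a first-pass check on such restrictions, the following is a minimal Python sketch that walks a raw-data folder and reports any files the current account can still modify. The folder path is hypothetical, and note that on Windows, os.access reflects the file's read-only attribute rather than the full NTFS ACL, so this supplements rather than replaces an administrator's ACL review.

```python
import os

RAW_DATA_DIR = r"D:\Instrument\RawData"  # hypothetical raw-data location

def find_writable_files(root):
    """Return raw-data files the current user could still modify."""
    writable = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # On Windows this checks the read-only attribute, not NTFS ACLs.
            if os.access(path, os.W_OK):
                writable.append(path)
    return writable

if __name__ == "__main__":
    for path in find_writable_files(RAW_DATA_DIR):
        print("WARNING: still modifiable:", path)
```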

Audit trails are a key element in detecting and preventing the manipulation of records. Sound data integrity practice involves retaining the audit trail and all relevant metadata that support GMP/GLP processes and data reporting. Ideally, software applications for GMP and GLP environments should be compliant with both 21 CFR Part 11 and Annex 11.
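
One common technique for making an audit trail tamper-evident is hash chaining, in which each entry's hash covers the previous entry's hash, so that editing or deleting a line breaks the chain. The Python sketch below illustrates the idea; it is illustrative only, not a substitute for a validated, Part 11-compliant application.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_entry(log_path, user, action):
    """Append a hash-chained entry; each hash covers the previous one."""
    try:
        with open(log_path, "r", encoding="utf-8") as f:
            prev_hash = json.loads(f.readlines()[-1])["hash"]
    except (FileNotFoundError, IndexError):
        prev_hash = "0" * 64  # genesis value for a brand-new log
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "prev": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True)
    entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

def verify_chain(log_path):
    """Recompute every hash; report the first break or alteration found."""
    prev_hash = "0" * 64
    with open(log_path, encoding="utf-8") as f:
        for n, line in enumerate(f, 1):
            entry = json.loads(line)
            stored = entry.pop("hash")
            if entry["prev"] != prev_hash:
                return f"chain broken before entry {n}"
            payload = json.dumps(entry, sort_keys=True)
            if hashlib.sha256(payload.encode()).hexdigest() != stored:
                return f"entry {n} was altered"
            prev_hash = stored
    return "chain intact"

append_entry("audit.log", "spatel", "result approved")
print(verify_chain("audit.log"))
```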

The following examples explain common and unique situations for handling data storage and backup/archiving for data retention.

Standalone Systems

Standalone systems generally are instruments that either have built-in firmware or computer workstations that are not connected to a network. One advantage of standalone systems is that they are inherently protected from network hacking or intrusion. For such systems, data is stored in the instrument firmware or on the local hard drive of the computer workstation.

To support the backup of data from the local hard drive, numerous products on the market can create full system backup images of the local hard drive. For example, data can be transferred from a non-networked PC to a networked PC using a bridged USB cable or a crossover network cable.
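
Once data reaches a networked machine, the "true copy" expectation still applies. The following is a minimal Python sketch, with hypothetical paths, that copies a file and verifies the copy bit-for-bit against the original using SHA-256 checksums before the transfer is considered complete.

```python
import hashlib
import shutil

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large data files don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def make_true_copy(source, destination):
    """Copy a file, preserving timestamps, and verify it is bit-identical."""
    shutil.copy2(source, destination)  # copy2 keeps modification times
    if sha256_of(source) != sha256_of(destination):
        raise IOError(f"copy verification failed for {source}")

# Hypothetical paths for illustration.
make_true_copy(r"C:\Data\run_0815.raw", r"\\fileserver\archive\run_0815.raw")
```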

Network Servers

A network server can be used as a central data repository for networked instrument workstations. In this arrangement, data is saved directly from the application installed on a local workstation to secure directories in the server environment. It is critical to ensure that the network server is secured with appropriate access controls that are managed by company policies. A secondary backup of the network server should be in place, and a risk assessment should determine the frequency of backups.
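
One way to monitor such a backup is to compare the newest file on the primary share against the newest file on the backup share and alert when the backup lags beyond the window the risk assessment allows. The Python sketch below does exactly that; the share paths and the 24-hour threshold are hypothetical.

```python
import os

PRIMARY = r"\\server\instrument_data"        # hypothetical primary share
BACKUP = r"\\backupserver\instrument_data"   # hypothetical backup share
MAX_LAG_HOURS = 24  # example threshold from a risk assessment

def newest_mtime(root):
    """Most recent file modification time under a directory tree."""
    newest = 0.0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            newest = max(newest, os.path.getmtime(os.path.join(dirpath, name)))
    return newest

if __name__ == "__main__":
    lag_hours = (newest_mtime(PRIMARY) - newest_mtime(BACKUP)) / 3600
    if lag_hours > MAX_LAG_HOURS:
        print(f"ALERT: backup lags primary by {lag_hours:.1f} hours")
    else:
        print("Backup is within the accepted window")
```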

If using a cloud storage solution, service level agreements (SLAs) and Service Organization Control (SOC) 1/2 reports must be reviewed to ensure proper controls are in place. Additional data security controls can be identified through the recommendations of the Center for Internet Security (CIS) and its tools for benchmarks, controls, and standards.

Networked Database Server

A networked database server runs a database management system (DBMS), which controls access to data, defines data types, and allows searching of information and computation of derived information. Compatible software applications are designed to store proprietary and non-proprietary data using database management languages. A DBMS provides a framework for enforcing data privacy and security. Multiple systems can store data in one database or in multiple instances of the database on the same server. This solution is ideal for storing vast amounts of data and helps end users share data quickly and efficiently. Administration of database servers is vital to the security of the data; only authorized administrators should manage these databases. It is essential to back up the primary database for the system. The database backup duplicates the database instance, providing a recovery path in case of a primary database crash, corruption, or loss. The backed-up instance can be restored on the database server in accordance with company procedures and guidelines.
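
The sketch below illustrates the duplicate-the-instance idea using Python's built-in sqlite3 backup API. SQLite stands in for a full DBMS here, and the file names are hypothetical; production database servers (Oracle, SQL Server, and so on) ship their own backup utilities, but the principle of copying the live instance is the same.

```python
import sqlite3

def backup_database(source_path, backup_path):
    """Create an online duplicate of a live database instance."""
    src = sqlite3.connect(source_path)
    dst = sqlite3.connect(backup_path)
    try:
        # Connection.backup copies every page and is safe while in use.
        src.backup(dst)
    finally:
        src.close()
        dst.close()

# Hypothetical file names for illustration.
backup_database("lims_results.db", "lims_results_backup.db")
```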

Backup of a Networked PC or Local Database

For application software that can only store data to the local hard drive or a local database, migration solutions can be implemented. For example, Windows Task Scheduler can automatically execute Visual Basic Scripts (VBScript) that copy data from the local hard drive to the network server for backup, as sketched below. It is good practice to implement secondary backup solutions for all GMP/GLP data.
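
The same pattern can be sketched in Python rather than VBScript. The script below mirrors new files from a local data folder to a network share without overwriting anything already backed up, and could be registered with Windows Task Scheduler to run on a schedule. All paths are hypothetical.

```python
import os
import shutil

LOCAL_DATA = r"C:\InstrumentData"                          # hypothetical
NETWORK_SHARE = r"\\fileserver\lab_backup\InstrumentData"  # hypothetical

def mirror_new_files(source_root, dest_root):
    """Copy files that do not yet exist at the destination (no deletions)."""
    for dirpath, _dirnames, filenames in os.walk(source_root):
        relative = os.path.relpath(dirpath, source_root)
        target_dir = os.path.join(dest_root, relative)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            target = os.path.join(target_dir, name)
            if not os.path.exists(target):  # never overwrite backed-up data
                shutil.copy2(os.path.join(dirpath, name), target)

if __name__ == "__main__":
    mirror_new_files(LOCAL_DATA, NETWORK_SHARE)
```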


Samir Patel

Samir Patel has over 20 years of multifaceted experience in the life sciences industry, including leading consulting projects that range from recent CSA implementation initiatives and IT-based data integrity programs for lab systems operations to the implementation and validation of numerous lab systems and enterprise applications. As one of the founding members of Sequence, he has helped transform the company from a four-person validation team into a multi-services company with over 200 consultants serving companies globally. In his current role, Samir is largely accountable for leading projects for digital systems and partnership services development.

