7 Key Elements to Data Security and Quality Control for Pharma Labs

In recent years, the U.S. Food and Drug Administration (FDA) has observed several current good manufacturing practice (CGMP) violations involving data integrity during inspections. Data integrity is a vital element in ensuring the safety, efficacy and quality of drugs.

The purpose of this article is to introduce key elements of data management and security for data generated by good manufacturing practice (GMP)/good laboratory practice (GLP) instruments used in the drug industry. Given the abundance of laboratory and manufacturing instrumentation available to the industry, no single data management/security solution can accommodate every application.

Various regulatory agencies have established guidelines for data management and security. Below is a list of agencies and the corresponding requirements:

  • Medicines and Healthcare products Regulatory Agency (MHRA) GMP Data Integrity Definitions and Guidance for Industry, March 2015. Revised draft GXP Data Integrity Definitions and Guidance for Industry, July 2016.
  • Pharmaceutical Inspection Convention and Pharmaceutical Inspection Cooperation Scheme (PIC/S) Draft Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments, August 2016.
  • World Health Organization (WHO) TRS 996 Annex 5 - Guidance on good data and record management practices, 2016.
  • U.S. Food and Drug Administration (FDA) Guidance - Data Integrity and Compliance with Drug cGMP, 2018.

Data Management Policies

The MHRA stated in their guidance that “The data governance should be integral to the pharmaceutical quality system.” The approach to managing data should be commensurate with risks to patient safety and product quality in the event of a data integrity lapse. In order to manage data effectively, companies will need to develop and implement policies, procedures, training curricula and validation guidelines surrounding data management and security.


  • Policies should guide companies to consider instrument system applications with basic data integrity features such as unique user accounts and passwords, protection from overwriting/modifying/deleting records, user level security groups, inactivity timeouts, password expiration intervals, user lockout mechanisms, electronic signatures (as applicable) and audit trails. Management of these features should be included in the policies and standard operating procedures. In systems that lack the security features mentioned above, policies and procedures are essential to maintain a compliant state.
  • Record retention policies must be in place to govern the life of both paper and electronic data. The procedures for each instrument system should dictate whether data can be retained in the original system or in an appropriate archive, ensuring that records are protected from deliberate or inadvertent modification or loss. Backup systems for electronic data should make it possible to create a true copy of data, including relevant metadata and audit trail for review. It is important to keep in mind that “paper-based and electronic data record-keeping systems are subject to the same requirements” according to the FDA’s recommendations.
  • Procedures should govern the collection and reporting of electronic data by data management systems. Automated reports can reduce the effort required to review data while still ensuring data integrity. Data collection and reporting systems must be configured to allow for the reconstruction of the data generated.
  • Procedures must exist to review data periodically in order to prevent and detect data integrity lapses and also to verify the accuracy, completeness and truthfulness of data and metadata. Routine data review should include a documented audit trail review. Frequency for review of audit trails should be determined based on data criticality and risk assessment.
  • Policies should be in place to train employees on the importance of data integrity principles and identification of data integrity lapses. Companies should create a working environment that encourages reporting errors or aberrations through the quality system.
  • Validation procedures and guidelines should exist in order to ensure that data generated from instrument systems are secure and maintained in accordance with company and industry guidelines. If the same system is used to perform both CGMP and non-CGMP functions, procedures should guide mitigation of any associated risks.
  • Company policies should address data management throughout the life cycle and data process flowcharts should be established to track the flow of data in order to comply with the principles of data integrity.
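To make the “true copy” backup requirement above concrete, here is a minimal Python sketch (the file names and helper functions are hypothetical, not part of any cited guidance) that verifies a backup matches the original record by comparing SHA-256 hashes:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_true_copy(original: Path, backup: Path) -> bool:
    """A backup qualifies as a true copy only if every byte matches."""
    return sha256_of(original) == sha256_of(backup)
```

The same check can be extended to the relevant metadata and audit trail files, so that the complete record, not just the raw data, is verified.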

Data Security Implementation

Data storage locations must be secured to prevent data from being saved to unauthorized file storage locations, including removable devices. One way of securing data is to apply local security groups or Active Directory groups to the data storage folder so that only users in certain security groups have permission to access it. It is good data integrity practice for system administrators to remove delete and modification permissions from folders containing original CGMP/GLP raw data. In certain software applications, removing delete and modification permissions through the operating system will prevent the application from saving data to the folder. In such cases, third-party software applications can provide solutions for restricting access to the folder without modifying Windows NTFS permissions.
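On Windows, these restrictions are typically applied through NTFS permissions or Active Directory groups. The following Python sketch illustrates the same idea with POSIX permission bits (the folder layout and function name are hypothetical): stripping the write bits means raw-data files can no longer be modified, and nothing inside the folder can be created, renamed or deleted.

```python
import stat
from pathlib import Path

def lock_raw_data_folder(folder: Path) -> None:
    """Remove write permission from a raw-data folder and its files so
    the contents can no longer be modified or deleted (POSIX semantics)."""
    # Make each file read-only.
    for f in folder.iterdir():
        f.chmod(stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
    # Removing the directory's write bit blocks create/rename/delete inside it.
    folder.chmod(stat.S_IRUSR | stat.S_IXUSR | stat.S_IRGRP |
                 stat.S_IXGRP | stat.S_IROTH | stat.S_IXOTH)
```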

Audit trails are a key element in detecting and preventing the manipulation of records. Sound data integrity practice involves retaining the audit trail and all relevant metadata that support GMP/GLP processes and data reporting. Ideally, software applications for GMP and GLP environments should be compliant with both 21 CFR Part 11 and Annex 11.
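As a simple illustration of the append-only character of an audit trail, here is a hedged Python sketch (the entry fields shown are a plausible minimum, not a regulatory specification) that records who did what, and when, without ever rewriting earlier entries:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def append_audit_entry(trail: Path, user: str, action: str, detail: str) -> None:
    """Append one timestamped entry to the audit trail file.
    The file is opened in append mode, so earlier entries are never rewritten."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "detail": detail,
    }
    with trail.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```

In a compliant system the trail file itself would also be protected from modification and deletion, as described in the previous section.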


The following examples cover common and unique situations for handling data storage and backup/archiving for data retention.

1. Standalone Systems

Standalone systems are generally instruments that either have built-in firmware or computer workstations that are not connected to a network. One advantage of standalone systems is that they are inherently protected from network hacking or intrusion. For such systems, data is stored in the instrument firmware or on the local hard drive of the computer workstation.

To support the backup of data from the local hard drive, numerous products on the market can create full system backup images. Storage devices such as USB drives or portable hard drives can be used to migrate data from the instrument firmware to a computer workstation, where it can then be backed up with the tools mentioned above.
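As a sketch of what such a backup step might look like (the directory names and function are hypothetical; validated commercial imaging tools would normally be used), the Python standard library alone can snapshot a local data directory into a labeled archive:

```python
import shutil
from pathlib import Path

def snapshot_local_data(data_dir: Path, backup_root: Path, label: str) -> Path:
    """Create a compressed snapshot of a standalone workstation's local
    data directory, ready to transfer to backup media."""
    backup_root.mkdir(parents=True, exist_ok=True)
    archive = shutil.make_archive(str(backup_root / label), "zip", root_dir=data_dir)
    return Path(archive)
```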

2. Network Servers

A network server can be used as a central data repository for networked instrument workstations. In this setup, data migrates from the application installed on a local workstation to secure directories on the server. It is critical to ensure that the network server is secured with appropriate access controls that are managed by company policies. A secondary backup of the network server should be in place, and a risk assessment should be performed to determine the appropriate backup frequency.
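A minimal sketch of the workstation-to-server migration step (the paths and function are hypothetical; in practice a validated tool would perform this) that copies new or changed files to the mounted server share without ever deleting server-side data:

```python
import shutil
from pathlib import Path

def mirror_to_server(local: Path, server: Path) -> list:
    """Copy any file that is missing or newer on the local workstation to
    the mounted network server directory; never delete on the server side."""
    copied = []
    server.mkdir(parents=True, exist_ok=True)
    for src in local.rglob("*"):
        if src.is_file():
            dst = server / src.relative_to(local)
            dst.parent.mkdir(parents=True, exist_ok=True)
            if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
                shutil.copy2(src, dst)  # copy2 preserves file timestamps
                copied.append(str(src.relative_to(local)))
    return copied
```

Because `copy2` preserves timestamps, a second run skips unchanged files, which keeps scheduled mirroring cheap.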

3. Networked Database Server

A networked database server is a database management system (DBMS) that controls access to data, defines data types, and allows searching of information and computation of derived information. Applicable software applications are designed to store proprietary and non-proprietary data using database management languages. A DBMS provides a framework for enforcing data privacy and security. Multiple systems can store data to one database, or to multiple instances of the database on the same server. This solution is ideal for storing vast amounts of data and helps end users share data quickly and efficiently. Administration of database servers is vital to the security of the data; only authorized administrators should manage these databases. It is essential to back up the primary database for the system. The database backup duplicates the database instance, providing a recovery path in case the primary database crashes, becomes corrupted or is lost. The backed-up instance can be restored on the database server in accordance with company procedures and guidelines.
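As an illustration, SQLite (bundled with Python) exposes an online backup API that copies a consistent snapshot of a live database. The function and file names below are hypothetical, and a production LIMS database would use its vendor's equivalent backup utility:

```python
import sqlite3
from pathlib import Path

def backup_database(live_db: Path, backup_db: Path) -> None:
    """Duplicate a live database instance using SQLite's online backup API,
    which copies a consistent snapshot even while the database is in use."""
    with sqlite3.connect(live_db) as src, sqlite3.connect(backup_db) as dst:
        src.backup(dst)
```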

4. Backup of a Networked PC or Local Database

For application software that can only store data to the local hard drive or a local database, migration solutions can be implemented. For example, Windows Task Scheduler can be used to automatically execute scripts (e.g., VB scripts) that move data from the local hard drive to a network server for backup. It is good practice to implement secondary backup solutions for all GMP/GLP data.
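A hedged sketch of such a scheduled migration script, written here in Python rather than VB (every path and name is hypothetical and would come from validated configuration in practice):

```python
import shutil
from datetime import datetime
from pathlib import Path

def migrate_new_files(local: Path, share: Path, log: Path) -> int:
    """Move completed data files from the local hard drive to the network
    share and log each transfer. Intended to be invoked on a schedule,
    e.g. by Windows Task Scheduler."""
    moved = 0
    share.mkdir(parents=True, exist_ok=True)
    with log.open("a", encoding="utf-8") as lf:
        for src in sorted(local.glob("*.dat")):
            shutil.move(str(src), share / src.name)
            lf.write(f"{datetime.now().isoformat()}\tmoved\t{src.name}\n")
            moved += 1
    return moved
```

The transfer log gives the periodic data review described earlier something concrete to reconcile against the server's contents.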



Armando Coronado is a graduate of the University of Florida with a bachelor’s degree in microbiology and cell science. He began his career in the pharmaceutical industry in 2005 when he joined Talecris Biotherapeutics, now known as Grifols. During his time at Talecris/Grifols, Coronado supported R&D and assay support groups for in-process product manufacturing of plasma-derived products. In 2008, he transitioned to Cirrus Pharmaceuticals, where he honed his skills in analytical instrumentation, method development and method validation in a GMP environment. Coronado brought his talents to the validation industry by joining the Sequence team in 2011. He is currently working in the laboratory compliance division at Sequence. Most recently, he managed the commissioning, qualification and implementation of a gene therapy laboratory. The laboratory validation used a risk-based approach to implement best-practice applications to satisfy data integrity and CGMP requirements. He is a subject matter expert in regulatory compliance in the pharmaceutical industry focusing on computer system validations, data integrity, data management, quality assessment and validation management. Coronado is well versed in the implementation of 21 CFR Part 11 and Annex 11 required systems. Sequence is a Referral Partner of MasterControl.

Vidhya Ranganathan is a senior consultant and team lead at Sequence, and has been with the company since 2011. She has a wealth of experience working with pharmaceutical and biotech clients, helping them implement new instruments and equipment in a compliant manner, with a focus on data integrity. Using a risk-based approach to quality control and compliance, she has successfully delivered solutions that weave quality and data integrity into business processes.

Whether working on implementation of new systems or on critical quality components (CAPA, audits, etc.), Ranganathan provides clients with support and guidance to maintain compliance, considering each client’s business need. She has played an active role in the generation and review of standard operating procedures and technical documentation in support of validation and remediation for data integrity. Her direct experience includes facilitating data process flowcharts, process risk assessments, change management, and instrument and equipment validation plans and protocols. Ranganathan holds a bachelor’s degree in biotechnology and is a member of the American Society for Quality (ASQ). Sequence is a Referral Partner of MasterControl.