GxP Lifeline

Beyond Traditional Metrics: How Data Science Can Transform Quality Management Systems



Proactive organizations are the ones that win in today’s business environment, and quality control plays a huge part in that. Organizations with mandates to maintain high quality standards know this well and, as a result, use electronic quality management systems (eQMS) as their main solution for training employees, managing processes, mitigating risks, and maintaining compliance. Quality systems collect vast quantities of data. The trouble is that, traditionally, this data has only been used reactively, or retrospectively, to identify and understand problems after they have already occurred. With the emergence of inductive analytical frameworks and tools in data science, however, these previously untapped historical data sets can now be put to proactive use.

Given advancements in artificial intelligence (AI), machine learning (ML), and natural language processing (NLP), the utility of an eQMS can now be expanded to predicting outcomes, identifying risks, and driving continuous improvement. As integrations of digital quality management solutions with lab and manufacturing systems become more common, holistic data from enterprise-wide operations will continue to grow. Leaving the resulting data untouched means losing out on a critical business asset.

eQMS and Data Science

Understanding and defining an appropriate use case for predictive analytics in quality management is an essential first step in harnessing the power of AI, ML, and NLP, as the use case will dictate the data science technique and tool(s) that will provide desired business insights. Below are several example use cases where AI and ML can be effectively applied in eQMS to drive business value.

  1. Data Pulldown and Tracking via Robotic Process Automation (RPA)

     When systems cannot be integrated directly, RPA can be used to pull data from disparate systems and/or physical documents into the eQMS. This eliminates the need to populate that data manually and increases the likelihood that the eQMS can serve as the source of truth for all relevant data. An example would be scanning physical documents that contain quality information, which the RPA bot then identifies, copies, and pastes into the relevant fields within the eQMS.

     RPA brings greater efficiency to simple, repeatable business processes by automating tasks previously performed manually. The software robots (bots) can mimic actions like entering data, generating reports, and processing transactions. Used in conjunction with a digital quality management solution, RPA can increase productivity and efficiency while reducing costs and errors. A rough sketch of this kind of document pulldown appears after this list.

  2. Root Cause Analysis (RCA) and Corrective Action/Preventive Action (CAPA) Creation

     An AI-driven eQMS can facilitate automated RCA by analyzing data for patterns and correlations that help identify the root causes of quality issues. It can examine instances of nonconformances, customer complaints, specification deviations, and recalls or returns, then use this information to generate CAPAs after classifying and prioritizing risks. For instance, a product may begin to fail quality control during the summer; using historical data, RCA can determine that summertime temperatures rise and that this rise is what causes the product to fail quality control. AI-driven eQMS software can then track temperatures, identify when readings begin trending upward in a particular storage room and/or refrigerator, and flag that condition for preventive action (see the trend-flagging sketch after this list). AI helps ensure past mistakes are not repeated, maintain the quality of products and services, and satisfy customer needs.

  3. Monitoring of Statistical Process Controls (SPC)

     SPC plays an important role in digital quality management by providing a statistical approach to monitoring and controlling processes to ensure standards are met. Typically, SPC requires manual interpretation and analysis, which is time-consuming and costly. When combined with machine learning, however, SPC can be monitored continuously by algorithms tuned to intervene proactively before quality events occur.

     For instance, during manufacturing of a pharmaceutical tablet, part of the batch release process may be to test the purity of the binder/inert ingredients. ML- or AI-driven SPC would allow purity to be measured and evaluated without human intervention as long as the process performs within tolerance. If purity issues arise, or even if results are simply trending in a concerning direction, AI-driven SPC would provide real-time insight into production performance, involving people only when they are needed (see the control-chart sketch after this list). The resulting efficiencies directly improve quality, reduce process variability, and help ensure products meet quality specifications.
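
To make the RPA pulldown use case more concrete, below is a minimal Python sketch, assuming a scanned quality record and a hypothetical eQMS REST endpoint. The endpoint URL, field names, and regular expressions are illustrative placeholders, not a specific vendor's API.

```python
# Minimal sketch: OCR a scanned quality record and push the extracted
# fields into an eQMS over REST. The endpoint, token, field names, and
# regex patterns are hypothetical placeholders.
import re

import pytesseract          # wrapper around the Tesseract OCR engine
import requests
from PIL import Image

EQMS_URL = "https://eqms.example.com/api/quality-records"  # hypothetical endpoint


def extract_quality_fields(scan_path: str) -> dict:
    """OCR a scanned document and pull out a few illustrative quality fields."""
    text = pytesseract.image_to_string(Image.open(scan_path))
    batch = re.search(r"Batch\s*(?:No\.?|Number)[:\s]+(\S+)", text, re.I)
    result = re.search(r"Result[:\s]+(PASS|FAIL)", text, re.I)
    return {
        "batch_number": batch.group(1) if batch else None,
        "qc_result": result.group(1).upper() if result else None,
        "raw_text": text,  # keep the full OCR text for traceability
    }


def push_to_eqms(record: dict, api_token: str) -> None:
    """Post the extracted fields so the eQMS remains the source of truth."""
    resp = requests.post(
        EQMS_URL,
        json=record,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    fields = extract_quality_fields("scanned_coa_page1.png")
    push_to_eqms(fields, api_token="REPLACE_ME")
```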
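
The storage-temperature example in the RCA use case reduces to a simple trend check. The sketch below is one minimal way to flag upward-trending rooms from historical readings; the CSV layout, column names, window size, and thresholds are assumptions for illustration only.

```python
# Minimal sketch: flag storage rooms whose recent temperatures trend upward,
# so a preventive action can be raised before product fails quality control.
# Column names, window size, and thresholds are illustrative assumptions.
import numpy as np
import pandas as pd

WINDOW = 14          # number of recent readings used per trend check
SLOPE_LIMIT = 0.15   # deg C per reading considered a concerning upward drift
TEMP_LIMIT = 25.0    # assumed storage specification ceiling in deg C


def flag_upward_trends(readings: pd.DataFrame) -> pd.DataFrame:
    """Return one row per storage room whose recent temperatures trend upward."""
    flags = []
    for room, grp in readings.sort_values("timestamp").groupby("room"):
        recent = grp.tail(WINDOW)
        if len(recent) < WINDOW:
            continue  # not enough data to judge a trend
        x = np.arange(len(recent))
        slope = np.polyfit(x, recent["temp_c"].to_numpy(), 1)[0]  # linear fit
        projected = recent["temp_c"].iloc[-1] + slope * WINDOW    # naive look-ahead
        if slope > SLOPE_LIMIT or projected > TEMP_LIMIT:
            flags.append({
                "room": room,
                "slope_c_per_reading": round(slope, 2),
                "projected_temp_c": round(projected, 1),
            })
    return pd.DataFrame(flags)


if __name__ == "__main__":
    df = pd.read_csv("storage_temps.csv", parse_dates=["timestamp"])
    # Rooms listed here would be flagged in the eQMS for preventive action.
    print(flag_upward_trends(df))
```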
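
For the SPC use case, the core mechanic is a control chart evaluated automatically. The sketch below uses an individuals chart with 3-sigma limits estimated from an in-control baseline, plus one simple run rule for a downward purity drift; the limits, rule, and sample values are illustrative and not a validated SPC implementation.

```python
# Minimal sketch: individuals control chart with 3-sigma limits from an
# in-control baseline, plus one simple run rule for a downward purity drift.
# Limits, rule, and sample values are illustrative, not a validated SPC setup.
import numpy as np


def control_limits(baseline: np.ndarray) -> tuple[float, float, float]:
    """Center line and 3-sigma limits estimated from in-control baseline data."""
    center = baseline.mean()
    sigma = baseline.std(ddof=1)
    return center, center - 3 * sigma, center + 3 * sigma


def evaluate_point(history: list[float], value: float,
                   lcl: float, ucl: float, run_length: int = 6) -> str:
    """Classify the newest purity result without human intervention."""
    if value < lcl or value > ucl:
        return "OUT_OF_CONTROL"  # immediate quality event
    recent = history[-(run_length - 1):] + [value]
    if len(recent) == run_length and all(
            later < earlier for earlier, later in zip(recent, recent[1:])):
        return "TRENDING_DOWN"   # purity drifting toward the lower limit
    return "IN_CONTROL"


if __name__ == "__main__":
    baseline = np.array([99.2, 99.1, 99.3, 99.0, 99.2, 99.1, 99.3, 99.2])
    center, lcl, ucl = control_limits(baseline)
    history = [99.20, 99.15, 99.10, 99.05, 99.00]
    # Still inside the control limits, but six successive decreases trip the run rule.
    print(evaluate_point(history, 98.95, lcl, ucl))  # -> TRENDING_DOWN
```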

Implementation of Data Science Tools in QMS

The problem solved by data science must be realistic and directly applicable to your organization’s business needs; otherwise, inappropriate tools may be chosen that do not provide accurate, relevant, or usable results, analytics, and insights. It is also important to understand the limits of current data science trends such as AI: models are only as good as the data they are trained on. Understanding the statistical biases in the data behind predictive analytics for quality management is vital to developing a useful predictive AI model.
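
As a rough illustration of that data-quality point, the sketch below profiles a historical eQMS export for class imbalance and underrepresented groups before any model is trained. The column names, thresholds, and file name are assumptions for illustration only.

```python
# Minimal sketch: profile a historical eQMS export before training a model,
# checking for class imbalance and underrepresented sites/product lines.
# Column names, thresholds, and file name are illustrative assumptions.
import pandas as pd

IMBALANCE_LIMIT = 0.05   # warn if under 5% of records carry the positive label
MIN_GROUP_ROWS = 200     # warn about groups too small to learn from


def profile_training_data(df: pd.DataFrame) -> list[str]:
    """Return human-readable warnings about likely statistical bias."""
    warnings = []
    positive_rate = df["nonconformance"].mean()  # assumes a 0/1 label column
    if positive_rate < IMBALANCE_LIMIT:
        warnings.append(f"Severe class imbalance: {positive_rate:.1%} positive labels")
    for col in ("site", "product_line"):
        counts = df[col].value_counts()
        under = counts[counts < MIN_GROUP_ROWS]
        if not under.empty:
            warnings.append(f"Underrepresented {col} values: {list(under.index)}")
    return warnings


if __name__ == "__main__":
    history = pd.read_csv("eqms_history.csv")
    for message in profile_training_data(history):
        print("WARNING:", message)
```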

Pitfalls like these must be avoided up front, before beginning the implementation of a new data science application alongside an existing eQMS. Here are the key steps to ensure a successful implementation:

  1. Identify the problem area.
  2. Develop potential use cases for the problem area.
  3. Review available data with the data science team to develop a set of potentially trainable and parameterizable use cases.
  4. Determine appropriate data science tool(s) to be used.
  5. Develop a minimum viable product (MVP) or prototype and define the associated success criteria (see the evaluation sketch after this list).
  6. Launch and evaluate the MVP (this should take place 8-12 weeks after the approved use case, data sets, and tools are determined).
  7. Evaluate and adjust MVP, if necessary.
  8. Move to production.
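
To show how step 5's success criteria could be applied during the evaluation in steps 6 and 7, here is a minimal sketch that compares measured MVP results against predefined targets. The metric names and thresholds are hypothetical examples, not prescribed criteria.

```python
# Minimal sketch: check measured MVP results against the success criteria
# agreed before launch. Metric names and targets are hypothetical examples.

SUCCESS_CRITERIA = {
    "recall_on_quality_events": 0.80,          # catch at least 80% of true events
    "false_alarm_rate_max": 0.10,              # no more than 10% false flags
    "reviewer_hours_saved_per_week": 5.0,      # minimum efficiency gain
}


def evaluate_mvp(measured: dict) -> bool:
    """Return True only if every predefined criterion is met."""
    passed = True
    for name, target in SUCCESS_CRITERIA.items():
        value = measured[name]
        # Metrics ending in "_max" are upper bounds; everything else is a floor.
        ok = value <= target if name.endswith("_max") else value >= target
        print(f"{name}: measured {value} vs target {target} -> {'PASS' if ok else 'FAIL'}")
        passed = passed and ok
    return passed


if __name__ == "__main__":
    go_to_production = evaluate_mvp({
        "recall_on_quality_events": 0.84,
        "false_alarm_rate_max": 0.07,
        "reviewer_hours_saved_per_week": 6.5,
    })
    print("Move to production" if go_to_production else "Adjust the MVP and re-evaluate")
```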

Conclusion

The future of eQMS lies in taking advantage of emerging technologies to generate value-added solutions for strategic business growth. In the age of analytics, organizations need to adapt to support digital transformation. That starts with putting the historical and current data and documentation in the eQMS to work for a pragmatically selected and clearly articulated business use case.

West Monroe - Pankit Bhalodia

When it comes to driving growth and innovation in a changing marketplace, pharmaceutical and medical technology executives count on Pankit. His passion for teaming with others and using technology to streamline operations has created tangible impact for multiple industry leaders. For a pharmaceutical firm’s acquisition, he led post-merger integration of commercial operations and supply chain activities, helping the company improve EBITDA by 25%. And for a large consumer electronics company entering the medical technology market, he developed the new strategy and operating model and refined the product and portfolio strategy.

He advises pharmaceutical and medical technology clients on a range of challenges, including product and portfolio strategy, product development, operations, quality management systems, merger and acquisition diligence, and integration. He also has a strong track record of guiding large transformational initiatives to value capture.


West Monroe - Adam Welsh

Scientific and domain technologies are leading transformation across the health and life sciences sector. Adam is at the center of this evolution. He has led multiple enterprise-wide solution initiatives, including the first-ever laboratory information management system (LIMS) for the U.S. Food and Drug Administration (FDA) — a five-year effort.

Adam has delivered value for large pharmaceutical companies, healthcare providers, health plans, and government agencies, including the Centers for Disease Control and Prevention, the Department of Defense, the FDA, the Administration for Children and Families, the Agency for Healthcare Research and Quality, the National Cancer Institute, and the National Institute of Allergy and Infectious Diseases. He has spent most of his career working with chief information/informatics officers, regulatory/compliance leaders, and chief medical officers in areas spanning laboratory systems to enterprise technology strategy.

