
Top 4 Takeaways From Validation Week


Last month, I had the opportunity to attend the 25th Annual Validation Week in Las Vegas, Nevada. While the conference covered multiple types of validation, computer system validation (CSV) was the subject of most of the sessions I attended. The conference did not disappoint, surfacing four major trends, most of which are already affecting validation.

#1: Regulations Are Trending to ‘Least Burdensome’

Validation Week featured presentations from current and former employees of the U.S. Food and Drug Administration (FDA). While it was never stated explicitly, the impression from their sessions was that the FDA wants to work with and help life sciences companies. They’re not trying to “get” you. In fact, some of the most recent guidance would make life considerably easier for life sciences companies, at least for those willing to leave behind a “that’s how we’ve always done it” mentality. The Center for Devices and Radiological Health (CDRH) lists a “least burdensome” approach among its 2018-2020 strategic priorities. Some of the burdens it hopes to remove include “cumbersome processes, vague policies, and out of date information technology systems.”

Another example of removing regulatory burdens was cited in “Explore the Convergence of Regulatory Trends Impacting Quality Management,” presented by Kimberly A. Trautman. She spoke about the convergence of 21 CFR Part 820 and ISO 13485 as an extension of the success of the Medical Device Single Audit Program (MDSAP). In both cases, the FDA is trying to simplify international compliance for medical device companies. Harmonizing the regulation with the standard will reduce the regulatory burden for companies trying to meet requirements of both.

#2: Computer Software Assurance (CSA)

Another way that the FDA is working to reduce the regulatory burden is through CSA. We’re still waiting on the agency to release the promised guidance, but they have hosted a webinar on the topic.

That webinar and this conference made it clear that companies don’t have to wait for the guidance to start adopting a CSA approach. In brief, CSA is a paradigm shift from document-focused validation to assurance driven by critical thinking. The FDA doesn’t want companies creating documentation solely to satisfy the agency; the documentation should serve a purpose. And not all testing demands the same level of documentation. Deciding what needs the most attention depends on risk.

In a session entitled “Arrival of a New Era in Validation,” Praveen Kalluri spoke about calculating an overall risk score to decide how much testing and documentation a feature needs. When looking at the impact a feature will have, the FDA is most concerned with whether it will affect quality and/or patient safety. However, Kalluri noted that risk is not just a matter of impact; probability and detectability also influence the overall risk. So if an event would have a high impact but is improbable and would be easily detected, the risk could still be considered low.
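To make that idea concrete, here is a rough, FMEA-style sketch in Python of how impact, probability and detectability might combine into a single score that drives the level of testing and documentation. The 1-5 scales, the thresholds and the tier names are my own illustrative assumptions, not figures from Kalluri’s session or any FDA guidance.

```python
# Illustrative risk-scoring sketch. Scales, thresholds and tier names are
# assumptions for the sake of the example, not from the session or the FDA.

def risk_score(impact: int, probability: int, detectability: int) -> int:
    """Each factor is rated 1 (low) to 5 (high).

    Detectability is rated so that 5 means a failure would be hard to detect,
    mirroring the point that an easily detected event lowers overall risk.
    """
    return impact * probability * detectability


def testing_rigor(score: int) -> str:
    """Map the overall score to a testing/documentation level (thresholds assumed)."""
    if score >= 60:
        return "scripted testing with full documentation"
    if score >= 20:
        return "unscripted but recorded testing"
    return "ad hoc testing with minimal documentation"


# Example from the prose: high impact, improbable, easy to detect -> low risk.
score = risk_score(impact=5, probability=1, detectability=1)
print(score, "->", testing_rigor(score))  # 5 -> ad hoc testing with minimal documentation
```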

If CSA sounds familiar, it should; we’ve covered it on GxP Lifeline before. It was discussed so often at the conference that in the above-mentioned session, Kalluri commented, “we’ve beaten this topic to death.”

#3: Automation

Lessening the burden of your validation process through CSA also means lessening the burden of paper. Unfortunately, as a show of hands at the conference demonstrated, plenty of companies are still hanging on to paper. Multiple sessions addressed the necessity of automation and digitization. Automation doesn’t eliminate human error, but it considerably lessens it. And whether or not you work in a regulated industry, replacing manual data entry with automation provides greater data integrity.

One of the biggest benefits of automation is that it frees employees to do work that requires thinking. Manually entering data or capturing screenshots takes a lot of time in the validation process and is something that can and should be done automatically, as the sketch below illustrates. Companies don’t have to overhaul their processes all at once. In fact, even small automation efforts can increase efficiency and remove barriers to further digitization.
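As one small example of what “automating the screenshots” could look like, here is a minimal sketch using Selenium that saves timestamped screenshot evidence during an automated test run. Selenium, the folder name, the function name and the placeholder URL are my own choices for illustration, not tools or steps named at the conference.

```python
# Minimal sketch: capture timestamped screenshot evidence during an automated
# browser test, replacing manual screenshot capture. Names and the URL are
# illustrative assumptions; requires Selenium and a Chrome driver.

from datetime import datetime, timezone
from pathlib import Path

from selenium import webdriver


def capture_evidence(driver: webdriver.Chrome, step_name: str, out_dir: str = "evidence") -> Path:
    """Save a screenshot named after the test step and a UTC timestamp."""
    Path(out_dir).mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = Path(out_dir) / f"{step_name}_{stamp}.png"
    driver.save_screenshot(str(path))
    return path


driver = webdriver.Chrome()
driver.get("https://example.com/login")  # placeholder URL for the system under test
capture_evidence(driver, "login_page_loaded")
driver.quit()
```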

#4: Time to Change

Admittedly, the whole point of going to conferences is to learn about trends and insights that you can take back to your organization. However, making changes in highly regulated industries involves more roadblocks than in other industries, and those hurdles often mean that nothing changes at all. This is a problem because, in some cases, we still do things the way we did them 30 years ago. In his joint presentation with Senthil Gurumoorthi, “Pragmatic CSV — Risk-Based Testing and Documentation,” Ken Shitamoto pointed out that we validate the same way we were validating in the ‘90s.

Even at this conference, it was clear that some attendees were uncomfortable with the idea of changing their validation practices to decrease their documentation. One attendee pointed out that because he knows his company’s current approach works, he’ll have trouble convincing it to do something else. However, practices that are time-consuming and keep your computing firmly rooted in on-premises solutions aren’t really “working” at all. The future is digital, automated and in the cloud, and companies can’t get there if they don’t change how they validate.

Conclusion

Seeing the next big things in validation was exciting, but it will be even more exciting to see these trends implemented on a large scale. If companies work with the FDA, which is what the agency wants, they can reduce their documentation burden and the amount of time needed for validation. Once they do, they’ll be better positioned to adopt new technology and automate their processes. The approaching changes to the regulations surrounding validation are meant to make things easier, but they’ll only do so if companies update their processes and software. Everything companies need to get started already exists, so now is the time to get on board.



Sarah Beale is a content marketing specialist at MasterControl in Salt Lake City, where she writes white papers and web pages and is a frequent contributor to the company’s blog, GxP Lifeline. Beale has been writing about the life sciences and health care for over five years. Prior to joining MasterControl, she worked for a nutraceutical company in Salt Lake City, and before that, for a third-party health care administrator in Chicago. She has a bachelor’s degree in English from Brigham Young University and a master’s degree in business administration from DeVry University.

