For Blood & Biologics

The Value of the Quality Risk Management Approach to Validation As Prescribed by ISBT Validation Guidelines V2
by Robin Nozick, MT(ASCP), American Red Cross Headquarters, ISBT Technology Lead


According to the ISBT Guidelines for Validation of Automated Systems in Blood Establishments (the Guidelines),1 the blood bank is responsible for the regulatory compliance of the automated/computerized systems used at the facility and MUST have a Quality Management System (QMS). Blood banking organizations around the world support these guidelines, which prescribe full validation of any computerized system critical to product and quality (information management, storage, tools for operational decision-making, and control). The Quality Management approach to validation is a lifecycle approach, described within the facility's QMS, that uses Risk Management policies to define the validation strategy for critical systems before any validation begins. The approach a blood establishment takes should scale with the functionality of the system and the risk involved; e.g., validating a barcode reader is less complex than validating a blood management system.

Validation is part of the QMS

There are many benefits of good validation:

  • It improves compliance with regulations;
  • It ensures that end users are competent to use the technology;
  • It delivers business benefits, since the system will be in control;
  • Users, suppliers, and the executive suite are generally happier with a project when they understand the control that validation gives them;
  • It improves end users' efficiency, since they will have been trained and found competent;
  • It reduces the risk of failure, or ensures that risk is mitigated as much as possible.

What is Validation?

The Guidelines explain that validation is part of the QMS. The objective of validation is to establish documented evidence providing a high level of assurance that a system will consistently produce a product meeting its predetermined specifications and quality attributes. Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ) classify the different validation tasks that must be performed to ensure the quality of an automated system in use.

Each type of validation is defined in the Guidelines. IQ shows that the system has been installed correctly. Once IQ has begun, the system and infrastructure should be under formal change control. IQ considerations are:

  • Operating systems and application software
  • Barcode scanners
  • Servers
  • Workstations
  • Printers
  • Barcode printers
  • Databases
  • Environmental conditions (such as temperature, humidity)
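The IQ considerations above amount to comparing what is actually installed against an approved specification and recording the result as objective evidence. A minimal sketch of that comparison follows; the component names, version strings, and `iq_check` function are illustrative assumptions, not part of the Guidelines or any vendor's tooling.

```python
# Hypothetical sketch: recording IQ evidence by comparing installed
# components against an approved specification. All names and versions
# below are illustrative only.

EXPECTED = {
    "application_software": "BloodBankApp 4.2.1",
    "database": "PostgreSQL 13.4",
    "operating_system": "Windows Server 2019",
}

def iq_check(installed: dict) -> list:
    """Return (component, expected, found, PASS/FAIL) records for
    every component in the approved specification."""
    results = []
    for component, expected in EXPECTED.items():
        found = installed.get(component, "NOT INSTALLED")
        status = "PASS" if found == expected else "FAIL"
        results.append((component, expected, found, status))
    return results

# Example run: one component deviates from the approved specification.
installed = {
    "application_software": "BloodBankApp 4.2.1",
    "database": "PostgreSQL 13.4",
    "operating_system": "Windows Server 2016",  # deviation
}
for record in iq_check(installed):
    print(record)
```

In practice each PASS/FAIL record would be signed, dated, and filed as part of the IQ protocol, and any FAIL would be handled as a documented deviation before proceeding to OQ.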

OQ challenges the automated system and process operating parameters to ensure that they will result in a product that meets all defined user requirements under all anticipated conditions of manufacturing, including worst-case testing. The user requirements must already have been defined in order to perform OQ validation; they are usually built upon the regulatory agencies' regulations and standards for blood banking and the organization's own business rules. This type of validation therefore proves that the facility can do business and comply with the rules (regulations) that the Transfusion Service is responsible for, and mandated, to follow. In OQ, the following principles are followed:

  • The scripts written should verify worst-case scenarios (those that could kill or severely injure a patient)
  • Control functions are challenged to ensure that the product will meet all defined user requirements
  • OQ proves that the system will make decisions correctly and allow the facility to comply with regulations
  • The system design must be frozen
  • The riskiest parts of the system are stressed the most
  • The system is under strict change control for fixes and revalidation
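To make the worst-case principle concrete, the sketch below challenges one standard decision rule, red cell ABO compatibility, across every recipient/unit pairing, including the combinations the system must refuse. The compatibility table is standard transfusion practice, but the `system_allows_issue` interface is a hypothetical stand-in, not any vendor's actual API.

```python
# Hypothetical sketch of one worst-case OQ challenge. The rule under
# test (red cell ABO compatibility) is standard; the function names
# and system interface are illustrative only.

# Recipient ABO type -> compatible red cell donor ABO types.
RBC_COMPATIBLE = {
    "O":  {"O"},
    "A":  {"A", "O"},
    "B":  {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

def system_allows_issue(recipient_abo: str, unit_abo: str) -> bool:
    """Stand-in for querying the blood management system's decision:
    may this red cell unit be issued to this recipient?"""
    return unit_abo in RBC_COMPATIBLE[recipient_abo]

def run_oq_script() -> list:
    """Challenge every recipient/unit ABO pairing, including the
    worst cases the system must refuse, and record PASS/FAIL."""
    results = []
    for recipient, compatible in RBC_COMPATIBLE.items():
        for unit in ("O", "A", "B", "AB"):
            expected = unit in compatible
            actual = system_allows_issue(recipient, unit)
            results.append((recipient, unit,
                            "PASS" if actual == expected else "FAIL"))
    return results
```

In a real OQ script the expected outcomes would be traced back to the written user requirements, and every execution would be documented as objective evidence against the frozen system.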

PQ demonstrates that the computerized process will consistently produce acceptable product/output under normal operating conditions. The demonstration is achieved by using the appropriate methods and tools for process validation. PQ considerations are:

  • Use the actual computerized parameters and procedures that will be performed during live operation;
  • Reconfirm the acceptability of the computerized processes as established in OQ;
  • Make sure that processes that will be used repeatedly are stable when used in the field with trained operators.

Challenges to the process should reproduce the conditions that will occur during the normal operation of the system. Challenges should include all situations covered by the standard operating procedures and should be repeated enough times to ensure that all personnel and processes function as intended and that personnel are competent to perform their assigned work on the new system.

Data migration validation is necessary because existing data that is transferred, either manually or electronically, from a source system to another system (usually from an old system to a new system) is critical data. The data migration process should always be tightly managed, following a specific Data Migration Plan with finalized requirements. The content of the Data Migration Plan may vary depending on the complexity of the migration, but it must contain all of the processes, including all of the elements to be migrated, so the data conversion team can perform a successful migration. At a minimum, the plan should cover:

  • The scope
  • Roles and responsibilities
  • Requirements and deliverables
  • Risk assessment
  • Configuration management strategy
  • Software tools and strategies for testing and validation, so that the data can be viewed accurately in the new system
  • Data mapping and data transformation rules
  • Migration steps
  • Data verification strategy and acceptance criteria
  • A system transition plan and a rollback strategy if the migration is not successful
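One way to approach the data verification step is to compare record counts and per-record checksums between the source and target extracts. The sketch below illustrates that idea only; the `donor_id` key, field names, and function names are hypothetical, not drawn from the Guidelines.

```python
# Hypothetical sketch of migration data verification: fingerprint each
# record in the source and target extracts, then report matched,
# missing, and altered records. Field names are illustrative only.
import hashlib

def record_fingerprint(record: dict) -> str:
    """Checksum a record's fields in a fixed order so source and
    target copies can be compared independent of storage format."""
    canonical = "|".join(f"{key}={record[key]}" for key in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify_migration(source: list, target: list) -> dict:
    """Compare extracts keyed on a (hypothetical) donor_id field and
    return matched count plus lists of missing and altered keys."""
    src = {r["donor_id"]: record_fingerprint(r) for r in source}
    tgt = {r["donor_id"]: record_fingerprint(r) for r in target}
    missing = [k for k in src if k not in tgt]
    altered = [k for k in src if k in tgt and src[k] != tgt[k]]
    matched = len(src) - len(missing) - len(altered)
    return {"matched": matched, "missing": missing, "altered": altered}
```

Any missing or altered record would then be evaluated against the plan's acceptance criteria, and a failure to meet them would trigger the rollback strategy.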

Why is Testing NOT Validation?

The Guidelines, as well as the FDA regulations, are clear: testing of software is not by itself "validation"; it is a verification activity.2 It should be part of the overall validation of a process/system, but it cannot take the place of a formal validation activity. Testing consumes high levels of human and financial resources. The biggest difference between testing and validation lies in when each is performed in the project and whether the system is frozen at the time.

In validation, the system is frozen, and when issues are found they are documented until the validation has been completed. Then, using change control, and documenting every change as it is made, changes can be applied to the validated system. Those changes must be evaluated by a team of experts to determine their impact on the system, and then the cycle begins again.

Testing of the system must be completed before OQ validation begins. I have heard of facilities that spent hundreds of hours testing their blood bank module, yet database errors were still discovered during OQ validation. This forces the organization to fix the database and revalidate almost immediately. Had they tested more thoroughly, they would have found the database mistakes before the validation. Again, this is a time-consuming, labor-intensive, and expensive process.


A question often posed by blood establishments is, "How much validation do we need to perform?"

Since validation should always be part of the facility's QMS, the question actually relates to the amount of quality needed. It is difficult to relate the quality an organization achieves by adopting Total Quality Management (TQM) principles (customer satisfaction, employee involvement, and continuous improvement) to cost benefits. The objective of validation is to produce documented evidence that provides a high level of assurance that all parts related to the use of an automated system will work correctly and consistently. The cost benefit is intangible, and we often express it as its opposite: "What is the cost of NOT having a high level of assurance that the system will work as expected?"

The answer to the question, therefore, is that the blood establishment needs to ensure that enough validation of the correct type is done to achieve system acceptance in a way that satisfies the facility's own Quality Policy and Standard Operating Procedures.

What makes validation projects successful?

  • Senior management commitment
  • Tight project management
  • Sufficient competent resources to complete all the processes
  • A team approach, i.e., users, technical reps, validation, QA, and IT professionals
  • Risk management
  • Cost efficiency

Validation is a very complicated process, and often the staff at the facility do not have the skills for this type of work. To complete the necessary testing and validation, the site may try a blended approach, using elements of each type of validation (IQ, OQ, PQ) but not all of the recommended processes. The hope is to reduce workload; however, in my experience this causes more work and more confusing, disorganized documentation, which is why I don't recommend such an approach. A process that demonstrates complete control is a must. When the regulators arrive, the facility's greatest asset is organized, easy-to-follow, very thorough documentation, providing confidence that the facility is IN CONTROL.


1. Sampson, Janet, et al. "ISBT Guidelines for Validation of Automated Systems in Blood Establishments." Vox Sanguinis, Vol. 98, Supplement 1, February 2010.

2. International Conference on Harmonization (ICH). Guidance for Industry: Q9 Quality Risk Management, June 1, 2006.

Robin Nozick, currently ISBT Technology Lead with the American Red Cross Headquarters, was the founder of R.F. Nozick and Associates, the leader in blood bank solutions and I.S. compliance. Robin has spent the last 25 years serving the clinical laboratory, and especially the Transfusion Services department: teaching, implementing, building, validating and testing systems, and training end users on all of the customizable blood bank systems available today. Robin sat on the AABB Information Systems Committee for six years, during which time she spoke several times about validation and the Risk Management section of the ISBT Validation Guidelines. In 2000, Robin joined the Validation Task Force of the International Society for Blood Transfusion Working Party on Information Technology and helped write the first version of the "ISBT Guidelines for Validation and Maintaining the Validation State of Automated Systems in Blood Banking". She is presently the vice-chair of the Validation Task Force, which has recently approved V2 of the guidelines for publication, and has been given responsibility for an Education Project that will create an e-learning tool to train Transfusion Service personnel charged with validating their systems to perform a quality validation. This work is geared towards less developed countries that have few resources for information technology. The Education Project is ongoing, and anyone interested in working with the Task Force may contact Robin at

Other publications:

Nozick, RF, "CIO Perspective, Risk Assessment", AABB News, July/August 2002.

Nozick, RF, "Book Review: Information Technology in Transfusion Medicine", AABB News, September/October 2003, pg 42

Nozick, RF, "Lessons Learned: Software Vendors' Different Approaches to User Validation", Citings Information Exchange, October 1, 2003, Volume XIII, no. 4.

Nozick, RF, "ISBT Guidelines for Validation and Maintaining the Validation State of Automated Systems in Blood Banking", AABB News, January/February 2004.
