Both U.S. and European mass serialization regulations require an integrated flow of information across the end-to-end supply chain, with related technology components for labelling, data capture, control and reporting. Irrespective of the regulations and the business drivers for a safe and secure supply chain for the life sciences, there is one thing that none can dispute – this is not an easy task!
In common with other industries (electronics and automotive, in particular), the pharmaceutical industry now includes a multitude of players across an increasingly outsourced network of transformation and distribution partners. As such, when defining strategies and approaches for something as important as an information system to share data at the product and transaction level, it is critical to take a holistic view. Although current mandates and programs are initiated at the finished product manufacturing point, it is prudent to consider future needs for a more granular perspective, including visibility into product ingredients and components and, potentially, final consumption and/or return and destruction. This is an important point, as the FDA has emphasized the value of serialization in product recalls.
In my previous writings on the subject, I suggested that a cross-functional team should address these challenges and issues, developing detailed process maps and overviews to identify check and choke points. These should be reviewed for potential hazards, with proactive remediation strategies developed for each. This is still the best way to review each of the different supply chain models and evaluate the risks and opportunities across the flow of materials, the flow of information, and the related change of ownership across the cash-to-cash cycle.
The requirements for compliance with the DSCSA (and the related EU Falsified Medicines Directive) are extensive. The best approach is to establish a dedicated project team, including cross-functional representation for consultation, ensuring that all key stakeholders are part of the requirements definition and review process. Collaborating with all participants across the product lifecycle will enable a more holistic view of data sources, as well as users of data, and could facilitate process transformation. Entering a mass serialization program is costly, both in terms of cash and resources. Careful consideration of both the direct and indirect benefits of such a program will support a richer return on investment (ROI) proposition.
Entities – internal as well as external – that could be included in identifying issues, opportunities and potential constraints include (but are not limited to):
External supply chain participants
Any initiative for process improvement and data sharing requires a well-defined foundation – the Current State (As Is) baseline.
Time is well spent identifying each of the participants, the activities performed, transactional and event-driven data, and data sources (manual or automated). Elapsed time between events should be noted (and verified). Process mapping should cover both specific origin/destination pairs, or trade lanes, and individual products. Many life sciences products have special handling requirements, for example, temperature, altitude and vibration control, and this is an opportunity to include that level of detail in a product and trade-lane outline. As in any program, the As Is baseline should be evaluated to identify non-value-added activities, creating a streamlined flow that can be incorporated into the Future State (To Be) blueprint to facilitate product- and item-level track, trace and authentication, as well as exception-based alerts. Look for areas of common ground: the information needs of key partners in the supply chain process, for example, transportation providers, who are both sources and users of data and information.
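To make the As Is baseline concrete, it can help to record each trade lane as a flat list of activities, with the participant, data source and elapsed time for each step. The following is a minimal sketch of that idea; the lane, participants, durations and the `Activity` structure are all illustrative assumptions, not a prescribed model.

```python
# A minimal sketch of an As Is trade-lane baseline: one record per
# activity, capturing participant, data source and elapsed time.
# All names and durations below are illustrative examples only.
from dataclasses import dataclass

@dataclass
class Activity:
    step: str
    participant: str
    data_source: str          # "manual" or "automated"
    elapsed_hours: float
    value_added: bool = True

# Illustrative origin/destination pair: plant -> wholesaler
lane = [
    Activity("pack and label", "manufacturer", "automated", 4),
    Activity("stage at dock", "manufacturer", "manual", 18, value_added=False),
    Activity("line-haul", "carrier", "automated", 36),
    Activity("receive and verify", "wholesaler", "automated", 6),
]

total = sum(a.elapsed_hours for a in lane)
waste = [a.step for a in lane if not a.value_added]
print(f"cycle time: {total}h; non-value-added steps: {waste}")
```

Even a simple tabulation like this makes the non-value-added steps, and the manual data capture points, easy to flag for the To Be blueprint.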
Barcodes, radio-frequency identification (RFID) and other auto-identification technologies have been in existence for decades. Renewed interest in RFID, driven by mandates from discount retail chain Walmart and the U.S. Department of Defense in the early 2000s, has spawned many enhancements, including nanotechnologies that can capture location, state and temperature at the item, carton and transportation-device level. This is good news when developing a strategy for which data carrier(s) should be incorporated. It is equally important to consider data sources (for both static and dynamic data capture), timing, and the data capture tools used in a distribution and storage environment (for example, RFID and bar code readers). And then, of course, there is the requirement for a shared data repository as well as agreed global standards (GS1 are spearheading this initiative). This is a critical element, especially for a global solution, and provides a reference model for authentication (through the Standard Numerical Identifier [SNI] and trading partner verification) as well as a database to capture and store events and related activities.
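As a concrete illustration of what these data carriers hold, DSCSA product identifiers are typically encoded in a GS1 DataMatrix as an element string of Application Identifiers (AIs): (01) GTIN, (21) serial number, (17) expiry date and (10) lot number. The sketch below parses such a string under simplifying assumptions: it handles only these four AIs (a real parser covers the full GS1 AI table), and the GTIN, serial and lot values are made-up samples.

```python
# Minimal sketch: parsing the GS1 element string from a DSCSA 2D
# DataMatrix. Per the GS1 General Specifications, AI (01) GTIN is 14
# digits and AI (17) expiry is 6 digits (YYMMDD); AIs (21) serial and
# (10) lot are variable length, terminated by the FNC1 group separator
# (ASCII 29) or the end of the string. Sample values are illustrative.
GS = "\x1d"                      # group separator as sent by most readers

FIXED = {"01": 14, "17": 6}      # fixed-length AIs: value length
VARIABLE = {"21", "10"}          # variable-length AIs

def parse_gs1(data: str) -> dict:
    """Split a GS1 element string into {AI: value} pairs."""
    fields, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            n = FIXED[ai]
            fields[ai] = data[i:i + n]
            i += n
        elif ai in VARIABLE:
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            fields[ai] = data[i:end]
            i = end + 1          # skip the separator
        else:
            raise ValueError(f"unsupported AI: {ai}")
    return fields

label = "0100312345678906" + "21SER12345" + GS + "17261031" + "10LOT42"
print(parse_gs1(label))
# {'01': '00312345678906', '21': 'SER12345', '17': '261031', '10': 'LOT42'}
```

The variable-length fields are why scanner configuration matters in practice: if the reader strips or remaps the FNC1 separator, downstream systems cannot tell where the serial number ends and the next field begins.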
In addition to internal applications that are in place across the chain of custody – and are primary sources of transaction level data – there is a requirement for additional software components to facilitate authentication and traceability. The term “orchestration platform” is commonly used to describe applications that integrate, store and distribute data across an extended network of inter-connected entities. There is a growing list of providers, some of whom have evolved from the initial list of players when the California e-pedigree initiative was a hot topic in 2008! Ten years later the ubiquity of wireless networks and the omnipresent “Internet of Things” (IoT) has enhanced the capabilities available to create an audit trail of digital DNA!
Finally, it is critical to consider the point within the supply process at which the “digital flow of data” should be initiated, and the stage in the manufacturer’s process at which the Global Trade Item Number (GTIN, subject to the global standards of GS1) and the data carrier should be attached. There are several options.
Having defined the participants, the process, data capture and hand off points, and technology enablers, the next step is to drill down to the physical and data flow at the conceptual level. Using simulation software – or a simple representation model (GS1 have developed a reference model) – master data should be categorized into product, customer and other partnership relationships. The association between this master data and the transactional data generated through the product lifecycle (for example, packing list or invoice) provides the baseline for product authentication and trading partner verification. Incremental data, captured through aggregation (association between shipment, carton, transportation unit) and inference (link between location of associated unit and item) enables a digital view based on transactions. Additional data points can be captured based on discrete events independent of transactional data generated by existing applications. This should facilitate an ongoing and real-time view of what is happening, or alternatively, using event management tools to identify what has failed to take place!
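The aggregation and inference step described above can be sketched in code. The model below is a deliberately simplified, in-memory assumption (a production system would use an EPCIS-style event repository): aggregation records parent/child links (items into a carton, cartons onto a pallet), and inference resolves an item's location from the last scan of whatever container holds it. The identifiers and location names are hypothetical.

```python
# A minimal sketch of aggregation and inference over a packaging
# hierarchy. Identifiers and locations are illustrative placeholders.
class PackagingGraph:
    def __init__(self):
        self.parent = {}       # child id -> parent id (aggregation)
        self.last_seen = {}    # unit id -> location of most recent scan

    def aggregate(self, parent_id: str, child_ids: list) -> None:
        """Record that each child is packed inside the parent unit."""
        for child in child_ids:
            self.parent[child] = parent_id

    def observe(self, unit_id: str, location: str) -> None:
        """Record a scan of a unit (item, carton, or pallet)."""
        self.last_seen[unit_id] = location

    def infer_location(self, item_id: str):
        """Walk up the hierarchy, taking the outermost scanned unit's location."""
        unit, location = item_id, self.last_seen.get(item_id)
        while unit in self.parent:
            unit = self.parent[unit]
            location = self.last_seen.get(unit, location)
        return location

graph = PackagingGraph()
graph.aggregate("carton-1", ["item-A", "item-B"])   # pack items into carton
graph.aggregate("pallet-9", ["carton-1"])           # load carton onto pallet
graph.observe("pallet-9", "DC-Memphis")             # scan the pallet only
print(graph.infer_location("item-A"))               # inferred: "DC-Memphis"
```

This is the economic point of aggregation: one pallet-level scan updates the inferred position of every item inside, so item-level visibility does not require item-level reads at every handoff.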
This detailed operational and process review should identify additional areas of opportunity. For example, determining how to capture temperature variations, or the elapsed time between key events (planned versus actual), should assist in refining the data and implementation models.
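Those two examples, elapsed time and temperature, lend themselves to exception-based alerting over the captured event stream. The sketch below assumes each event carries a timestamp and an optional temperature reading; the 48-hour gap threshold and the 2–8°C band are illustrative assumptions (a common cold-chain range), not regulatory limits.

```python
# A sketch of exception-based alerts over captured supply chain events.
# Thresholds and event data below are illustrative assumptions only.
from datetime import datetime, timedelta

def find_exceptions(events, max_gap=timedelta(hours=48),
                    temp_range=(2.0, 8.0)):
    """Flag elapsed-time gaps and temperature excursions in an event list."""
    alerts = []
    events = sorted(events, key=lambda e: e["time"])
    # Elapsed time between consecutive events (planned versus actual)
    for prev, curr in zip(events, events[1:]):
        gap = curr["time"] - prev["time"]
        if gap > max_gap:
            alerts.append(f"gap of {gap} between {prev['step']} and {curr['step']}")
    # Temperature excursions at any event that carries a reading
    for e in events:
        t = e.get("temp_c")
        if t is not None and not (temp_range[0] <= t <= temp_range[1]):
            alerts.append(f"temperature {t}C out of range at {e['step']}")
    return alerts

events = [
    {"step": "pack",    "time": datetime(2024, 5, 1, 8),  "temp_c": 5.0},
    {"step": "ship",    "time": datetime(2024, 5, 1, 12), "temp_c": 9.5},
    {"step": "receive", "time": datetime(2024, 5, 4, 12), "temp_c": 4.0},
]
for alert in find_exceptions(events):
    print(alert)
```

The same pattern, inverted, covers the "what has failed to take place" case mentioned earlier: an expected event whose deadline passes without a matching record is itself an exception.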
One of the biggest challenges faced when embarking on a large scale and multi-faceted initiative is getting beyond the concept and into the reality. Several companies have looked to their transportation service providers to assist them in taking the first steps from the white board and into the field. In many cases, carriers (integrated carriers, in particular), have relationships with multiple players across the supply chain. In fact, when developing a strategy, one approach is to review the third-party logistics (3PL) base to identify partners that can provide support to facilitate data capture from manufacturing, through distribution, and all the way to point of dispensing. Enjoy the journey!
Read Part 1 of this blog post here.
Carla Reed is a supply chain professional with more than 20 years of experience in supplier engagement, manufacturing, emerging market development, outsourcing, global trade, regulatory compliance, storage and distribution. She is a thought leader in supply chain transformation, and has researched, authored and presented a series of white papers related to opportunities and risks across the discovery to distribution lifecycle for life sciences. Her understanding and expertise in material acquisition, management and transformation has been disseminated in white papers, speaking engagements and more recently, in co-authoring the ISPE Operations Guidelines for Pharmaceutical Manufacturing. Her firm, New Creed LLC, provides change leadership to facilitate sustainable solutions, providing hands-on experience in all aspects of supply chain operations. Learn more about Reed at https://www.linkedin.com/in/carla-reed-2b8830/.