Tuesday, November 27, 2018

A New Normal...


For many of us who have worked in the pharma serialization space for many years, today represents a unique milestone- for the first time, the US market will be operating under a "live" serialization regulation.

I put "live" in quotation marks for the simple fact that the industry is largely unaware of how the FDA and other governance bodies are going to enforce the regulation going forward.  Just last week I received great input from a 30+ year FDA auditor who noted the agency tends to have a policy of "educate while you regulate."  Interpret that as you may.

I likely take an unpopular viewpoint, which is that I hope the FDA enforces DSCSA to the fullest- swiftly and forcefully.  While I don't fully expect this to be the reality, I firmly believe that after a decade-plus of regulatory delays, including the most recent one-year delay of the DSCSA serialization deadline, the FDA needs to give some indication that serialization is the 'new normal' in the pharma industry.  Likewise, I think it's time for the industry to be held responsible (or maybe better said- liable) for the efforts put forth and decisions made about how to best meet these regulations.

And so, waking up this morning I didn't quite know what to expect- in fact, the chatter on LinkedIn seems fairly minimal.  For most in the industry, whether they would be in a compliant position today was determined long ago- so it makes sense that today is largely a non-event.  Undoubtedly there will be packaging runs done today/tomorrow/... which are not serialized.  That's not out of pure ignorance of the regulation- more likely it's folks who simply couldn't get batches done in time and are willing to take the risk that the FDA will show leniency.

Certainly, the true measure of how well the industry has achieved serialization compliance begins from this point forward.  Eventually the FDA can't turn a blind eye to unserialized batches and, moreover, if the FDA does want to make an example of some manufacturers, they won't have to look too hard for findings.  For example, we know the state of DSCSA-compliant barcoding is still woefully bad based on the most recent GS1/Big 3 surveys.  And I have often stated, and am still happy to be viewed in the minority with this opinion, that the data being gathered by companies in their serialization solutions is woefully inadequate- and what we will soon find out is whether, in some cases, it will also lead to compliance issues under DSCSA.  Of course, going back to my opening point, this is only a problem if the FDA decides to enforce and, moreover, knows how to interpret the data. (e.g. if a tree falls in a forest and no one is around to hear it, does it make a sound?)

So, I'll toss an easy one out there for the FDA to look for and for manufacturers to check in their own systems in case they are concerned someone will come looking.     

Look at the instances in your serialization data where you indicate a 'change of ownership' has occurred.  This is often veiled under the labels "From Business" and "To Business" but make no mistake, the underlying data shows a change from a source owning party to a destination owning party.  Of course, we all know that under DSCSA a 'change of ownership' also requires the existence of a T3 document- which for years now has tracked the sale of items from seller to buyer at a lot level (and is still required even once your product is serialized).  So, the simple question becomes- do you have, or can you retrieve, a T3 document for every instance where your serialization data shows a 'change of ownership' occurred?  I'm quite intimately aware that many manufacturers out there would not be able to produce corresponding T3s for all of the occurrences where their serialization data claims a change of ownership- and the reason is that their data will erroneously claim a change of ownership when one never actually happened, and thus no T3 will ever exist.
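To make the self-audit concrete, here is a minimal sketch of that cross-check in Python.  Everything here is illustrative- the field names, the event structure and the T3 index are assumptions for the example, not any vendor's actual schema- but the logic mirrors the question above: find every claimed change of ownership, then confirm a T3 exists for it.

```python
# Hypothetical data model: each event carries the lot plus the source and
# destination owning parties its serialization data claims.

def ownership_changes(events):
    """Yield events whose data claims a change of ownership
    (source owning party differs from destination owning party)."""
    for ev in events:
        src = ev.get("source_owning_party")
        dst = ev.get("destination_owning_party")
        if src and dst and src != dst:
            yield ev

def missing_t3s(events, t3_index):
    """Return the (lot, seller, buyer) combinations where serialization
    data claims a change of ownership but no T3 document can be found."""
    gaps = []
    for ev in ownership_changes(events):
        key = (ev["lot"], ev["source_owning_party"], ev["destination_owning_party"])
        if key not in t3_index:
            gaps.append(key)
    return gaps
```

A clean result from `missing_t3s` is exactly what an auditor asking the question below would be looking for; every entry it returns is a claimed ownership change with no paper trail behind it.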

So now, if I'm the FDA and I want to give my auditors an 'easy' intro question for interrogating a manufacturer's serialization data, it wouldn't be far-fetched for them to say, "Provide all serialization data for product X or Lot Y and all corresponding DSCSA T3s where you show a change of ownership has occurred."  In that one question an auditor would get a very clear idea of where a manufacturer stands with both the new serialization requirements and the long-standing lot-level T3 requirements.

Why is this important?  As the industry moves past this big milestone of DSCSA, which largely centered around the ability to physically apply a unique identifier to items, attention will increasingly turn to data- as data is the single most important component of ensuring compliance with all future DSCSA milestones, whether it be product verifications, saleable returns or eventually full supply chain serialized data exchange.  The example above is not likely to stop product supply, but it does represent the importance of having consistent and accurate data (and certainly most manufacturers will want to avoid FDA findings whenever possible).

If you are a manufacturer and happen to check your data and find yourself in an unfavorable position- I will say it's likely not entirely your fault.  The data scenario described above is most often seen due to limitations in enterprise serialization platforms- whether it be the one you are using or even the one a partner, like your CMO, is using.  Said another way- it's more than likely your vendor made a data decision without you even knowing.  Up until yesterday that didn't cause any issues.  Today and going forward, you are solely liable.

You don't need to know every nitty-gritty data detail but, as of today, you are responsible for what that data says.  The only interpretation of DSCSA that matters is your own- not your vendor's.  And starting today, a system limitation is certainly no excuse for being put in an unfavorable compliance position.  And thus, my longstanding mantra remains- maintain oversight of your serialization/traceability programs and leverage the readily available (and often free) tools and services to help ensure your serialization implementation is on track (and now- meets DSCSA compliance).


Wednesday, October 10, 2018

Finding the proper ‘center’ of your master data management capability


When asked, as I frequently am, for my top 'lessons learned' from the dozens of serialization deployments I have supported, my answer is always the same:
  • Serialization is not a ‘project’ because it’s not something that ends.  When done properly serialization must be structured so that it becomes ingrained in your everyday business operations
  • Initial serialization deployments always take longer than people expect them to
  • Serialization will expose the need for more formal master data management- almost always to the point it becomes its own program/project
This article focuses on the third lesson and attempts to provide some critical points as companies recognize the need for more formal master data management.  Additionally, I'll highlight some misconceptions regarding vendor offerings in the master data management space.

To start- a quick mention as to the intended audience of this post.  Those at big pharma/biotech likely already have formal master data strategies, systems and processes, not to mention entire teams dedicated to the practice.  This article is geared towards the small to medium pharma/biotech, which often has no formal master data management prior to its serialization journey.  Many of these companies have recognized (or will soon) the need for such capabilities- even companies with just a single SKU.
 
Recognizing the need for formal master data management often comes as a result of feeling the pain from:
  • An increase in the number of product SKUs and/or locations a company must manage (the obvious one)
  • An increase in the number or geographic dispersion of internal teams which require access to master data.
  • An increase in the frequency with which master data must be shared with partners/customers, industry groups and government agencies
The reason serialization is so often the catalyst for further master data efforts is because it touches on at least 2, if not all 3, of the items above.
  • Serialization directly drives an increase in the SKUs managed within an organization, most commonly as a result of needing to reorganize SKUs' association with the markets they can be distributed into (see EU FMD as an example)
  • Serialization requires consistent master data to be shared across supply chain, planning, manufacturing, trade/distribution, techOps, quality and other internal teams
  • Serialization requires master data to be shared with partner organizations (CMO, 3PL), customers (DSCSA T3s and others), industry groups (HDA Origin) and government agencies (EU EMVS and others).
It's not surprising, then, that when companies recognize the need and are looking for a quick fix, they often turn to their serialization providers.  I completely understand the mindset and thus will cut right to the key point of this post (if you take nothing else away from this article, remember this)- your serialization system, at any level L3/4/5, should NEVER be your master data management system.  While I applaud organizations which have taken on master data activities, I too often see them lean on their serialization systems and think they "are good."

Here are the long-term problems with that approach:
  • Serialization solutions serve the narrow need of serialization compliance for (often) a subset of a pharma/biotech’s commercial products.
  • The breadth of master data elements managed by serialization solutions is directly tied to serialization compliance requirements, which again is a subset of the total master data elements needed to support internal and external business operations.
  • Serialization solutions are not built to be master data managers- often lacking the basic ability to perform data validations, data look-ups and version control.
Said another way, serialization solutions should always be a consumer of master data which originates from some other source(s) of truth (A ‘source of truth’ is simply a system, document or process which is recognized as being responsible for managing a piece of data).   These sources of truth represent the center of your master data management strategy from which data can be consistently shared to other systems, teams and processes.

The other challenge for small/medium organizations is that master data capabilities are often tied to enterprise systems like ERPs, which can be prohibitively expensive and require extensive implementations.  While ERPs are well suited to providing master data management- having visibility across functional areas within an organization- they simply are not an option for many small companies.

Seemingly out of options, companies turn to a myriad of manual processes and the dreaded spreadsheet.  Over time data gets siloed among the various functional teams and, even worse, data fields are duplicated across teams but with different values.  I've seen pharmas have four different product descriptions for the same item- finance uses one value, supply chain creates its own, then planning modifies it to reflect the target market and finally regulatory pulls from the FDA NDC database.

These challenges also do not present themselves and then go away on their own- instead they compound over time as more products/teams/regulations are introduced.  I don't believe every small/medium pharma needs a full-blown master data management team- but if you fall into that bucket I do believe there is merit in looking at your current processes and capabilities (or lack thereof) and identifying some quick wins.  The first goal should always be to identify a centralized master data management solution upon which standardized processes for adding/maintaining the master data can be developed.  Shameless plug- one easy quick win could be this:  Jennason Master Data Manager.

Nobody likes to add projects at the end of the year, and many will be heads-down over the coming months ensuring US and EU compliance, but my suggestion is don't let master data fall too far out of focus.  The reality is there are opportunities to instill robust systems and processes in a short time which can then serve as your long-term solution or act as a proof of concept for a more formal master data strategy down the road.

Hope everyone enjoys the HDA Traceability Forum next week where I'm sure master data will be a big topic.

Monday, June 4, 2018

Blockchain: Grounded Optimism

I considered a number of angles for how to approach this post on blockchain and what I realized is that my thoughts on blockchain are conflicted- I simultaneously have a sense of both immense potential and immense apprehension when considering this trend-worthy technology.  What I concluded is that my polarity in thoughts is actually a reflection of the technology itself- one that seeks to provide the function of a highly secure, centralized ledger in a highly decentralized manner.  My goal for this post isn’t to be ‘for’ or ‘against’ blockchain, but rather provide an opinion of how best to approach the concept and the vendors which provide blockchain offerings.

First and foremost, I am not a blockchain expert.  As someone with a technical background I understand its basic concepts and its place in an overall solution stack but I should not be pegged to give a presentation on the topic anytime soon-  and yet I worry I could- and therein lies my first conflicting thought…

I feel there are extremely intelligent and well-intentioned people putting massive amounts of time and effort into developing solutions leveraging blockchain…
And I also feel there are many ill-intentioned people putting massive amounts of time and effort into marketing blockchain solely for financial gains…

The problem is:  I can't always tell the difference between the two.  Part of this is caused by the ambitious, and sometimes downright confusing, claims made about blockchain's potential.  In the span of a week I saw articles claiming blockchain could solve the refugee crisis, solve the world's water problem, and fly a rocket ship.  It's not that I dismiss the claims, but the marketing is so beyond saturated that the well-intentioned initiatives are at risk of getting lost in the sea of opportunity-chasers.  There is immense marketing value in using the word 'blockchain' right now.  Even in writing this post I wondered, "Will people think I'm talking about blockchain just because it will attract 'clicks'?"  That's unfortunate.

The other challenge of such wide-ranging claims is ensuring we don’t lose sight of the reality that blockchain by itself cannot do any of those things…

As a distributed ledger technology, blockchain offers tangible security and redundancy benefits while creating a common data repository that can span organizations.
As a distributed ledger technology, blockchain isn't a solution by itself- it requires business logic and presentation layers to ultimately make the data relevant and valuable to users.

Take the example of diamonds tracked by a solution leveraging blockchain (Everledger).  The headlines you'll see about that project read something like "Diamonds tracked from mine to customer using blockchain."  Here's an interesting observation- I'm not sure in my career I've experienced a scenario in which such a highly technical concept (blockchain), usually reserved for the IT department to swoon/debate over, is so consistently put front-and-center as the singular driver making such functional innovation possible.  Hence why so many C-suites now have something blockchain-related on their strategic agenda (yet- with all due respect- most probably couldn't explain what it does).

It therefore warrants a bit of caution- back to the diamond example- when diving into the project itself you find that blockchain is just one component in a total solution (e.g. business logic, reporting tools).  Take a look at the sample traceability report provided- http://mydtl.io/t/QSLIS013.  It's clean, elegant, user-focused- but again, blockchain by itself doesn't magically generate that report.  Can you say the data is more secure and transparent because blockchain is used as part of the solution?  Absolutely- the overall solution is 'better' because of the benefits blockchain provides.  But again, the key phrase there is 'overall solution'.

The importance of making this point is to ensure industries and those responsible for procuring solutions within organizations recognize when vendors are simply using the term ‘blockchain’ in hopes of gaining a marketing bump vs vendors who actually understand how the technology fits into an overall solution.   To put it more succinctly-  An inadequate or immature technology platform can’t ‘add blockchain’ and all-of-a-sudden become a viable solution.

Finally, let's turn our attention to where (in my opinion) blockchain comes into play for pharma serialization/traceability.  I think blockchain in pharma is intriguing- there is definite potential but there also needs to be a sense of grounding.  While I don't see myself being involved in actually implementing blockchain solutions in the next 6 months, I believe the work being done to determine blockchain's place in the space is immensely important- primarily so that the pharma industry doesn't get 'passed up' in terms of adoption.

I think the industry, by and large, first has to come to grips with the fact that most of the effort and investment made to date has been on solutions that can only barely be considered traceability solutions and instead recognize they are serialization compliance solutions.  This is the first step to then understanding how true supply chain traceability, leveraging blockchain, can provide serialization compliance and also significant business benefits.

I think blockchain is clearly the leading candidate to create the 'virtual' centralized database for US DSCSA 2023 compliance that is so sorely missing today.  It will also help if the FDA provides stricter guidance/requirements on blockchain's place- even going so far as identifying a blockchain provider.  It's likely a pipe dream, but I say this because the pharma 'blockchain', as is true for any industry, should be governed by an independent, not-for-profit entity (assuming the FDA doesn't sign up to do it themselves).  Such a path would also provide the opportunity to more firmly enforce the use of global standards.  I worry that in lieu of such guidance from the FDA we will be left with numerous commercial 'blockchains', each with their own rules of engagement and integration requirements- essentially replicating the same data exchange challenges the industry faces today.

How vendors approach blockchain is equally intriguing.  Certainly, a new set of names has recently appeared- companies looking to use their foundation in blockchain as their foray into the space.  In the near term this is likely to bring about some interesting new partnerships, but over time it's only natural for these providers to move more into the traditional L4/L5 arenas- something that should only be viewed as a positive by the industry, as more competition is still sorely needed.  We also can't deny that some of the more traditional track and trace vendors don't want to see blockchain in pharma succeed (despite what their marketing might say), for the simple reason that the benefit of blockchain- providing a connected ecosystem for the exchange of traceability data between disparate organizations- strikes at the very heart of the singular perceived benefit which millions of dollars have been spent promoting.  Identifying such vendors will be easy.
  • Those who understand blockchain’s role in the space will offer support for blockchain integration- recognizing the need for separation between the ‘open’ data exchange layer (e.g. blockchain) and the applications which can submit/retrieve data from the blockchain and make it relevant to its customers.   
  • Those on the defensive will offer 'blockchain' as a component of their own solutions- continuing the 'closed' mentality and ultimately limiting the benefits their customers realize by adding complexity to the data exchange.

If blockchain does revolutionize pharma serialization/traceability and ultimately provide that pan-industry data sharing/management capability, then, interestingly, the differentiators for pharma serialization L4/L5 providers will experience a long-overdue correction.  Focus will shift (as it always should have) towards items which today are viewed more as nice-to-have intangibles- platform ease of use, user-centric reporting, analytics, platform stability and customer service.  Remember- simply adding blockchain to your pharma serialization vendor isn't going to bring about some magical turnaround- if the platform doesn't work today, it's still not going to work with blockchain.

As for practical adoption in the industry- I’d love to see more attention on blockchain pilots focused on the capture/exchange of T3 documents as a starting point- if nothing else to ensure there are some ‘quick wins’ realized.   This will eliminate the complexity of trying to capture everything on a blockchain (e.g. all events) and I think will also be a more approachable concept for the industry at-large to comprehend.

I’m interested/curious/excited to witness the next phase of blockchain in pharma.   The next 12 months represent a pivotal point in understanding how the technology will be deployed, how rapidly and who the main players will be.  My consistent message to the industry applies here as well-   maintain oversight of your serialization/traceability programs to ensure your solutions can adopt blockchain when the time is right while also recognizing it’s not the ‘easy button’ for serialization compliance and supply chain traceability.




Wednesday, May 2, 2018

Integration of serialization into key manufacturing processes

Today's post provides a current assessment of the integration of serialization into common pharma manufacturing processes.  Certainly not a new topic and one that has been covered in numerous articles from colleagues and vendors.  As a point of reference- nearly 18 months ago a great overview was provided by David Colombo of KPMG in Pharmaceutical Online (Read it here). 

How far have we come since then?   Did the industry heed the message or are we still treating serialization 'in a vacuum' and missing integration into key operations?

Let's dive into two key manufacturing processes where ensuring proper serialization integration is critical.

Serial Number Reconciliation-  No pharma manufacturer (following GMP) would release product without quantity reconciliations being performed and documented in batch records.  Quantity of items packed, quantity of items sampled, quantity of items damaged/destroyed- this is the norm of any packaging operation.  So why should it be any different with serialization?    

With serialization an entirely new data set exists (serial numbers) which must align with the quantities documented in a batch record.  So it stands to reason that checking the quantity of SNs against the quantities in batch records each and every time is a painfully obvious concept- many will assume it's being done.  But if you are a manufacturer and have ever experienced mismatches in your serialization data, then quite clearly reconciliation was missed somewhere along the way.  (Comment and let me know your experiences!)
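As a sketch of what such a check could look like- with the field names and batch-record structure purely assumed for illustration, not taken from any particular platform- the reconciliation itself is little more than counting:

```python
from collections import Counter

def reconcile_lot(serial_records, batch_record_quantities):
    """Compare serial number counts by disposition against the quantities
    documented in the batch record; return any mismatches.

    serial_records: e.g. [{"sn": "...", "disposition": "packed"}, ...]
    batch_record_quantities: e.g. {"packed": 9800, "sampled": 150}
    (both structures are hypothetical examples)
    """
    counted = Counter(rec["disposition"] for rec in serial_records)
    mismatches = {}
    for disposition, expected in batch_record_quantities.items():
        actual = counted.get(disposition, 0)
        if actual != expected:
            mismatches[disposition] = {"batch_record": expected, "serialized": actual}
    return mismatches
```

A lot reconciles cleanly only when this returns an empty result- and ideally it is run at lot closure, well before the product is on a truck.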

So, what's preventing serial number reconciliations from being a pervasive concept? In many cases it comes down to limitations of serialization solutions.  Many vendor solutions don't support the ability to do serial number reconciliations in two main areas:

  • Packaging site/CMO solutions which only allow the collation and communication of serialization data to manufacturers at the time of shipment.    Following good practices- if a manufacturer only receives the serialization data for a batch after the product has been loaded onto a truck it is way too late in the process.  When designing packaging site/CMO integrations identify trigger points during the batch process which will support reconciliation (such as upon lot closure).    If either your CMO's system or your system can only support integration at time of shipment- be prepared to justify the risk of not being able to perform serial number-to-batch record reconciliation until 1) after you've had to provide batch release and 2) after your product is already in transit.  What's the preferred capability in this area? Some solutions are able to take in batch record inputs (even better if they are electronic batch records!!) and automatically compare against processed serialized data- providing alerts when quantities do not reconcile.  
  • Solutions which do not support capture/communication of 'end-of-life' data ('end-of-life' covers any scenario where a serial number does not continue into the commercial supply chain).  This has long been a heated debate within the industry, with organizations deeply rooted on both sides of the fence- in regards to DSCSA, some choose to follow the regulations and only capture data for the 'good' items that get shipped/sold, while others require tracking of all allocated serial numbers.  For what it's worth- I've always been firmly planted on the latter side.  I summarize the debate like this- there is no right or wrong answer, but if you are only tracking the 'good' items then the extent of your organization's capability is serialization compliance, not true traceability.  Again, for many organizations there is nothing wrong with that as they're likely still 'checking the box'- it just means additional areas will have to be addressed before being able to take advantage of other benefits of true traceability (e.g. brand protection, process/quality efficiency monitoring, etc.)


Serialized Shipments-  For many organizations, capturing serialized shipping information is still a future endeavor.  But as 2019 requirements (both regulatory and industry driven) start to surface, the need to track items from manufacturing through distribution arises.  One practice, however, that appears time and again is the concept of 'Ship by Lot'.  In the old, quantity-based world this concept had merit, as it was an easy way to update quantities of items in ERP systems.  In a serialized world, however, continuing this practice brings significant risk.  The principle is simple-  You don't ship lots, you ship shipments.

What allows this practice to continue in a serialized world is that often a shipment equals a lot.  Again, in the old, quantity-based world it was easy to say "Ship Lot 123" and then enter the quantity actually being shipped.  In a serialized world, performing the same action "Ship Lot 123" makes a big assumption: that all serial numbers previously associated to that lot are now being shipped.  The problem is that organizations are not going through and actually scanning each serial number as part of an explicit shipping step, and therefore systems have to 'infer' which items are now being shipped.  The serialized world relies on inference in situations where aggregations exist (physical product packed and sealed in larger physical containers) but shouldn't rely on inference for creating significant traceability steps (e.g. shipping).

This practice is most commonly seen at companies who have chosen not to aggregate for the time being but are often 'forced' into generating a shipping step by either their vendor or a downstream customer.    Additionally, certain vendors have made this practice too accessible by making it a standard (and even recommended) capability in their platforms.

My summary:  Wait until you are aggregating before you start capturing serialized shipping information, unless your volumes are small enough to warrant an explicit item-level shipping step.  Moreover, once you are aggregating there is no reason to continue a practice like 'Ship by Lot'.  Instead, perform a true shipping step which includes scanning of parent-level containers.
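To illustrate the difference, here is a hedged sketch of an explicit shipping step- the container and serial identifiers are invented for the example.  Only what is physically scanned (plus the contents sealed inside it, which is the legitimate use of inference via aggregation) ends up in the shipment, rather than 'everything ever associated to the lot':

```python
# aggregation maps a parent container to its direct children, which may be
# further containers or individual saleable units (hypothetical structure).

def expand(container, aggregation):
    """Recursively collect every serial number packed under a container."""
    items = []
    for child in aggregation.get(container, []):
        if child in aggregation:          # child is itself a container
            items.extend(expand(child, aggregation))
        else:                             # child is a saleable unit
            items.append(child)
    return items

def ship(scanned_parents, aggregation):
    """Explicit shipping step: ship only the parent containers that were
    physically scanned, inferring just their sealed contents- never the
    whole lot."""
    shipped = []
    for parent in scanned_parents:
        shipped.extend(expand(parent, aggregation))
    return shipped
```

Contrast this with 'Ship by Lot', which would amount to shipping every serial number ever commissioned against the lot- whether or not it actually left the dock.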

For companies for which DSCSA is the first serialization regulation they must meet, the key is to anticipate what life will be like once the regulations are actually live.  Certainly, the focus needs to be on ensuring you are meeting the 'letter of the law', but don't overlook how properly integrating serialization into key manufacturing processes can help identify issues with your serialization data well in advance of them becoming compliance items.  Ultimately manufacturers are responsible for managing serialization within these common processes on a day-to-day basis- don't let your vendor push you into practices that are easier for them to support but introduce risk to your operations.

Sunday, April 15, 2018

Needing to comply with EU FMD? Don't miss this critical GS1 Guidance on the proper use of NTINs in EPCIS

Recently I highlighted an important GS1 guidance which aimed to correct a common misuse of GLN/SGLNs in EPCIS events.  Next in our series is another GS1 guidance, this one focusing on the proper encoding of NTINs in EPCIS events.  The full position paper can be found here.

As a general note, GS1 authors numerous position papers which are immensely helpful yet unfortunately often don't get the visibility/attention they deserve.  Check out the full set of position papers to ensure your implementations are in line with GS1 standards.

What is the root issue?  NTINs (National Trade Item Numbers) are product identifiers which result from country- or market-specific identifiers being incorporated into GS1's GTIN construct.  The NTIN concept is certainly not new but has seen increased significance given its broad use across Europe and thus its key role in ensuring compliance with EU FMD regulations.

While there is a significant push to have companies transition from NTINs to GTINs in any market where it is allowed, the purpose of GS1's latest position paper is to ensure that, in scenarios where NTINs must be used, they are encoded correctly when populated in EPCIS events.  This is a critical concept, as EPCIS is the de facto standard used to transfer serialization/traceability information between packaging sites/CMOs and MAHs- which in turn is the foundation for the required regulatory notifications to the EMVS (a.k.a. EU Hub system).

While NTINs do inherit some attributes of their GTIN counterparts- such as ensured global uniqueness- NTINs have one significant difference in their makeup.  Unlike GTINs, NTINs do not incorporate a GS1 Company Prefix.  Most serialization software solutions depend on the GS1 Company Prefix to properly convert GTINs from their 'human readable' format, such as you would see printed next to a barcode, to the equivalent (GTIN+SN) SGTIN 'URN' format which is required when populating EPCIS.  Thus, NTINs not having an explicit GS1 Company Prefix makes it problematic for software solutions when trying to encode the NTIN+SN into the standard SGTIN URN format.


What ambiguity does the position paper clear up?  Most importantly, the position paper clearly states that NTINs are to be encoded as GS1 SGTINs when populated in EPCIS events and, more specifically, that "an NTIN must be encoded as a 'one-off' GTIN with a GCP length of 12 digits".  The position paper also provides clear instruction as to how to convert an NTIN+SN into the SGTIN URN format.

Source: GS1- Guidance on using NTIN in EPCIS visibility events
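As a rough illustration of the conversion the paper describes- a hedged sketch only, with check digit validation and URI escaping of special serial characters deliberately omitted- treating the NTIN as a 'one-off' GTIN with a 12-digit GCP means the URN is built like this:

```python
def ntin_to_sgtin_urn(ntin14, serial):
    """Convert a 14-digit NTIN plus serial number into an SGTIN URN,
    treating the NTIN as a 'one-off' GTIN with a GCP length of 12 digits
    (per the GS1 position paper). Assumes the serial needs no URI escaping."""
    if len(ntin14) != 14 or not ntin14.isdigit():
        raise ValueError("expected a 14-digit numeric NTIN")
    indicator = ntin14[0]          # indicator digit
    company_prefix = ntin14[1:13]  # the 'one-off' 12-digit GCP
    # the check digit (ntin14[13]) is dropped in the URN form
    return f"urn:epc:id:sgtin:{company_prefix}.{indicator}.{serial}"
```

For example, a hypothetical NTIN of 04150123456782 with serial ABC123 would yield urn:epc:id:sgtin:415012345678.0.ABC123- note there is no 'ntin' or 'sntin' anywhere in the namespace.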

    How can I determine if my implementation is following GS1's guidance?  Watch for these key indicators that your implementation may NOT be meeting this GS1 guidance:
    • Any reference or mention by your vendor to the concept of an 'SNTIN'.   SNTIN is a fictional term because, as the position paper notes, an NTIN in serialized form takes on the construct of a standard SGTIN.  
    • EPC values which use prefixes such as 'urn:epc:id:ntin:...' or 'urn:epc:id:sntin:...' are not compliant.  GS1 does not sanction any prefix under its namespace that includes 'ntin' or 'sntin'.
    • A lack of GS1 Company Prefix configuration in your serialization solution.   Serialization solutions must allow known, valid GS1 Company Prefixes to be configured in order to properly translate between an NTIN/GTIN+SN human readable format and its corresponding SGTIN URN format.  Without GS1 Company Prefix configuration, solutions force users to duplicate each GTIN/NTIN entry in both human readable and URN formats.  This adds significant overhead to master data management/configuration as well as an increased quality risk that the entered human readable and URN formats do not properly correlate.
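    As a quick self-check against the second indicator above, here is a sketch (with hypothetical EPC values) that scans a list of EPCs for the unsanctioned 'ntin'/'sntin' schemes:

```python
def find_noncompliant_epcs(epc_values):
    """Return EPC URNs using schemes GS1 does not sanction ('ntin'/'sntin').
    Compliant serialized NTINs and GTINs both use the urn:epc:id:sgtin: scheme."""
    bad_schemes = ("urn:epc:id:ntin:", "urn:epc:id:sntin:")
    return [epc for epc in epc_values if epc.startswith(bad_schemes)]

# Hypothetical EPC values as they might appear in an EPCIS message
epcs = [
    "urn:epc:id:sgtin:415012345678.0.XYZ987",   # compliant
    "urn:epc:id:ntin:415012345678.0.ABC123",    # non-compliant scheme
]
print(find_noncompliant_epcs(epcs))
# ['urn:epc:id:ntin:415012345678.0.ABC123']
```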
    When made aware of GS1's position paper this week, one top pharma company indicated (in regard to feedback provided by their serialization solution provider) that "everything we’ve been told previously goes against this."   This is concerning, as it indicates many companies are already down the path of mishandling NTINs in their EPCIS integrations. 

    As noted in prior posts, when non-compliant practices become widespread the greatest consequences ultimately fall back on the industry itself in the form of additional time/effort/cost.  When standards compliance is not enforced, serialization solutions have to adapt to support both the compliant and non-compliant approaches- and this comes at the expense of the industry (i.e. customers), who end up paying for new integration 'maps' or platform 'enhancements' from the very solution providers who mishandled NTINs in the first place.

    As before, the message to the industry is simple- increase the oversight of your implementations and leverage the wealth of readily accessible, free content available to you. Leaving your implementations unchecked will result in highly customized and/or non-compliant partner integrations, costing you more money in the long run.

    Monday, February 26, 2018

    Next phase of EPCIS hopes to bring standard in line with long proven technology concepts

    Excited to see leading technology concepts being applied to EPCIS as part of GS1's EPCIS 2.0 call to action (Link)

    JSON and REST have been leading formatting and integration approaches in IT applications across many industries for many years.
    • JSON- a compact data/messaging format that has shown significant benefits over XML, especially in the high-volume use cases that are paramount in serialization/track & trace applications
    • REST- the overwhelmingly leading A2A/B2B integration approach, with significant benefits over asynchronous integration methods such as AS2 and the heavily outdated SOAP protocol.
    It will be interesting to see how pharma vendors supporting EPCIS adapt to these potential changes.   Concepts like JSON and REST have been used in serialization/track & trace applications in other industries for many years.   Time for pharma to catch up- and make no mistake, this isn't about the industry chasing new and emerging technologies like AI and blockchain.  This is about the industry catching up to technology trends that were happening 5+ years ago. 
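    To make the JSON point concrete, here is a sketch of what an EPCIS-style object event might look like in JSON.  The field names are illustrative only- modeled on the draft EPCIS 2.0 JSON binding under discussion, not a finalized standard- and the identifiers are hypothetical.

```python
import json

# Illustrative shape only: field names follow the draft EPCIS 2.0 JSON
# binding and may differ in the final standard.
event = {
    "type": "ObjectEvent",
    "eventTime": "2018-02-26T10:15:00.000Z",
    "eventTimeZoneOffset": "-05:00",
    "epcList": ["urn:epc:id:sgtin:0312345.067890.1001"],
    "action": "ADD",
    "bizStep": "commissioning",
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00005.0"},
}
print(json.dumps(event, indent=2))
```

    The same event in EPCIS 1.x XML carries considerably more markup overhead per event, which is where the high-volume benefit of JSON comes from.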

    Take a look:

    "Rather, the kinds of organizations that favor SOAP tend to be slower to change and more heavily driven by integrating with other government agencies that are behind in technology, or by grant programs tied to a particular technology stack, and they often have legacy systems that require the use of SOAP. "

    "Service-oriented architecture (SOA), which gained wide acceptance using web services built on SOAP, has been popular within organizations as a mechanism for sharing information across the enterprise. However, the use of a REST architecture, along with associated technologies such as JavaScript Object Notation (JSON), is accelerating the development and use of APIs. Some of the most popular services such as Twitter, Netflix, and Facebook are now processing API calls on the order of billions per day or month."





    Technology adoption such as this is fueled by the vendors in whom the industry puts its trust to provide the best solutions.  In pharma this certainly won't be seamless, however, as some large players have long ignored support for these concepts.    

    You don't need to be an expert on these terms (leave that to the IT folks), but if your current implementation is heavily rooted in antiquated technologies like AS2 and SOAP (and soon XML), or your vendor hasn't committed to a clear enhancement path as part of their roadmap, you're well behind the curve.

    Monday, February 19, 2018

    Important Public Service Announcement from GS1 and what it means for your Serialization Implementation

    GS1 recently released a PSA (Public Service Announcement) noting a specific practice which is currently widespread in pharma serialization implementations and is decidedly non-compliant with GS1's EPCIS standard.  In addition to being non-compliant the practice also infringes upon GS1's property.  See GS1's PSA at the end of this post.

    Following GS1's lead I will help bring attention to these types of data compliance and quality issues through this blog.

    What is the root issue? In simplistic terms, many serialization vendors are communicating 'where' traceability events occur in a non-compliant fashion by populating the GS1 GLN (Global Location Number) incorrectly in EPCIS events.

    How widespread is this issue?  Considering that multiple 'top' L4/L5 serialization vendors support this non-compliant practice, the issue is highly widespread.  Numerous sources brought it to the attention of GS1, leading to the PSA.  Additionally, proper use of GLN/SGLN has been part of the EPCIS standard since its inception over 10 years ago- so this particular item is alarming not only because of its pervasiveness, but also because of how basic and longstanding the underlying concept is.
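    For reference, the compliant way to express a location in EPCIS readPoint/bizLocation fields is the EPC SGLN URN.  A minimal sketch of the conversion, using a hypothetical GLN and GS1 Company Prefix length:

```python
def gln_to_sgln_urn(gln13: str, gcp_length: int, extension: str = "0") -> str:
    """Convert a 13-digit GLN into its EPC SGLN URN.

    The GCP length determines where the GLN splits into company prefix and
    location reference; extension '0' denotes no sub-location.
    """
    if len(gln13) != 13 or not gln13.isdigit():
        raise ValueError("GLN must be 13 numeric digits")
    company_prefix = gln13[:gcp_length]
    location_ref = gln13[gcp_length:12]   # check digit (last digit) is dropped
    return f"urn:epc:id:sgln:{company_prefix}.{location_ref}.{extension}"

# Hypothetical GLN with a 7-digit company prefix
print(gln_to_sgln_urn("0614141000050", 7))
# urn:epc:id:sgln:0614141.00005.0
```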

    How does this impact me?  
    • Non-compliant EPCIS, in any form, leads to challenges in sharing traceability data with downstream partners (Distribution, 3PL, Customers)
    • Not addressing data quality items, such as this, immediately will end up costing the industry more time and money to correct these issues as the volume of live, serialization data increases (often paying the same vendors to fix their own mistakes)
    • Widespread issues such as this introduce contractual concerns for those responsible for providing compliant EPCIS integrations
    • Tolerating non-compliant data in your serialization implementation significantly reduces your ability to harness emerging technologies such as AI and blockchain which demand good quality data
    How can I determine if my implementation suffers from this issue?  The key here is don't just take my word for it- you must check your data for yourself. 

    I know for some that's easier said than done, but fortunately there is no lack of services and tools to quickly find standards-compliance issues such as this:
    • The late Ken Traub developed an immensely helpful, FREE tool which will instantly tell you whether an EPCIS message is compliant, as well as provide a summary breakdown of its content.  (Link)
    • I have offered a FREE service to analyze serialization data and provide compliance feedback (Link)
    • The Jennason Serialization Test Tool provides automated serialization test data generation which can be used to ensure your L4/L5 solution properly supports the standards and can also detect non-compliance (Link)
    If the options above are still too much effort ask your L4/L5 provider if any of your data is non-compliant per this PSA.  Then let me know their response- I'm eager to hear!!

    The message to the industry is simple- take responsibility for your implementations by leveraging the tools readily available to support you.  Otherwise the 6.6% compliance rate we saw with DSCSA barcodes will seem pretty good by comparison.

    I appreciate the work being done by Ralph Troger (GS1 Germany) and Craig Alan Repec (GS1 Global) to educate the industry on the proper use of standards by highlighting these non-compliant practices at both a global and country level.  A second 'thank you' to GS1 Brasil for their efforts in making this PSA visible to their members. 


    I look forward to GS1 US applying the same vigor in protecting their property and standards.


    GS1 Public service announcement on proper use of GLN/SGLN in EPCIS
    GS1 Public service announcement on proper use of GS1 EPC URIs

    Sunday, February 11, 2018

    Lawsuit against HDA dropped.


    News that Tracelink has dropped its lawsuit against the HDA regarding the Origin service was released this week.  Read the press release here (Link).

    The absolute correct outcome, in my opinion.   From the start this lawsuit misrepresented the scope of Origin and read more like a bloated piece of marketing material.  I can think of no term other than ‘frivolous’ when I see it publicly stated that this case was based on ‘misunderstandings’.

    The practice of wholesalers/organizations setting the tone for track & trace adherence in the industry is nothing new.  If this practice caused such an anti-competitive environment for L4/L5 solution providers, why didn’t we see lawsuits three years ago when T3 electronic exchange was ‘forced’ by some wholesalers even though not required by DSCSA?   The answer is that, at the time, providing electronic lot-level T3s actually kept some of these same solution providers in business.   It seems a little too convenient to start complaining about this practice now. 

    My personal experience has been nothing but openness and collaboration with the HDA/ValueCentric teams who position the Origin service as a very complementary offering to L4/L5 solutions.   I look forward to my future efforts with them. 

    The fact is the Origin service offers a very real and necessary function, which is to guarantee the accuracy of product identification (GTINs) and related attributes.   We just saw how well DSCSA barcoding is progressing (Link), so services like Origin attempt to ensure we don’t repeat the same utter failure when it comes to DSCSA data.  Unfortunately, it only addresses one piece of the puzzle- product master data- so many data issues will still arise. Moreover, everyone should recognize Origin is not the only service that can meet this need.  Data pools, like GS1's GDSN, have for years provided master data collaboration for numerous industries and across all segments (manufacturing, distribution, retail).  However, the reality is most pharma companies have not advanced to the point of adopting GDSN as part of a master data strategy, which is why services like Origin fill a specific and immediate need.
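    One basic example of the kind of accuracy check a master data service can enforce is GTIN check digit validation.  The sketch below implements GS1's standard mod-10 algorithm- purely an illustration of the concept, not a representation of how Origin itself works.

```python
def gtin14_check_digit_ok(gtin14: str) -> bool:
    """Validate the mod-10 check digit of a 14-digit GTIN.
    For a GTIN-14 the weights alternate 3,1,3,1,... from the leftmost digit."""
    if len(gtin14) != 14 or not gtin14.isdigit():
        return False
    digits = [int(d) for d in gtin14]
    # Weight the first 13 digits; even (0-based) positions carry weight 3.
    total = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(digits[:13]))
    return (10 - total % 10) % 10 == digits[13]

print(gtin14_check_digit_ok("00012345678905"))  # True  (valid check digit)
print(gtin14_check_digit_ok("00012345678904"))  # False (single-digit error)
```

    A check digit only catches transcription errors, of course- it says nothing about whether the GTIN actually belongs to the product it is attached to, which is the harder master data problem.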

    As for the press release, I’m encouraged by the mention of aligning with standards but unfortunately believe this is just marketing fluff once again- I’m happy to be proven wrong, but I don’t quite see the need for more workgroups/new standards when no fewer than two GS1 standards supporting master data exchange have existed for 10+ years.  For me this, again, falls into the category of being a little too ‘convenient’- making public claims of alignment with GS1 standards for the marketing credit while doing little to actually deliver support in solutions.  It certainly wouldn’t be the first time this has happened- and it continues to be pervasive- with numerous solution providers in this space.  

    So, what does this all mean going forward?  For many this lawsuit was an interesting side-story to keep an eye on, but for others it may have some very real impacts and leave questions to be answered.  We now have one of the major track & trace providers who has undoubtedly caused many key wholesalers (who also operate many of the largest 3PLs) and their primary industry association to expend significant time and money defending against this lawsuit. Do we really think that relationship is all warm-and-fuzzy?

    Why is this important?  To this point many pharmas have selected L4/L5 vendors simply based on who their packaging provider/CMO recommended- but now the focus is rapidly shifting to integration with the same distributors/3PLs and wholesalers named in this lawsuit- all in support of the next wave of DSCSA milestones in 2019.  So these pharmas, who took a short-term view of serialization compliance by focusing on packaging only, are left to wonder how the next phase of their serialization programs will play out.  

    Monday, February 5, 2018

    A precursor of what's to come? Barcode issues are only the start...


    GS1 US recently released a telling article about the current state of barcoding for DSCSA compliance (Link).  A great review and opinion from Dirk Rodgers was also released on Monday.  A highly suggested read (Link).

    The crux of the study is that approx. 6.6% of all serialized barcodes evaluated by two of the largest wholesalers met expected quality requirements and DSCSA compliance.  The results of this barcoding survey, in my opinion, are staggering to say the least.   

    Let's put this in perspective- imagine if the results had come back showing that 50% of the barcodes did not meet DSCSA compliance.  If those results held true, it would mean that in just 10 short months half of all products distributed in the US would have to be set aside for non-compliance.  That would be catastrophic to the industry.

    So the fact that only 6.6% of barcodes evaluated met two of the largest wholesalers' expectations for quality and DSCSA compliance isn't just staggering- it should be a signal to the entire industry that the traditional ways of approaching serialization simply are not working. 

    But what may be more interesting is-  How does the pharma industry view these survey results?   I'd love to think the majority of the industry is gravely concerned.  My fear is most are not.  My fear is many take a position of either:  A) I don't care,  B) That can't possibly apply to me because I use all of the 'leading' vendors or C) If that many are non-compliant then I'm probably in the same boat as everyone else and therefore don't need to worry about it.

    It would generally be 'easy' to pin the blame for these results on CMOs/packaging sites- but doing so would miss the mark.  Non-compliance of this magnitude indicates failure at multiple levels- business and technical involvement, regulatory interpretation, requirements gathering, GS1 Standards education and, certainly, execution.    I have no doubt the majority of these implementations were fully tested/validated- which means they were doomed to fail before they even started, because they were tested/validated against requirements that were already non-compliant.    

    Part of this blame certainly goes back to vendors, who are depended upon to be experts in this space.  Vendors at all levels should be able to recognize the issues that led to 6.6% compliance- that's not situational failure, that's systemic failure.  Even enterprise vendors, where most of my experience is rooted, are not exempt.  If your enterprise vendor didn't recognize these issues- red flag.

    But vendors can't be the only ones in the cross-hairs.  Whether the industry wants to admit it or not, there has long been a mentality toward serialization of doing as little as possible, as fast as possible, for as cheap as possible.   This can't be ignored as a contributing factor.   A lack of necessary oversight and attention to detail has put the industry in the current situation, and the scariest part might be this- how many companies don't even know they have issues, or are being led to believe everything is 'OK'?

    We now have a clear indicator of how well that mentality is working out for the industry.  And (sorry for the doom-and-gloom) my estimate is it's only going to get worse when the focus shifts from barcoding to serialization data.   

    The results from the GS1 survey are more likely a telling indicator than an unexpected blip on the industry's march towards serialization compliance. Where we go from here is in the hands of the industry- the documentation, tools and services exist to help pharma companies get this 'right'.   Whether the industry chooses to pay attention and take the necessary steps remains to be seen.    

    Sunday, January 7, 2018

    A free service to expose the truth about your serialization implementation

    Drawing upon over 10 years of experience implementing almost every major serialization solution on the market today, Jennason is offering a free evaluation of your serialization implementation- focusing on its adherence to GS1 standards and its ability to meet global regulations.

    The reason for offering this service now is simple:  the industry is being significantly misled through false claims and misinformation about the proper implementation of the GS1 standards that make industry collaboration possible.  By offering a free service my hope is that companies see no barrier to looking further into the problems besieging their implementations.

    Reach out to Jennason to learn more about this free offering and to hear actual use cases of pharma companies who confronted these issues and took action to get their implementations back on track. 



