Anyone who has been part of a system migration knows that it is not easy. As you can see from the graphic below, a typical life science system migration plan has many steps. Companies tend to see the system itself as an investment, while viewing any associated data migration effort as a necessary but unfortunate cost, leading to an oversimplified, underfunded approach.
With an understanding of the hidden challenges, and by managing the migration as part of the investment, you are much more likely to deliver accurate data that supports the needs of the business and to mitigate the risk of delays, budget overruns, and scope reductions.
One of the most difficult aspects of a system migration is the data migration itself. Standards, such as the eCTD, should shine in these situations. That is, everyone has agreed on one format that can be used to import data into any system.
Unfortunately, it is not that easy.
The true challenge of eCTD migration is your impact analysis (i.e., downtime, architectural changes, and environmental changes) and determining whether your selected vendor supports unwritten Agency rules. Bear in mind that subtle differences exist among vendors who have implemented eCTD systems. You need to ensure that your chosen vendor has experience migrating submissions, not only from your existing system, but from all systems that generated the eCTD submissions you now have to manage.
Implementation (SOPs, system validation, pilot run, training) and release plans (monitoring and continual support) are a part of all software changes. I don’t want to minimize these efforts. A good plan is needed, but these challenges are not unique to eCTD.
Not surprisingly, the US FDA receives the most complex submissions with the most nuances. The Agency also handles the largest workload in terms of submission volume and has the most intricate systems in place. To further complicate matters, every vendor has implemented slight variations of the eCTD standard.
I wrote about one such example of variation in eCTD implementation back in November 2014: the undocumented rules that apply to lifecycle management of Study Tagging Files (STFs). GlobalSubmit worked with several vendors, helping them to understand and implement the rule. Most vendors chose to implement the rule, but a few did not. If your vendor did not implement the unwritten rules for STFs, migration will be nearly impossible in the US. Fortunately, this is just a US issue, and most vendors have implemented these unwritten rules. Nevertheless, make sure you ask your vendor during the vendor selection process.
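One practical step during vendor due diligence is simply to inventory how your legacy sequences use STF lifecycle operations before anyone promises a clean import. As a minimal sketch, the snippet below scans a (deliberately simplified, hypothetical) slice of an eCTD backbone for leaf entries that point at an stf.xml and reports their lifecycle operation attributes; real index.xml backbones are far richer, and this does not encode the unwritten STF rules themselves — it only surfaces the STF leaves you would need to review against them.

```python
import xml.etree.ElementTree as ET

XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

# Hypothetical, heavily simplified slice of an eCTD index.xml backbone:
# three leaf entries, two of which reference a Study Tagging File.
SAMPLE_INDEX = """\
<ectd xmlns:xlink="http://www.w3.org/1999/xlink">
  <leaf operation="new" xlink:href="m5/53-clin-stud-rep/study-001/stf.xml"/>
  <leaf operation="replace" xlink:href="m2/27-clin-sum/summary.pdf"/>
  <leaf operation="append" xlink:href="m5/53-clin-stud-rep/study-002/stf.xml"/>
</ectd>
"""

def stf_lifecycle_operations(index_xml: str):
    """Return (href, operation) for every leaf that points at an STF."""
    root = ET.fromstring(index_xml)
    hits = []
    for leaf in root.iter("leaf"):
        href = leaf.get(XLINK_HREF, "")
        if href.endswith("stf.xml"):
            hits.append((href, leaf.get("operation")))
    return hits

for href, op in stf_lifecycle_operations(SAMPLE_INDEX):
    print(f"{op:8s} {href}")
```

Anything beyond a plain "new" operation on an STF is exactly the kind of leaf worth walking through with a candidate vendor during selection.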
While most other regional agencies have adopted an eCTD review system, most reviewers outside the US do not actually use it. The reviewers simply go directly to the files and folders. Such a setup simplifies migration, since the files and folder structure are easily migrated.
Based on submission size, complexity, and business need, there are two principal migration strategies to choose from: big bang and trickle migrations.
Big bang migrations involve completing the entire migration in a small, defined processing window. In the case of a systems migration, a big bang migration involves system downtime while the data is extracted from the source system(s), processed, and loaded to the target, followed by the switching of production over to the new environment.
Big bang migration can seem attractive, in that migrations are completed in the shortest possible time, but the strategy carries several risks. Few organizations can live with being unable to publish for an extended period. Businesses adopting this approach should plan at least one dry run of the migration before the live event and also plan a contingency date for the migration in case the first attempt has to be aborted.
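The dry run is what makes the go/no-go decision concrete: it tells you whether extract, process, and load can plausibly fit in the agreed downtime window. As a minimal sketch (the function name, step names, and 1.5x safety factor are all illustrative assumptions, not a standard), this is the arithmetic behind that decision:

```python
def fits_window(step_durations_hours, window_hours, safety_factor=1.5):
    """Go/no-go check: does the full extract-process-load run, padded by a
    safety margin derived from the dry run, fit the agreed downtime window?"""
    return sum(step_durations_hours) * safety_factor <= window_hours

# Hypothetical timings measured during a dry run, in hours.
dry_run = {"extract": 6.0, "process": 10.0, "load": 8.0}

print(fits_window(dry_run.values(), window_hours=48))  # True: a weekend works
print(fits_window(dry_run.values(), window_hours=30))  # False: plan the contingency date
```

If the padded total does not fit the window, that is your cue to either negotiate a longer outage or switch to a trickle strategy.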
Trickle migrations take an incremental approach to migrating data.
Rather than aim to complete the whole event in a short time window, a trickle migration involves running the old and new systems in parallel and migrating the data in phases. The trickle approach inherently provides the zero downtime that publishing requires. A trickle migration can be implemented with real-time processes to move data, and these processes can also be used to maintain the data by passing future changes to the target system.
Adopting the trickle approach does add some complexity to the design, as it must be possible to track which data has been migrated. If source and target systems are operating in parallel, users must switch between them, depending on where the information they need is currently situated. Alternatively, the old system can continue to be operational until the entire migration is completed, before users are switched to the new system. In such a case, any changes to data in the source system must trigger remigration of the appropriate records so the target is updated correctly.
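The bookkeeping described above, knowing what has been migrated and remigrating records when the source changes, can be reduced to a small tracking structure. Below is a minimal sketch under simplifying assumptions (a per-record version counter that bumps on every source change; the class and field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Record:
    id: str
    version: int  # bumps whenever the source record changes

class TrickleTracker:
    """Track which source records have reached the target, and at what
    version, so a later change in the source flags the record for remigration."""

    def __init__(self):
        self.migrated = {}  # record id -> version copied to the target

    def mark_migrated(self, rec: Record) -> None:
        self.migrated[rec.id] = rec.version

    def needs_migration(self, rec: Record) -> bool:
        # True if never migrated, or migrated at a stale version.
        return self.migrated.get(rec.id) != rec.version

tracker = TrickleTracker()
seq = Record("seq-0001", version=1)
tracker.mark_migrated(seq)
print(tracker.needs_migration(seq))  # False: target is current
seq.version = 2                      # source record changed after migration
print(tracker.needs_migration(seq))  # True: remigrate this record
```

In practice the "version" would come from a modification timestamp or change log in the source system; the point is that the trickle design must carry this extra state, which is exactly the complexity the big bang approach avoids.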
Big bang migration is the superior strategy if you can limit system downtime. An incremental approach requires a tremendous amount of planning and adds complexity for the publishers toggling between the incoming and outgoing systems. Processing submissions at high speed is key to executing the big bang migration strategy. And once the initial migration is complete, you'll want the ability to migrate just new submissions.
But more importantly, your system needs to be able to import all of the subtle eCTD differences that exist and process the unwritten rules from the FDA. The FDA has done a great job with presentations and answering individual questions via email. I believe my company, GlobalSubmit, has been very active in educating the industry on a number of these unwritten rules. Unfortunately, Agency guidance documents have not been kept up to date. I would like to hear your thoughts on how to best inform industry about these unwritten rules.