We’ve been getting a lot of requests lately to help companies decommission their old GxP systems. These systems were once validated and still contain GxP data that might someday need to be reviewed or analyzed, but they’re no longer actively used. They’re just sitting there, taking up server space, undergoing basic IT maintenance, and incurring annual license and support fees.
Why are they just sitting around, consuming resources? It seems like there’s a lack of…understanding? process? confidence? in exactly how one goes about formally decommissioning a regulated system. So, I thought I’d demystify it for you. After all, when done well, decommissionings are generally pretty small projects that can save organizations a nice chunk of change.
We handle decommissionings like any other change control – we assess the change risk using our standard criteria, and the project deliverables are based on the resulting risk level (low, medium, or high). Decommissionings are usually low risk, which means the project deliverables are few: a plan, a test case, a data export, a log, and a summary report. For most of those deliverables, we have templates in our Template Library (even a decommissioning test case!), which makes the process quick and smooth.
In the event that an organization wants to migrate the old data into a new system or into a data warehouse, we simply add a migration qualification (MQ) phase into the project. Depending on the complexity of the migration, the added MQ might raise the risk of the overall project, but we still follow the same basic process.
Our approach is to take a backup of the database in question, restore it in a separate location, and run a test case to confirm the restored copy is complete and correct. If it is, we move the backup to its permanent home (usually burned to a DVD) and dismantle the system. If the testing fails for some reason, we take another backup and retest until we’re sure we have one that works.