The process of software validation is fraught with long lead times, redundancy and expense. The tools most commonly used are good ones, but by making better use of technology they could be improved – at least according to Kevin Ballard, director of software validation at MasterControl.
But let’s back up a bit. To start, software regulation in medical devices is fairly straightforward.
“Any customer that’s regulated by the FDA falls under the requirements of needing to validate the software they use for 21 CFR part 11 functions,” says Ballard. “The validation methodology isn’t specified by the FDA. They just say you need to validate the software to make sure that it’s going to work for how you intend to use it and then show us that you did it that way.”
Ballard explains that the traditional tools are IQ, OQ, and PQ. Installation qualification (IQ) tests to ensure the software is installed in a site and environment where it can function properly.
“It includes making sure there is enough memory, hard drive space, network connections, and other things like that,” he explains.
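In code terms, an IQ check of that kind might look like the minimal sketch below. The thresholds, paths, and host are illustrative assumptions, not MasterControl's actual requirements.

```python
import shutil
import socket

# Hypothetical IQ-style environment checks. Thresholds and the host/port
# are illustrative assumptions, not real installation requirements.

def check_disk(path, min_free_gb):
    """Confirm the install location has enough free disk space."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    return free_gb >= min_free_gb

def check_network(host, port, timeout=2.0):
    """Confirm a required network connection can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Record pass/fail for each prerequisite as IQ evidence.
iq_results = {
    "disk_space": check_disk("/", min_free_gb=1),
    "app_server": check_network("localhost", 443, timeout=1.0),
}
print(iq_results)
```

In practice the pass/fail record itself becomes part of the documented evidence that the environment meets the vendor's stated prerequisites.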
OQ or operational qualification addresses the functional requirements of the software, e.g., “making sure that this button does this and that this field accepts the right information, and that the link takes you to the proper page. It’s very granular.”
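That granularity can be pictured as one narrow expectation per test. In the hypothetical sketch below, `accepts_lot_number` stands in for a real application function, and the validation rule itself is an assumption for illustration.

```python
# Hypothetical OQ-style checks: granular tests of a single field's
# behavior. accepts_lot_number() stands in for a real application
# function; the acceptance rule is an illustrative assumption.

def accepts_lot_number(value):
    """Field should accept only non-empty alphanumeric lot numbers."""
    return bool(value) and value.isalnum()

# One narrow expectation per test, as in an OQ protocol.
assert accepts_lot_number("LOT12345") is True    # valid entry accepted
assert accepts_lot_number("") is False           # empty entry rejected
assert accepts_lot_number("LOT-12345") is False  # special chars rejected
print("OQ field checks passed")
```

An OQ protocol would contain many such checks, one per button, field, and link, which is why this phase has traditionally dominated the validation timeline.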
Traditionally, OQ has taken the longest. The FDA has a guidance document recommending that customers leverage their vendors' testing activities and documentation. Rather than re-testing everything, they're encouraged to use the vendor's test documentation and results as part of that evidence. But Ballard says few are able to get that information from vendors.
“Imagine if you tried to get test records from Microsoft,” he says. “It’d be pretty difficult to get.” Which means that in most cases, those processes are run multiple times, both by vendors, and then by manufacturers.
The last tool is performance qualification (PQ), which involves testing the software in the user's environment to ensure it will behave the way the customer needs it to. For example, if a customer has a workflow that is unique, they need to test to ensure the software will function with that workflow in that environment.
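A PQ check of a unique workflow can be sketched as a tiny state machine: the transitions stand in for the customer's configuration, and the test walks a document through it and compares against the expected route. The states and route below are hypothetical.

```python
# Hypothetical PQ-style check of a customer-specific workflow. The
# TRANSITIONS dict stands in for the customer's configuration; the
# state names and expected route are illustrative assumptions.
TRANSITIONS = {
    "draft": "quality review",
    "quality review": "manager approval",
    "manager approval": "effective",
}
EXPECTED_ROUTE = ["draft", "quality review", "manager approval", "effective"]

def route(start, transitions):
    """Follow the configured workflow from a starting state to the end."""
    steps = [start]
    while steps[-1] in transitions:
        steps.append(transitions[steps[-1]])
    return steps

observed = route("draft", TRANSITIONS)
print(observed == EXPECTED_ROUTE)
```

The point is that PQ evidence is tied to the customer's own configuration, which is exactly what makes it hard for a vendor to supply in advance.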
Software vendors have done a lot to help manufacturers truncate the time spent testing. MasterControl, for example, added a process in 2006 that tests basic OQ functions and documents the process. After that, the company found that PQ was the next most burdensome part of software validation, and set out to provide better services on that end as well.
“OQ had been the big elephant in the room and once we cleared it out we realized there was a rhinoceros sitting in the other corner,” Ballard says.
“One of the reasons this part of the process is difficult is because medical device manufacturers don’t necessarily know what their customers’ business preferences are,” he explains. “They don’t know how they want the software to work until they are in implementation or late configuration.”
In fact, Ballard says a lot of OQ testing is repeated during PQ, “and it is usually redundant and inefficient.” On top of that, the FDA recommends a risk-based approach, “which means you look at the testing that’s been done and then assess what testing needs to be left.”
That risk assessment can be daunting, and many manufacturers simply re-test everything because it’s perceived as easier than doing the risk analysis. Ballard estimates that PQ testing can take MasterControl clients between two and six months.
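The risk-based alternative can be sketched as scoring each function and re-testing only those above a cutoff. The scores, names, and threshold below are illustrative assumptions, not a regulatory formula.

```python
# A sketch of risk-based test selection: score each function by severity
# and likelihood and re-test only those at or above a cutoff. All values
# here are illustrative assumptions, not a regulatory formula.
tests = [
    {"name": "document approval route", "severity": 3, "likelihood": 3},
    {"name": "audit trail entry",       "severity": 3, "likelihood": 1},
    {"name": "UI color theme",          "severity": 1, "likelihood": 1},
]

RISK_THRESHOLD = 4  # re-test anything scoring at or above this

def risk_score(test):
    return test["severity"] * test["likelihood"]

to_retest = [t["name"] for t in tests if risk_score(t) >= RISK_THRESHOLD]
print(to_retest)
```

Even a simple scoring scheme like this documents *why* a given test was skipped, which is the evidence the risk-based approach asks for; the daunting part is agreeing on the scores.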
MasterControl’s solution to these challenges is a cloud-based environment called Spark.
In reality, there is very little variation between customers, so they can use the same configuration. MasterControl created a PQ in the cloud, productizing it and generating solid documentation that customers can leverage.
“That’s TPQ, or transfer PQ,” he says. “With Spark customers, we can minimize or even eliminate the PQ portion of testing needed for each upgrade.”
PQ, says Ballard, can be tricky because you have to test the customer’s configuration. The Spark environment works because it relies on a set of preconfigured scripts.
“For example, we have a good manufacturing practices [GMP] configuration. There’s no variation in how they’re configuring, how they’re using routes, how they’re using vaults, how they’re approving documents. The only real variation is the users that are in the system,” Ballard explains. Spark standardizes all of those implementations into one environment and configuration.
“Because they don’t have a lot of variation in that configuration, they can leverage our PQ scripts. If one of those customers needs to set up a new approval process, or something they deem critical that doesn’t fit within the tests that we do, they would need to do some testing on top of that,” he says.
Ballard says the feedback he’s been privy to from customers has been positive. “They appreciate being able to go live a lot faster and, instead of using the resources for often redundant testing, being able to use them for their business and actually using that software.”