
Software Validation with Mack Soneh - Software Validation 101

Amber: [00:00:05] Hi, everyone, this is Amber Shao, Founder and CEO of AduroSys, a laboratory data management software company. Welcome to the AduroSys Lab Software Podcast. Joining me today is Mack Soneh. Mack is a principal consultant at Venturescope Consulting, located in Sunnyvale, CA. Mack started software development in 1984 and was introduced to the world of software validation at Bell Laboratories. Since 1995 he has been applying software validation methodologies and techniques in FDA-regulated industries, especially the in-vitro diagnostics (IVD) field. Companies he has helped by applying software validation methodologies and techniques include 23andMe, GRAIL (which was acquired by Illumina), Johnson & Johnson, Abbott, and other Silicon Valley start-ups.


Amber: [00:01:00] Hi Mack, welcome to the show.


Mack: [00:01:02] Hello, how are you doing today?


Amber: [00:01:04] I'm doing great. Good to have you on the show. So, you know, I have worked with validation consultants quite a few times to validate our software, and I find the work behind validation quite complicated. There seems to be a lot of confusion about what to do and what not to do, and which software to validate and which not. That's why I'm very excited to have the opportunity to speak with you, because I know you have extensive experience in software validation as well as software development. In the next few episodes, we're going to cover various topics behind software validation, and I hope this will shed some light on the subject for our listeners.


Amber: [00:01:53] Let's start with some basics on software validation. What is software validation, and why does one need it?


Mack: [00:01:56] It is a series of activities to establish confidence that the software is fit for use, or fit for its intended uses. Now, "intended use" and "fit for use" are the big words here, so let me give you an example: a thermometer and temperature. If you're just cooking some soup and you want to measure the temperature of the soup, whether it reads 90 degrees Celsius or 91 degrees Celsius doesn't really matter to you, as long as the soup is hot enough and tasty. But if you're not feeling well and you try to measure your body temperature, whether it reads 37 degrees Celsius or 36 degrees Celsius makes a huge difference in how you interpret your body temperature. So intended use means the particular application of the temperature measurement, or of the device in this case.


Amber: [00:03:00] So it's almost like you have to make sure the software works the way you intend to use it. Whether you're using the thermometer for the soup or using it to measure a real patient, the expectation is very different. That's why you want to validate it: to make sure it's fit for a specific purpose.


Mack: [00:03:22] Yes, exactly. Let me come back to the same example of intended use. If the body temperature reading goes down to 35 or 34 degrees Celsius, then your body temperature is way too low, and the software is supposed to give you some kind of warning, because you really need to seek a doctor's help immediately. That's the kind of thing the software is required to do to be fit for its intended uses.


Amber: [00:03:56] Is this a general definition, or does it only apply to a specific field? I mean, the industry I'm in is mostly related to labs, biotech, and life sciences. So is this definition just for this area?


Mack: [00:04:09] It's a general principle of software validation. But the medical field concerns the human body and human health, so the intended use always revolves around human health, as opposed to, say, cryptography, where validation is about making sure the cryptography is strong enough that it cannot easily be broken for its intended uses. The principle of intended use in software validation is universal; when it's applied to the health industry, it revolves around human health.


Amber: [00:04:56] I know there are a lot of terms. One of the things that confuses me when I work on validation is the terminology; there are different terms that aren't easy for people to grasp quickly. For example, I use the term software validation, but I've also heard the term computer system validation. Are they the same, or are they different?


Mack: [00:05:22] Well, that's a very good point, because nowadays computer system validation and software validation are used interchangeably. However, there is a historical background. Software validation is something the FDA uses for all medical devices, and especially in the early days of the FDA paying attention to software, software validation was most often applied strictly to embedded software: the kind of software that goes into a handheld device or a piece of machinery. Computer system validation, on the other hand, applies mostly to the pharmaceutical industry, where drugs and pills are manufactured in a huge factory environment. There, computer systems are connected to factory manufacturing machines, so what's validated is almost the production and manufacturing computer system, and computer system validation is recognized in the pharmaceutical industry's standards under GMP. So historically these are two different terms coming from two different sources, but nowadays they are used interchangeably.


Amber: [00:06:57] The other thing I hear often is that validation is used as a synonym for testing. Is that true?


Mack: [00:07:06] Well, they're not equal. Software validation involves more than testing, although software testing is definitely part of software validation. Software validation includes more activities because, coming back to my original point, software testing alone is not enough to assure that the software is fit for its intended uses.


Amber: [00:07:37] So validation is a much bigger set of activities.


Mack: [00:07:40] A much bigger set. Let's go back to the thermometer example for low temperatures. The temperature measurement software is supposed to give a warning if the temperature goes way too low, telling the user to please seek a doctor's assistance immediately. That kind of warning is necessary. But if it wasn't part of the design, and it wasn't part of the original software specification, then the tester will simply check whether the software displays the correct temperature number.


Mack: [00:08:20] There was a real, published FDA case back in the late 1990s involving a Johnson & Johnson handheld medical device with this kind of design flaw. An elderly lady was measuring her blood glucose, and the reading came back way too low. Because the device wasn't designed to warn her when the blood glucose level falls that low, the software only generated an error message: "error, error, error." The lady didn't know what to do and did not seek immediate help from a doctor, so she passed away. The FDA looked at the situation, and eventually the product was recalled and had to be pulled from the market. Now, did the software testers do anything wrong? No, the testers performed the testing perfectly according to the specification. But nobody challenged the software design in terms of risk assessment. The risk assessment should have pointed out that a dangerously low blood glucose level is a danger to the user, and therefore the message displayed should say "please seek immediate help" instead of just "error." Software testing doesn't challenge the design. To challenge the design of the software, you have to look at the software development process. That's why software validation includes a review of the software design process.
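[Editor's note: to make the design-level point concrete, here is a minimal Python sketch of the behavior Mack describes. The thresholds, function names, and messages are hypothetical, not taken from any real device; the point is that a dangerously low reading should produce an actionable warning rather than a bare error, and validation asks whether that requirement exists in the design at all.]

```python
# Minimal sketch (hypothetical thresholds and names) of the design-level
# behavior discussed above: a dangerously low reading should produce an
# actionable warning, not just an "error" message.

CRITICAL_LOW_MGDL = 54    # hypothetical critically-low glucose threshold
MEASURABLE_MIN_MGDL = 20  # hypothetical lower limit of the sensor's range


def interpret_reading(glucose_mgdl: float) -> str:
    """Turn a raw glucose reading into a user-facing message."""
    if glucose_mgdl < MEASURABLE_MIN_MGDL:
        # An out-of-range reading is still a patient-safety event, so the
        # message tells the user what to do instead of just saying "error".
        return "Reading out of range. Seek medical help immediately."
    if glucose_mgdl < CRITICAL_LOW_MGDL:
        return (f"{glucose_mgdl:.0f} mg/dL: dangerously low. "
                "Seek medical help immediately.")
    return f"{glucose_mgdl:.0f} mg/dL"


if __name__ == "__main__":
    for value in (15, 45, 95):
        print(interpret_reading(value))
```

A tester working only from a specification that lacks the two warning branches would never exercise them, which is why validation also reviews the design and the risk assessment behind it.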


Amber: [00:10:20] That's a very interesting story. It makes total sense now.


Amber: [00:10:25] So I understand that, depending on what type of lab the software is for, there are different regulations the software should comply with, for example FDA CLIA, GxP, or 21 CFR Part 11. Those are just a few I have come across; I'm sure there are plenty more. Could you briefly explain what they are?


Mack: [00:10:52] GxP is a sort of umbrella term. The "x" is a placeholder: in GCP the C stands for clinical, so Good Clinical Practice; GMP is Good Manufacturing Practice; GLP is Good Laboratory Practice. The middle character changes depending on the application of the regulation, so instead of saying GCP, GMP, or GLP, people put an "x" to refer to these FDA good-practice regulations as a broader category. 21 CFR Part 11 is the FDA's regulation for electronic records and electronic signatures. The FDA's intention was to allow electronic records and signatures to replace paper copies of data and paper signatures. There are two portions. One portion is the electronic signature: you have to prove that an electronic or digital signature is equivalent to a signature you put on paper. The other is the electronic record: you have to prove that computer-generated data and records are equivalent to paper records. A copy has to be identifiable as a copy, and the record cannot be tampered with. It's hard to change data recorded on paper, but it's much easier to change a digital record. Therefore, once you generate a record, if you are in a regulated field and trying to comply with 21 CFR Part 11, that digital record is supposed to remain untampered.
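[Editor's note: as one illustration of the tamper-evidence idea, not an implementation of 21 CFR Part 11 itself, here is a minimal Python sketch in which each record is stored with a cryptographic hash so that any later change to the data can be detected. The record fields and function names are hypothetical; a real compliant system would also need audit trails, access controls, and signature handling.]

```python
import hashlib
import json


def seal_record(record: dict) -> dict:
    """Attach a SHA-256 digest of the record's canonical JSON form."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return {"record": record, "sha256": hashlib.sha256(payload).hexdigest()}


def is_untampered(sealed: dict) -> bool:
    """Recompute the digest and compare it to the stored one."""
    payload = json.dumps(sealed["record"], sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest() == sealed["sha256"]


# Usage: seal a (hypothetical) lab result, then detect a later modification.
sealed = seal_record({"sample_id": "S-001", "analyte": "glucose", "value": 95})
print(is_untampered(sealed))    # True: record matches its digest
sealed["record"]["value"] = 40  # simulate tampering with the stored value
print(is_untampered(sealed))    # False: the change is detectable
```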


Amber: [00:12:53] Validating a computer system is such a lengthy and resource-intensive process, and a lot of people complain about it. From your experience, is there any benefit coming out of this process other than satisfying the regulatory requirements?


Mack: [00:13:11] Let me just quote an FDA finding. When the FDA approves medical devices, which include in-vitro diagnostics and medical lab tests, and those devices go to market, people file complaints, mostly hospitals and medical practitioners. The FDA collects those complaints and looks at their nature, and they found that roughly one third, 33 to 35 percent, of the complaints were indeed due to software failures, software faults, or software errors: something to do with software. That's why the FDA wants stricter regulation of software and requires validation. By the time software, whether it's part of the lab process or part of a medical device, is released to the public, it is crucial that it functions as intended and does not cause any unnecessary harm to the user.


Amber: [00:14:30] Yeah, that makes sense. OK, so we covered some basic concepts on software validation. Thank you so much for sharing that information with us.


Mack: [00:14:39] You're very welcome.


Amber: [00:14:41] Yeah. In the next few episodes, we're going to dive into more specifics on the validation process and methodology. I look forward to speaking with you again.


Mack: [00:14:51] Sure.

