About This Episode:
In today’s show, we’re diving deep into the highly specialized world of software testing for medical devices. Our guest, Priyank Soni, brings over 20 years of experience and a wealth of knowledge from the front lines of medical innovation: cardiac mapping, lab automation, and infusion pumps.
Not only is Priyank a recognized authority in embedded systems, AI, and machine learning, but he’s also a master at transforming manual QA processes into scalable, cloud-based automated frameworks.
Join host Joe Colantonio as he and Priyank unravel the unique challenges of testing in a heavily regulated, high-risk environment where safety, compliance, and innovation must work hand-in-hand.
You’ll gain practical insights into navigating stringent standards like ISO 13485, managing risk in legacy systems, and balancing Agile development with rigorous verification requirements.
Plus, Priyank shares the real impact of AI and automation in medical device testing—and what the future might hold for testers in this rapidly evolving field.
Whether you’re a seasoned QA pro or just curious about what it takes to deliver safe, compliant software in healthcare, you won’t want to miss this episode!
Exclusive Sponsor
Discover TestGuild – a vibrant community of over 40k of the world's most innovative and dedicated automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.
We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.
Take the first step towards transforming your and our community's future. Check out our done-for-you awareness and lead generation demand packages, and let's explore the awesome possibilities together now: https://testguild.com/mediakit
About Priyank Soni
Priyank Soni is a passionate R&D Engineer with experience in the medical devices field, deeply involved in the design, development, and automation of software running on embedded systems.
He prioritizes leading with empathy and has mentored interns, co-ops, and experienced engineers in executing projects based on Agile methodologies while adhering to regulatory guidelines. He empowers teams to take on challenges and deliver innovative, reliable solutions that improve the lives of patients.
Connect with Priyank Soni
- LinkedIn: www.linkedin.com/in/priyank-soni-b049735/
Rate and Review TestGuild
Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:34] Hey, want to learn more about testing medical devices? If so, you're in the right place, because we have the honor of having Priyank joining us. He's an expert with over 20 years of experience driving innovation in the medical device industry. He's led the development of cutting-edge solutions in cardiac mapping, blood recovery systems, lab automation, and infusion pumps. He really knows his stuff. He's also an expert in embedded systems, AI, ML, and software test automation, with a strong track record of transforming manual processes into scalable, cloud-based test automation frameworks. Really excited to have him on the show. We don't have a lot on this topic, so you don't want to miss it. Check it out.
[00:01:12] Hey, Priyank, welcome to the show.
[00:01:16] Hey, Joe. Hey, good morning. How are you?
[00:01:19] Joe Colantonio Great, great, great to see you. Great to have you. I guess the first question is, I always ask people, how did you get into software testing?
[00:01:25] Priyank Soni Software testing, a very interesting thing actually happened. I started in consumer electronics working on printers. How do you compare different PDF files, looking at software like PostScript, going into C++, working with embedded systems? At one point, I was developing pretty in-depth code that was doing a lot of reading and writing, a lot of threading, mutexes, and a lot of communication between different modules. And the project manager said, how will you test this? How will you ensure that this always works? That was the start of my journey to go deeper into what a requirement actually is. How do you ensure that it meets the expected need of the user? And there we go, the magic happened. It's all about unit testing: making sure that when you write code, it works, it works according to the intended use, it works in different environments, it works under different input conditions, and it meets what the user is expecting. That was my starting point. And then I said, okay, if this is what I want to do, I'd better find a place where there's compliance, where there is regulation. And that's how I ventured into medical devices, where, at the end of the day, there's somebody out there whom you're treating, and you're making sure that the device works. And it could be your friends, it could one day be your family, who knows, right? That's what is basically driving me every single day: how do you produce software that goes through a lot of compliance, a lot of checks and balances, ensuring that when it's used, it's always safe and precise?
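To make the intended-use and input-conditions idea concrete, here is a minimal sketch in Python using pytest. The function, parameter names, and limits are purely hypothetical illustrations, not code from any real device:

```python
# Illustrative only: a hypothetical flow-rate check, exercised across the
# kinds of "different input conditions" Priyank describes.
import pytest

def is_flow_rate_safe(rate_ml_per_hr: float, max_rate: float = 999.0) -> bool:
    """Return True if the requested infusion rate is within the allowed range."""
    if rate_ml_per_hr != rate_ml_per_hr:  # reject NaN (NaN never equals itself)
        return False
    return 0.0 < rate_ml_per_hr <= max_rate

@pytest.mark.parametrize("rate,expected", [
    (1.0, True),            # nominal input
    (999.0, True),          # boundary: exactly the maximum allowed
    (999.1, False),         # just over the limit
    (0.0, False),           # zero flow is not a valid request
    (-5.0, False),          # negative input
    (float("nan"), False),  # malformed input
])
def test_flow_rate_bounds(rate, expected):
    assert is_flow_rate_safe(rate) == expected
```

Running `pytest` on this file checks every condition in one pass; the same parametrized pattern scales to the environment and intended-use variations mentioned above.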
[00:03:13] Joe Colantonio I love that, because a lot of people, I think, when they hear compliance and regulation, they're like, I'm out, that's too much pressure. How do you deal with that pressure? I assume there are a lot of regulations you have to deal with. You have to be on top of your game to know what is expected and not slack. How do you keep up your quality with what's expected of you within a compliance and regulatory framework or mindset?
[00:03:36] Priyank Soni It always comes down to curiosity: how curious you are about the processes. How much are you willing to go beyond your regular day-to-day job and understand compliance from different regulatory bodies? How much are you actually reading about what's happening around you, looking into the concepts of software development life cycles that always start with user needs? How aware are you of how these devices will be used in the field in different settings? That drives a lot of retrospection, right? How did you actually get where you are, and where do you want to be with respect to medical safety? So it all starts with the risk hazard analysis. You ensure that you're developing software within certain constraints. You're not just working in a silo. You might be working with a lot of different third-party tools. You might be working with connected or disconnected systems. You might be working with a real-time system where you're getting a lot of information, and this whole flow has to follow certain checks and balances.
[00:04:43] Joe Colantonio All right, so that's another scary thing. You mentioned risk and all these connected systems. How do you identify risk? How do know you're testing what really could cause issues rather than just testing things just because it's low-hanging fruit?
[00:04:57] Priyank Soni It always goes back to the architecture.
[00:04:59] Priyank Soni It always goes back to the user needs: how well you have defined your user needs, how well you've defined your system and software requirements. And it all goes back to how you're building your architecture, because everything begins from day one in the architecture. Then it comes to the detailed software design specifications. If somebody is writing a piece of code just for the UI, yes, at the end of the day it's a user interface, but there are a lot of usability engineering skills that have to be applied: how intuitive the design is, how much confusion it avoids. And with these precision systems, in an actual hospital setting, things are moving really fast. When you're in R&D, it's: oh, I want to click this button, I'll wait and see what's happening, I'll test my requirement, I know exactly the latency and how it's going to work. But in a real system, things are sometimes very different. You have a lot of moving parts, you have all of these connected systems. That's where emotional intelligence has to be applied when you're testing things, right? How do you trust your team, how do you bring innovation, and, at the same time, processes? Because if you don't have a mature process, the teams might be making decisions based on their own understanding. But as a leader, you have to really understand what the regulatory bodies are looking for. It also depends on what kind of devices you're working on. There are Class I, Class II, and Class III devices, with different levels of risk involved. And when it comes to risk, you have to really understand how the software can go wrong. In that case, what are your risk control measures? What is low, medium, and high? And what architecture details are there that can safeguard against these risks? Think about any blood infusion system, or any system that has a pump. There are many things you have to consider: you have a speed, you have a certain drug that goes at a certain flow rate, you have to understand things like air bubbles. There are so many things to consider as part of the risk profile. So you can perform a process risk analysis, you can perform a cybersecurity risk analysis, and you have to do a lot of threat modeling, because the software we write nowadays is not just a library written by you. It interacts with a lot of software of unknown provenance (SOUP), it interacts with a lot of third parties. How are you ensuring you understand the risks associated with that third-party software? Have you done your due diligence, even just asking for the open bugs that could exist in these external tools? If you have access to the code, why don't you perform static analysis? Why don't you perform some kind of unit testing? And there's a lot more you can do: more unit and integration testing to ensure that when this package is delivered in a medical device, you're not just testing your own code, but actually going beyond it and looking at the entire package.
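To put the low/medium/high idea in concrete terms, here is a minimal sketch of a severity-by-probability risk matrix in the spirit of ISO 14971. The level names, scores, and cutoffs are illustrative assumptions, not values taken from the standard or from the episode:

```python
# Sketch of a severity x probability risk matrix (illustrative levels only).
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "frequent": 4}

def risk_level(severity: str, probability: str) -> str:
    """Classify a hazard as low/medium/high from its severity and probability."""
    score = SEVERITY[severity] * PROBABILITY[probability]
    if score >= 9:
        return "high"    # needs a risk control measure and re-evaluation
    if score >= 4:
        return "medium"  # mitigate where practicable, justify residual risk
    return "low"         # acceptable, but document the rationale

print(risk_level("critical", "occasional"))  # -> high
print(risk_level("minor", "improbable"))     # -> low
```

The point is not the specific cutoffs but that each hazard gets an explicit, reviewable classification that a risk control measure can then be traced back to.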
[00:08:17] Joe Colantonio Absolutely. You mentioned architecture a lot. I used to do medical software, and the tech I was working on was literally like 30 years old, so there were a lot of legacy systems involved. If you don't have a say in the architecture at that point, it is what it is. So how do you then work around the risk?
[00:08:34] Priyank Soni Yeah, that's a great question. When you're working on legacy, let's say it was a startup and everyone was rushing, and now, let's say, it's a mature company, it still goes back to the same fundamentals. Have you evolved as a company, have you evolved as a team, to take a step back and look into whether you have a robust architecture? One that involves sequence diagrams, that captures all of these component-based dependencies, what kind of dependencies you have internally, and a framework written in a way that can be decomposed into smaller units. Because if you're building a house, what do you do first? First is the architecture. You have a foundation, right? If you don't have a strong foundation, then how will you build your kitchen? How will you build your basement? So it all comes down to understanding the bigger picture. What is your vertical layer? What is your horizontal layer? What different components will play with each other? Do you have a robust class diagram? Do you have robust timing sequence diagrams? Because now you're talking about a lot of APIs, a lot of microservices; it's not one huge monolithic codebase anymore, you're moving toward smaller units. There are pros and cons. Yes, you don't want a huge monolithic architecture, you want pieces that can talk to each other. But at the same time, when you have a lot of these interdependencies, how do you ensure your data accuracy? Because at the end of the day, behind whatever you show on the UI, whatever the user sees, there's a hell of a lot going on behind the scenes. That's where, as testers, we can't just look at the UI; we have to look at the overall bigger picture and go deeper into the architecture and the design. That gives you more mature, more robust tests, the kind that actually challenge the design and architecture.
[00:10:40] Joe Colantonio I was going to ask this question later, but let's follow this train of thought. In the pre-show, you mentioned something about using ISO 13485 and all these other kinds of standards. How important are these standards, and how do you comply with them? I've tried to look into some of these standards and thought, I don't know, I would feed them through ChatGPT now and ask it to summarize them for me. How do you understand what they're actually asking? Because, again, when I was in medical, a lot of times teams went overboard with the FDA: we have to do this, this, and this. And you really didn't, and it almost impeded progress, because they used it as an excuse: you can't do that because we're a medical device. How do you comply with the spirit of these standards without getting shackled or killing your innovation?
[00:11:25] Priyank Soni Great question. Again, when you're working with an innovative mind, the main thing they will tell you is: I want to get this done, I know what I'm doing, I trust myself, I trust my experience. But at the end of the day, what we really need to understand is that in medical devices, even if you've done the best of the best job, if you can't show evidence, if you can't meet the guidelines, if you can't meet the standards, then there are chances you'll have to take a step back and redo things. It's better to first understand the compliance and the processes. Let's talk about IEC 62304, which covers the software development life cycle. It guides not just a testing team or a software development team, but the entire crew working hard on developing these innovative products. It's basically the V-model. It all starts with user needs: how do you ensure that the user need is met by the requirements? You ensure that you have device or software validation, which means calling your users in to try your software and the medical device in a production-like setting. Then it comes to your detailed design requirements. These processes always start with design and development planning: how are you planning your day-to-day activities, are you Agile, are you waterfall? Then come the design inputs. There is a lot that goes into design inputs: the interviews you perform, the voice of the customer, understanding the competition, how you can better your design inputs. And then it becomes your design implementation. Again, going back to architecture and design: within implementation, ensure that you're doing enough unit testing. If you don't have these units well tested, you will see the bugs come tumbling out. Then it comes to: am I doing my job to thoroughly check my coding standards? Because what we're writing now is huge. You're also now using generative AI, feeding some of its suggestions or examples into your code, so there are a lot of different inputs coming into your design implementation. How do you ensure you have robust CI/CD pipelines? And then finally you have design verification. That's a very formal process where you have a really clean test design, you have clean, approved requirements, and everything is ready for this formal phase of verification. Once the testing team says, hey, I'm done with verification, I was able to meet all my system and software requirements, the job is still not done. You still need to go through a robust design validation, which might require a lot of pre-clinical studies, which requires people to come in, like physicians; you might have end users, it could also be your employees working side by side with a physician, coming in to try the system and ensure that it meets the user needs. And the final end game is design transfer: how do you transfer the design to your manufacturing team? And then it's a different ball game when you're launching your product, and you need to make sure there's a core team working hard so the launch process is smooth. It's all about how you're managing your risk from day one until you launch the product.
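As a rough illustration of the user-need-to-verification chain the V-model implies, here is a tiny Python sketch. All IDs, requirement texts, and field names are hypothetical:

```python
# Minimal traceability chain: user need -> requirement -> verification tests.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    user_need: str  # the user need this requirement satisfies
    tests: list = field(default_factory=list)  # verification test IDs

reqs = [
    Requirement("SRS-101", "Pump shall alarm on detected air bubble",
                user_need="UN-7", tests=["TC-550", "TC-551"]),
    Requirement("SRS-102", "UI shall confirm dose before start",
                user_need="UN-3", tests=[]),
]

# A requirement with no verification tests is exactly the kind of gap a
# design verification review (or an auditor) will flag.
gaps = [r.req_id for r in reqs if not r.tests]
print("Requirements without verification tests:", gaps)  # -> ['SRS-102']
```

Even this toy structure makes the two review questions mechanical: does every requirement trace up to a user need, and down to at least one passing test?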
[00:15:04] Joe Colantonio If you're using these kinds of standards in this kind of environment, can you really do Agile or CI/CD? Because sometimes I see companies try to implement what other companies are doing, but it's a totally different context. Are you doing a version of Agile where you then have to do verification before it goes into production? How does that work?
[00:15:21] Priyank Soni With Agile, again, I've seen many examples: you have people coming from consumer electronics or automotive, and then they come into medical devices and say, boy, this is so much documentation, so much process. I need to change one line of code and it takes four days to test. Come on, I want to push my code. But you have to be careful: when you are treating patients, you have to make sure it's compliant. So Agile is a pretty broad term, and it all depends on your global compliance and the processes within a certain company. I have seen examples where it's hybrid: you're basically working a waterfall model, working through your requirements, and then eventually you test the code, but later in the game. But what we're seeing now is speed: how fast you can put your devices into the market without compromising the security, the safety, and the precision of the devices. The day-to-day work could be going Agile with a certain level of confidence that you have a pretty robust backlog, well thought out and well reviewed, so that when a scrum master goes through the user stories, there's actual traceability back to a requirement. And every day there is a lot of testing happening across many different product teams, so you need a pretty robust cloud infrastructure, because it's the backbone, right? You need healthy virtual machines running 24/7 that can kick off the builds automatically. What I'm seeing is a lot of interdependency between manual testing and automation, where they go hand in hand; you still need critical thinking. As part of Agile, you're ensuring that each user story, each bug fix, actually goes through a robust layer of testing. It could be a mix of manual testing and automation, but at the end of the day, it's all about continuous integration: I'm pushing my code to the mainline continuously and making sure it still meets my high-level requirements. So what happens is, let's say you have spent 20 sprints testing all of the user stories, but there's one catch: how do you show all this to your users? Because users only come in at certain points in the life cycle. There are many innovative ways of doing it, right? You can have your product running in a virtual machine, just for R&D purposes, just for getting feedback, and people can connect remotely from anywhere and give you feedback. And that feeds into the next sprint: I worked on this feature, it looked good in R&D; okay, did the physician like it or not? Is my end user giving me feedback or not? That goes directly into design changes. And finally, when all of this Agile, sprint-based testing is done, you need to pause, which means a certain team goes into design verification, the traditional method: I got my test cases approved, everything is looking good and signed off, and I'll spend, let's say, a month running all the different manual test cases and ensuring the automation is working. And then, how do you show your traceability? That's where I'm seeing a lot of innovation, instead of sitting in the back doing all of this in Excel sheets, exporting data from one test management system and digging through Jira queries.
So we're seeing a whole lot of innovation in how to ensure that when a test case passes, it's actually traced back to a requirement. And at the end of the day, it's not just the software requirement: how does it tie back to the risk hazard analysis, the task analysis? That's where I see a lot of this; interns come in and say, wow, this is a lot of work, can I automate this?
[00:19:21] Joe Colantonio Can you automate it? Because, again, back in the day it was really tough. We had some software, we had to map to risk, and it was really complicated: we had BDD and were doing the automation, but you had to map it back to this requirement and this risk and tag things, and it was really difficult. How do you handle that? This is a two-part question: do you have any special tools for that? And also, because you're not just doing UI testing, are there any other types of tools you use that people may not be so familiar with?
[00:19:52] Priyank Soni Yeah, let's talk about how you connect different systems. It all goes back to: do we have a pretty robust SDK for that tool? Are they exposing APIs? How do you get the data in a safe way? Because at the end of the day, these are systems where you have your IP, you have a lot of information. Many times we've seen teams come up with their own sort of framework. I've seen teams develop their own back-end Node.js applications or microservices and work out all of the APIs, whether the data is coming from Jira or from JAMA. How do you connect these dots? There's also a lot of trust involved: how do you trust some of these cloud-based technologies, do you have an NDA signed, do you know exactly whether the data is on-prem or beyond your boundaries? All of these limitations actually give more insight into: can we innovate, can we develop our own? And I have seen many instances where somebody said, you know what, I'll just develop a pattern-based framework and give it a try; give it to another team that can put it into a local, self-hosted cloud where you have a dashboard, you can see data flowing, you can create different releases, and actually show the whole traceability from your user need to your requirements to the architecture. Because a lot of these off-the-shelf systems will not understand the entire tool chain. Many times in a medical device, there are a lot of things you do that don't come through a tool. For example, if you're working on a design equivalency, you might be writing a lot of things on your own that aren't coming out of a tool; you might be doing critical thinking, writing down how you qualified a certain equivalency. Those are all disconnected, offline systems. You can now use generative AI to summarize things, but at the end of the day, it's still a summary; it's not the in-depth information about what happened between point A and point B.
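As a hedged sketch of the kind of glue code Priyank describes, here is a minimal Python example that pulls requirement issues from Jira's public REST search API and joins them against a local file of automated test results. The search endpoint is part of Jira's documented REST API, but the host, credentials, project key, issue type, and results-file layout are all assumptions for illustration:

```python
# Sketch: join Jira requirement issues against local automated test results.
import json
import requests

JIRA = "https://yourcompany.atlassian.net"          # hypothetical host
AUTH = ("bot@yourcompany.com", "api-token")         # use a real API token

def fetch_requirements(project: str = "MED") -> dict:
    """Return {issue_key: summary} for all requirement-type issues."""
    resp = requests.get(
        f"{JIRA}/rest/api/2/search",
        params={"jql": f'project={project} AND issuetype="Requirement"',
                "fields": "summary", "maxResults": 200},
        auth=AUTH, timeout=30,
    )
    resp.raise_for_status()
    return {i["key"]: i["fields"]["summary"] for i in resp.json()["issues"]}

def trace(results_path: str = "results.json") -> None:
    # results.json is assumed to map test IDs to
    # {"requirement": "<issue key>", "status": "pass" | "fail"}.
    with open(results_path) as fh:
        results = json.load(fh)
    covered = {r["requirement"] for r in results.values() if r["status"] == "pass"}
    for key, summary in fetch_requirements().items():
        mark = "OK " if key in covered else "GAP"
        print(f"{mark} {key}: {summary}")
```

The output is a plain coverage report: every requirement either has a passing test behind it or shows up as a gap, which is the core of the traceability dashboards described above.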
[00:22:07] Joe Colantonio You did mention AI and machine learning. You've been doing this for a while; how much have AI and machine learning changed what you do day to day? Or, in healthcare, because you have to follow these best practices and standards, maybe you can't really consume AI/ML the way you would want to?
[00:22:24] Priyank Soni Again, AI is a complicated topic, especially in medical devices. If you look at the FDA, they're supportive, but they're cautious. They're ensuring that people understand: is it audit-ready? If you've worked on creating, let's say, a computer vision model that can understand your screens and do object classification: have you thought about your data really well? What is the source of your data? What is the variance of the data? I'm giving the example of computer vision because that's where I can share my insight. And then: how do you do annotations? Are you putting a human in the loop who can actually approve or reject some of the annotations that are generated automatically? Is it audit-ready? Do you have version control of your dataset? Because many times you're training a model on something that's still a proof of concept, but it should be audit-ready. It should be transparent. There should be a lot of insight into how the AI came to a conclusion. Is there proof? Is there evidence you can show to the regulatory bodies? And yes, maybe people will not understand the heuristic graphs and the data analytics happening behind the scenes, but the framework should preserve all of that data: all the data analysis, how the algorithm works. This evidence matters a lot. Just saying, I was able to do object classification, I found my object, good; show me the proof. How did you ensure your dataset is version controlled? If you're using a platform, does it record all of the analysis that was performed during the proof of concept, whether it was training or validation? And then, when it comes to showing the data, how easily can a regulator understand your report? That matters a lot too. How do you ensure there's an easy flow of information in your reports? Because many times, when regulators work on these approvals, they might be looking for some kind of equivalency with the traditional way of testing. Okay, you've got the report, you have evidence for your requirements, you have every piece of traceability evidence. Now, when you feed in something that was tested through AI, does it correlate with how you performed your traditional testing? Does it show me the evidence or not?
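One low-tech way to get the audit-ready, version-controlled dataset Priyank describes is a content-hash manifest. Here is a minimal sketch using only the Python standard library; the directory layout and metadata fields are assumptions:

```python
# Sketch: build an audit-ready manifest of a training dataset by hashing
# every file, so any later change to the data is detectable.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def dataset_manifest(root: str) -> dict:
    files = {}
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            files[str(p.relative_to(root))] = hashlib.sha256(p.read_bytes()).hexdigest()
    return {
        "created_utc": datetime.now(timezone.utc).isoformat(),
        "file_count": len(files),
        "files": files,
    }

# Writing the manifest alongside the model artifacts gives reviewers a
# verifiable record of exactly which data a training run used.
manifest = dataset_manifest("data/train")
Path("manifest.json").write_text(json.dumps(manifest, indent=2))
```

Committing the manifest to version control (or a tool built on the same idea, such as DVC) is what turns "we trained on some data" into evidence a regulator can check.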
[00:25:01] Joe Colantonio I think you just gave me an idea for a business. Maybe you've already thought of it; maybe it's already out there. Maybe create an AI FDA auditor, an AI regulator that you can run against your software to see whether it matches all the regulations. Is there anything like that? Because I've been hearing a lot about how AI is going to replace everyone in 12 to 24 months, but anytime I see a demo with automation and AI tools, it's always a web browser, right? It's not a medical device where, as you've talked about, all these systems are either not really connected or connected in weird ways. Where do you see the future of AI, and do you see something like an AI FDA auditor coming to help?
[00:25:35] Priyank Soni I don't know if it's on the horizon, but again, it's very tricky, and it goes back to trust and to the kinds of false positives that can happen when you're doing an AI-based audit. The systems that are under development are still evolving. There are a lot of pieces done traditionally, and there are nuances, new ways of doing things. When it comes to auditing, you have to go back to the basics: has the team followed IEC 62304, the IEC standards like 60601 for electrical safety? So it depends. If you have that kind of system that can give you a basic summary, that could be a starting point, but it doesn't give you all the intricacies that go on behind the scenes, which you have to dig into. You have to really get in and understand, again going back to the user needs, going back through the architecture, and looking at the evidence. It could be a possibility in the future, but at this moment I would say it's still early; there is no common platform, and there are no common standards that people are following. Until we reach a point where there is a standard that guides automated audits, something that gives these platforms and tools a common set of requirements for doing them, an audit remains a much more in-depth thing that you have to do yourself, and you also have to ensure transparency in your system.
[00:27:27] Joe Colantonio All right, here's kind of an odd question. This is the season when a lot of people are graduating high school and college. Do you still recommend software testing as a viable career, especially knowing AI and machine learning are about to really take off?
[00:27:40] Priyank Soni Definitely. I think we need more testers now, more than we were thinking before, because the world is changing at a very fast pace. The traditional methodologies will still be there, but there will be more innovation. There will be more software coders using generative AI, more large language models giving summarized results and explaining code, and people are moving in this direction, taking help, not copy-pasting, I would say, but taking help from these AI assistants. That puts a lot of pressure on testers: how do you ensure that code written by a human plus an AI assistant actually works? That's where you still need to expand your critical thinking, because you need to understand, in the moment, the responses you're getting from the system. You need to know whether the system is still performing based on the user needs, whether it's still performing based on what was expected. Is it still safe? Is it still meeting all of the regulatory compliance requirements or not? And then it comes to automation. Of course, automation has been around for many, many years, but the transformation we're seeing now is something you have to be careful with. There are instances where the reports generated from automation can be huge; it could be thousands of test cases that you have to analyze in a short amount of time. Because when you're in Agile, you might be under pressure: hey, give me the results by tomorrow, I want to know if I can check in my code. Now, if you're trying to use a generative AI model to summarize your report, yes, it can give you high-level information, but it will not tell you exactly what happened down in the details. You need to use all of these tools with caution. Going back to mature processes: have you defined your process? Do you have your risk controls? Do you know how to produce an audit-ready report or not?
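For the thousands-of-results problem Priyank raises, a deterministic triage script is a useful complement to any generative summary. Here is a minimal sketch that parses a JUnit-style XML report, a common output format for test runners; the file name and report shape are assumptions:

```python
# Sketch: deterministic triage of a JUnit-style XML report. Unlike an LLM
# summary, this reports exactly which cases failed and with what message.
import xml.etree.ElementTree as ET

def triage(report_path: str = "junit-report.xml") -> None:
    root = ET.parse(report_path).getroot()
    total = failed = 0
    for case in root.iter("testcase"):
        total += 1
        problem = case.find("failure")
        if problem is None:
            problem = case.find("error")
        if problem is not None:
            failed += 1
            name = f'{case.get("classname")}.{case.get("name")}'
            print(f'FAIL {name}: {problem.get("message", "")[:120]}')
    print(f"{failed}/{total} test cases failed")

triage()
```

A script like this gives the exact failure list for the check-in decision; a generative summary can then sit on top of it, rather than replacing it.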
[00:29:44] Joe Colantonio Excellent. Okay, before we go, is there one piece of actionable advice you can give someone to help with their automation testing or medical device testing efforts, and what's the best way to find or contact you?
[00:29:55] Priyank Soni I would say: be honest, lead with empathy, be curious, look around at what's happening, read a lot. Understand the mindset of a developer; the more you understand how a developer is interpreting the requirement, the better. Are you doing your one-on-ones with your different cross-functional groups? It could be quality, it could be regulatory, it could be a systems engineering team. A tester is a dynamic individual, not just a person who is coding or meeting the requirements; it's about communication, how well you communicate within R&D, within marketing, and in the many instances where you have to interact with your end customers. I would say a tester is a full package, with a lot of versatile skills and emotional intelligence. And if you want to reach out to me, I'm on LinkedIn; you can shoot me a message.
[00:30:53] Joe Colantonio And we'll have a link to all this in the comments down below.
[00:30:57] Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to testguild.com/a549. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.
[00:31:48] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com, where you become part of our elite circle driving innovation in software testing and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real-world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.
[00:32:14] Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.