QA Agents of Change with Lisette Zounon

By Test Guild

About This Episode:

Want to know how to become a QA agent of change? In this episode, Lisette Zounon, a highly experienced quality engineering leader with over 18 years in the industry, shares her knowledge and experience implementing successful QA and testing practices at multiple companies. Discover Lisette's background, her experiences leading a diverse, distributed QA team, how to build a QA organization from the ground up, QA automation strategies, and some exciting new developments in the world of QA automation. Listen up!

Exclusive Sponsor

The Test Guild Automation Podcast is sponsored by the fantastic folks at Sauce Labs. Try it for free today!

About Lisette Zounon


QA Automation Strategies,
QA metrics to measure ROI
Leading a diverse distributed QA team
Building QA organization from the ground up

Lisette Zounon is a passionate quality engineering leader with over 18 years of experience helping people and companies improve the quality of their applications with solid tools, a simple process, and a smart team. She strongly believes that industry best practices, including agile methodologies and DevOps, are invaluable.

Lisette was responsible for leading and managing high-performing quality-testing teams throughout all phases of the software development testing cycle; ensuring that all information systems, products, and services meet or exceed organization and industry quality standards as well as end-users' requirements. This includes establishing and maintaining the QA strategy, processes, platforms, and resources needed to deliver 24×7 operationally critical solutions for many of the world's largest companies.

Lisette is a proven leader who thrives in a highly technical software development environment. Quality should be at the very core of the software process and not be integrated only once the coding is completed. She is a constant champion of employing best practices for QA, agile methodologies, and Scrum implementation to think differently about work and increase your team's and customers' happiness.

Lisette has worked extensively with Fortune 100 technology companies, as well as startups and nonprofits, in the e-commerce, cloud services, telecommunications, finance, and oil and gas industries. In 2019, she launched ZSI, a technology services firm focused on helping companies deliver high-quality software using simple processes, solid tools, and smart teams.

Connect with Lisette ZOUNON

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:25] Hey, it's Joe, and welcome to another episode of the Test Guild Automation Podcast. Today, we'll talk with Lisette all about QA, automation, and probably robotics and AI. Who knows? She has a lot of experience, and we'll be diving deep. If you don't know, Lisette is a passionate quality engineering leader. She has over 18 years of experience helping people and companies improve the quality of their applications with solid tools. She has a simple process she follows, and she also knows how to create a really smart team around her. She's also responsible for leading and managing high-performing quality testing teams throughout all phases of the software development testing cycle, ensuring that all information systems, products, and services meet or exceed organization and industry quality standards, as well as end-user requirements, which we know is so important nowadays. This includes establishing and maintaining the QA strategy, processes, platforms, and resources needed to deliver 24/7 operationally critical solutions for many of the world's largest companies. I met her at StarWest, where she told me all about the awesome things she's doing. She's also a proven leader who thrives in a highly technical software development environment. Really excited to have her on the show today. You don't want to miss it. Check it out.

[00:01:35] The Test Guild Automation Podcast is sponsored by the fantastic folks at Sauce Labs. Their cloud-based test platform helps ensure you can develop with confidence at every step, from code to deployment, across every framework, browser, OS, mobile device, and API. Get a free trial: visit the exclusive sponsor's section and click to try it for free today. Check it out.

[00:02:04] Joe Colantonio Hey, Lisette, welcome to the Guild.

[00:02:11] Lisette ZOUNON Welcome. Thank you for having me. So excited to be here. I've been a listener. I was excited to see you at StarWest. And I'm happy that we've made it happen today in 2023.

[00:02:22] Joe Colantonio Absolutely. Yeah, I'm really excited. After I met you, I thought, I have to get you on the show, because it sounds like you're doing a lot of cool things there, so maybe we'll start with that. It sounds like you're working in the robotics industry, so maybe you could talk a little bit about what your current testing needs are and some challenges you're dealing with that maybe other people aren't, especially when it comes to robotics.

[00:02:42] Lisette ZOUNON Yeah, challenges, definitely. And I love challenges, and that's why I took on the role. I've been at FORT Robotics for about six months now. What attracted me to this role, and the challenge that we have, is really serving our robotics and autonomous-machine customers. We have a set of tools that are really a combination of hardware, firmware, software, and cloud-based applications, and eventually, in the future, a mobile application. You can see that we're hitting all these spots, and my team is responsible for managing and leading all the QA, testing, and quality efforts in the organization.

[00:03:20] Joe Colantonio Very cool. So you're known for implementing processes, and you're dealing with all of this, even firmware? It seems like a lot of moving parts. How do you implement a QA automation strategy and process within this company?

[00:03:34] Lisette ZOUNON Yes, so that's the challenge. And I was attracted to it because in the past I've worked in a hardware company dealing with hardware alone, then a cloud-based company, and then software. This is the first time I'm in a startup where all of these come together, which is a good challenge to have. The way I'm approaching it is really going from one area to another while keeping a consistent process across the organization. For me, it's really about simplicity: having a simple process, understanding each team's needs, and trying to meet them where they are so that we can solve the challenge, whether it's the hardware or the firmware, but also being the one that allows each team to cross-collaborate. One of my favorite words this year is cross-pollination, making sure every team is collaborating, because I feel like the more we collaborate in our test strategy and our automation, the more we're able to find those edge-case bugs that really drive us nuts toward the end of our implementation. And that's exactly where we are right now in our challenge.

[00:04:41] Joe Colantonio How do you do that cross-pollination, then? Because I know just trying to get everyone on the same page is difficult. How do you get people to actually contribute and get kind of excited about this?

[00:04:51] Lisette ZOUNON Yeah, that is a challenge, totally, and we love to do that. But what I've implemented, and I've implemented this in the past, is that I like to see QA folks as agents of change. That's really what they're doing on a daily basis. And I like to call them FBI agents of change sometimes, because like the FBI, you do a little bit of investigation, trying to really crawl through and find out what the issue is. I'm using my team to be at the center of that, between Dev, DevOps, automation engineers, and everyone we need, to really find what the issue is and then figure out how we add that to our regression test suite, so that we're not only thinking about the problem of today but resolving it in a way that gets us ahead of the issue. As the process goes on, I'm always trying to sit and understand the challenge, the pain point of the here and now, and how we add that into the loop so that we get ahead of it early in the cycle next time. Because in a startup, there's really fast change happening all the time. So you have to be cognizant of that, but also make sure that when you start a new process, you can work in it. I mean, working with the product team, working with the developers, and we have various types of developers: cloud developers, firmware developers, embedded systems developers. Each of these people has a specialty, and QA cuts across all of them. So we make sure we really go and understand each of their challenges and resolve them, but keep things consistent, so that what we learn from one team we can also apply to another team. Because everybody doesn't have the same challenge at the same time, and every team is at a different maturity level as well; that's another thing I'm cognizant of. When we learn about one challenge, I try to document it and keep track of it, so that as soon as I know another team has it, we can solve it for them, either in the same way or in a different way.

[00:06:52] Joe Colantonio Yeah, I'd take it the challenges around firmware are much different than the cloud team's.

[00:06:55] Lisette ZOUNON Yes.

[00:06:57] Joe Colantonio How much automation do you put into place, and how do you know that automation can be applied to firmware? Because it's not like you're using Selenium against a browser application. I know automation has to be helpful in firmware too, but maybe some teams don't know how to do that. How do you know what's automatable, I guess, or how can you help teams?

[00:07:18] Lisette ZOUNON Yeah, that's a good question. One part is to really understand what they're doing at a very granular level. I just had this conversation today with my team: I have this funnel in my head, where anything that is close to the code is really easy to automate. From a firmware perspective, that's unit testing. Last year we invested a lot in our unit testing because those were the low-hanging fruit, and we've seen that effort really help us a lot. Now the next stage is what kind of integration testing, not far from the firmware, we can do. Sometimes the hardware can become like a super black box to us, so we're really trying to chip away at it from the middle layer as much as possible. If you think of the hardware as the lowest layer, we go after the next, middle layer, and it's the same thing on the cloud team: trying to chip away at it from the API layer as much as possible and seeing how many tests we can focus on there. The other areas that can be a challenge are the ones most subject to flux, because hardware can be a huge problem. We have to make changes in our hardware sometimes frequently, sometimes because of a bug that happened in manufacturing, so we leave that alone. The next layer up is where our focus is, where we can run repeatable tests so we can easily identify issues. Those are the ones where I have to lean in with each team and understand how we can help them.
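To make Lisette's funnel concrete (automate closest to the code first), here's a minimal pytest-style sketch. Everything in it is illustrative: the frame format and function names are invented for the example, not FORT Robotics' actual code.

```python
# Hypothetical firmware-side helper (illustrative only): parse a 3-byte
# status frame of the form [device_id, battery_pct, checksum].
def parse_status_frame(frame: bytes) -> dict:
    if len(frame) != 3:
        raise ValueError("status frame must be exactly 3 bytes")
    device_id, battery, checksum = frame
    if (device_id + battery) % 256 != checksum:
        raise ValueError("checksum mismatch")
    return {"device_id": device_id, "battery_pct": battery}

# The "close to the code" layer of the funnel: unit tests that run in
# milliseconds with no hardware attached (collectible by pytest as-is).
def test_valid_frame():
    assert parse_status_frame(bytes([7, 80, 87])) == {"device_id": 7, "battery_pct": 80}

def test_bad_checksum_is_rejected():
    try:
        parse_status_frame(bytes([7, 80, 0]))
        assert False, "expected a checksum error"
    except ValueError:
        pass
```

Tests like these are the low-hanging fruit she mentions: cheap to run on every commit, long before anything touches a lab bench.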

[00:08:46] Joe Colantonio So that's interesting. Also, that's probably another challenge dealing and testing against hardware. So it's not like you can go against a browser and just spin them up. And how do you handle that then? Do you have bottlenecks where you can only do so much testing because you're tied to maybe an environment that has the hardware you need and you can't just easily replicate it?

[00:09:04] Lisette ZOUNON Yeah, that's a good question. We use simulators in some places; a couple of our teams use simulators. But even when we use simulators, that's just one part of testing. Sometimes we use them as an early testing methodology, just to get a sense of where we are, even just to understand how solid our tests themselves are. But in the end, and not to throw in yet another challenge, another thing my team is focusing on in this organization is safety and security. We have a lot of safety standards we have to follow. Our final tests have to meet our safety certification, and all of that also has to run on the real hardware, on the latest revision if possible, before we can certify. So there's a really granular level we have to go through. We use simulators for early tests, pretty much testing our own tests with the simulator, and then we run them on the real hardware as well. And we have to keep pace with the hardware revisions too, because if our hardware team makes a change in a hardware revision, we have to make sure we also run our latest tests on those. A lot of moving parts.
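The simulator-first pattern she describes is often implemented by putting the simulator and the real device behind one shared interface, so the same tests run on both. A minimal sketch, with all names invented for illustration:

```python
from abc import ABC, abstractmethod

class SafetyController(ABC):
    """Common interface so the same test runs on simulator or real hardware."""
    @abstractmethod
    def send_estop(self) -> None: ...
    @abstractmethod
    def is_stopped(self) -> bool: ...

class SimulatedController(SafetyController):
    """In-memory stand-in used for early test runs; no device required."""
    def __init__(self):
        self._stopped = False
    def send_estop(self) -> None:
        self._stopped = True
    def is_stopped(self) -> bool:
        return self._stopped

# The same check later runs against a class wrapping the real device on the
# latest hardware revision; only the fixture/factory changes, not the test.
def check_estop_halts_machine(ctrl: SafetyController) -> bool:
    ctrl.send_estop()
    return ctrl.is_stopped()
```

Running `check_estop_halts_machine(SimulatedController())` exercises the test logic early; pointing the same function at a hardware-backed controller later "tests the tests" against the real thing, as she puts it.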

[00:10:23] Joe Colantonio And then you add security on top of it. You say QA, as in QA agents of change, and they go across all the different verticals, I guess. Are they responsible for everything? Do you expect your QA person to actually know security, or do you have people embedded who are experts in it? How does that work? Because it sounds like you're trying to cover a lot of different things.

[00:10:41] Lisette ZOUNON Yes, yes, yes. Well, that's a good question, because that's a challenge I was facing last year when I was hired. For us, security and safety are our product; they're part of our product. So we have some security experts, developers, and architects who are really architecting how security should be built into our hardware and our firmware, and also into our cloud application, throughout the whole product pipeline. From a QA perspective, you don't have to know security, but if you do, it's a good thing. We look at it as part of the product: you need to understand the requirement, and you test for security. And it's a little bit more granular for us, because it's things like secure boot. The hardware needs to support secure boot, and we need to have secure updates. Those are functionalities we're selling out there, part of our feature set, and we need to understand that. We are building a team that will be world-class in cybersecurity and in safety as well.

[00:11:47] Joe Colantonio How do you know you're doing the right things, then? Do you have any metrics you use to know, with security, that you're on track? How do you keep track of all that stuff?

[00:11:55] Lisette ZOUNON Yes. So I'm a huge fan of dashboards, and we are definitely a work in progress. Dashboards are actually an area of my responsibility, because I like dashboards. I like to know what the KPIs are. Are we meeting the KPIs? Each of our products has KPIs it needs to meet. We also have KPIs for the software team, KPIs for the quality team, and KPIs for the hardware team, even for how the manufacturer produces the hardware that comes into our organization or goes out to our customers. We need KPIs across all of that. Right now we are defining the KPIs. Next, we make sure we have a baseline, and then we try to improve on the data we collect from each release.

[00:12:44] Joe Colantonio All right. So it sounds like you have a team also that is distributed. So is everyone in-house or are they all over the world? How do you deal with not only all this technology, but it sounds like you have resources all over the world as well?

[00:12:56] Lisette ZOUNON Yes, it's not a huge organization, but it's a very efficient team. And we are definitely all over the place. It's a mixed bag: most of our hardware folks are in-house in Philadelphia, hands-on in the lab, and then we have teams across the United States who come into the lab as needed. So it's that mixed bag, totally.

[00:13:20] Joe Colantonio So how did you set that up? Because I believe you built this organization; it was a startup and you helped build it from the ground up, and you have experience doing that. How do you know when it makes sense to have someone in-house, as opposed to telling management, look, this person is great; I know they're distributed, but they can still work remotely and get the job done?

[00:13:39] Lisette ZOUNON Yeah, I love that question, because that was something I had to finally clearly define, although I've done this in multiple organizations in the past. I've been working with distributed teams for the past 10 to 11 years, actually 12 years, since I was at Yahoo. I've worked with teams that were geographically diverse, in India, California, Texas, and also New Zealand, across all those time zones, so I'm kind of used to it. It just requires a bit more flexibility and agility among the team and a lot of collaboration. But to your point, what I have found and decided on, as we learn and try, is that the most experienced team members can be remote. People with a lot of experience are more independent, more flexible, more agile. Those senior roles, senior engineers, can be remote, and it has worked so far because they carry more responsibility, for lack of a better word; they have a lot of accountability, and they travel as needed to our headquarters in Philadelphia. For folks who are a bit more junior or less experienced, even with the tools, the technology, and our hardware, we lean toward them being local, because then they get to learn a lot more about the hardware and play with it; we're all like little kids who love to play with those devices. If they're local or really close to headquarters, they can go into the office as often as possible, because you learn by doing a lot more, and then you can collaborate with other folks in the lab. That's the combination that has worked for me so far.

[00:15:17] Joe Colantonio How do you get people onboarded? Do you have a training system in place to help them? Like I said, it doesn't seem like a standard web-browser-type situation. So how does your onboarding work, then?

[00:15:29] Lisette ZOUNON Yes, that was also another area I had to tackle last year, because hiring takes time. You have to go through the whole process of finding the right talent, which was not easy, because you can see all the combinations you need: if they have security experience, yes; if they have safety experience, that's even better; and they need embedded experience. So that was a good challenge, and, boy, we have some great folks who joined the team. Then my next plan was to design a great onboarding for them, and for me, onboarding is probably never-ending. The great thing we have at FORT Robotics is an excellent team that is really good at collaboration, and we're documenting our onboarding. The first month is really intensive: meeting with various folks and having detailed conversations about what they do, what each team does, and how they fit into the whole landscape, from the product all the way to our integration team, meaning our operations and production team. New hires have to meet with all those folks on a regular basis. We have roadmap conversations so they know what products they're working on and what the features of those products are, and then they get into the tools and technology they need to learn. There are really different layers to it. So it's a long month, and it doesn't stop there; I keep a continuous conversation going with them. We also have a fabulous customer team with great customer training, because you have to learn about the other side: our tools are used in a variety of organizations. We have agriculture, we have telecommunications, meaning autonomous driving, we have manufacturing, and we have construction, a wide range of industries, and each industry has a different use case as well. Our sales and customer teams have done a great job creating various trainings.
When you go through those trainings, they really give you a great sense of what the product is and how customers use it out there. I feel like QA folks need to have that lens, because sometimes you can get too deep in the weeds of your daily testing and lose sight of the customer side of it. That training is also part of the onboarding, and it's a long process as well. We're still refining our onboarding, but so far we're getting feedback from these folks so we can keep refining it and make sure it works well for future hires too. Onboarding, for me, is critical.

[00:18:05] Joe Colantonio These questions are just coming into my head. How about tooling? Because, once again, are you creating your own tools? Are you purchasing tools? Are they open source? How does that work?

[00:18:15] Lisette ZOUNON It's a mixed bag. And I have to tell you, tooling is something we're focusing on more this year, because we're really trying to get our product complete and ready for general availability in the market; most of our product has been used by early adopters giving us feedback, and we're focused on getting a generally available product out there. You're talking about internal tools for us, right? We have a couple of tools, but they're not as consistent across teams as we want. We use pytest for some automation, we've used Cypress, and I just talked to my team today about maybe dabbling a little in Selenium and even Playwright. I'm personally super tool-agnostic, because I've used so many tools in my almost two-decade career that I don't get married to a tool much. I have a whole speech about how picking your tool is almost like dating. I'm really open and tool-agnostic: I actually let the team select the tools that work for them and just tell me. But this year we're going to get a little more cleaned up and aligned on our tools across the organization, because that's also important for us as we grow, expand, and scale. We want to get that consistent as well.
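One common way to stay tool-agnostic, as she describes, is to keep test intent behind a thin driver interface so the team can swap Selenium, Cypress, or Playwright underneath without rewriting tests. A hedged Python sketch, with an invented fake driver standing in for a real browser tool:

```python
from typing import Protocol

class Browser(Protocol):
    """Tiny driver protocol; real adapters would wrap Selenium, Playwright, etc."""
    def goto(self, url: str) -> None: ...
    def text_of(self, selector: str) -> str: ...

class FakeBrowser:
    """Stand-in driver so the layering can be shown without a real browser."""
    def __init__(self, pages: dict):
        self._pages, self._current = pages, None
    def goto(self, url: str) -> None:
        self._current = self._pages[url]
    def text_of(self, selector: str) -> str:
        return self._current[selector]

def login_banner(browser: Browser, base_url: str) -> str:
    """Test intent lives here, written once; the tool behind it can change."""
    browser.goto(base_url + "/login")
    return browser.text_of("#banner")
```

With this split, "don't get married to a tool" becomes practical: switching tools means writing one new adapter class, not rewriting every test.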

[00:19:35] Joe Colantonio I love that approach. Don't fall in love with tools. Use the tool that's right for you and your team. If it works for you, great. If it doesn't, move on.

[00:19:41] Lisette ZOUNON Yes.

[00:19:42] Joe Colantonio For sure. At StarWest, I think you were on a panel on AI in testing with Tariq. Does AI come into play at all in your current job? Do you actually use AI in testing? Where does the AI piece come in, or is it just something from another company?

[00:19:56] Lisette ZOUNON Another company, yes. Not yet at FORT Robotics, but I would like to bring it in. But I will say yes, you recall correctly, I was on the panel to talk about AI, because I've been involved since 2016. I was working at an organization where we had a lot of testing to do with, again, a very small team, and I was really trying to find the right solution. Being super tool-agnostic, I went out to the marketplace and became an early adopter of an AI-based tool that my team used successfully. I have a case study that I shared at the conference about how we had almost 1,000 test cases and the process we went through using an AI-based tool with a team that was not automation engineers from the start. They learned, and the tool allowed them to become automation engineers in the span of less than six months, and we were able to increase our efficiency by cutting regression from about three days down to about three hours. That was a huge efficiency gain. From that moment on, I was in love with AI. I've been involved with AI since graduate school way back, and I've always been a fan of understanding how we can use AI-based testing to help QA. That's something I'm passionate about. I'm actually starting to teach a course in a few weeks, in February, two weeks from now actually, on quality and artificial intelligence, because I feel like there's a good correlation. I just have a lot of challenges right now in my current organization before we can introduce that, because we need a strong baseline for how we automate things before we get to that next layer.

[00:21:54] Joe Colantonio I need to get you back on a show dedicated just to that. I guess at a high level, then, what does someone as a tester need to know about AI? Do they really need to know the guts of it, or just the high level? If there were a syllabus, what would they need to know about AI in order to be successful with it in testing, if anything?

[00:22:12] Lisette ZOUNON Yeah. For me, it depends on the level, because you've asked the right question: how deep do you want to get into it? I do talk about that in my talk; it all depends. You don't have to go as deep as me, or as deep as my friend Tariq; some of us just geek out about it a lot more. Depending on your interest, my talk is usually about being aware that it can help you. I have refined my talk from 2019 to now in 2023, because back in 2019 people were still a little reluctant when you said AI and QA in the same breath, because QA was already a fragile role. We know there are a lot of layoffs happening in 2023, and QA is already a fragile role. So when you bring AI into a QA organization, people wonder: are you trying to reduce cost? Are you trying to reduce headcount? My talk was always around using it as something that's going to help you. And now, in 2023, we have ChatGPT out there. I see you talking about it. It's happening; we've been telling you it was going to happen since maybe 2018, and in 2023 it has happened. So in 2023, how can you leverage AI in any role, not even just QA? Everybody needs to look at how they can make their job better, and it's not replacing you. We've had that conversation so many times now. I actually saw something funny last night: I think the Pfizer company wrote an open letter to AI saying, let's work together, from the human perspective to the AI. That's the conversation: let's see how we can use AI. And there are three low-hanging fruits I know it gives you. There are so many tools now. Back in 2016, there were only one or two vendors really starting with it. Now, when you go to a conference like StarWest, everybody, maybe over the last two to four years, has AI in their tool. What level are they giving you? We don't know; you have to do your homework.
In my presentation, I tell people what to ask based on what they need. But for me, visual testing is great: it can do visual testing for you. My favorite one is self-healing, because I suffered when I was a QA automation engineer, coming back in the middle of the night or the next day just to fix my code and figure out why the test failed, spending the whole afternoon trying to figure that out. Self-healing is something I personally love. And the next one I really want to understand a little more is the test-creation part. We're pushing the boundary; let's see how far we can go in our strategy. If it can create the tests for us, and we can refine them with our human eyes and our human brains based on the requirements, why not? Anything we can do to be efficient, so that as a tester, as a QA person, you can actually do the job you were hired to do, which is bring quality to the organization: spend time in those meetings, cross-pollinate, do the integration work, do the boundary tests, and do the tests nobody else thinks about. Those are the things AI will help you with.
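The self-healing idea she favors can be sketched in a few lines: try the primary locator, and if the UI changed, fall back to known alternates and report the heal instead of failing the overnight run. This is a deliberately naive illustration with a dict standing in for the DOM; real AI-based tools rank candidate locators using far richer signals.

```python
# Minimal self-healing lookup: the "DOM" is faked as a dict of
# selector -> element, and locators are tried in priority order.
def find_element(dom: dict, locators: list):
    for locator in locators:
        if locator in dom:
            if locator != locators[0]:
                # The heal is logged so a human can update the primary
                # locator later, instead of debugging a 2 a.m. failure.
                print(f"healed: primary locator failed, matched '{locator}'")
            return dom[locator]
    raise LookupError(f"no locator matched: {locators}")

# After a redesign renames the element id, the test heals via the
# secondary data-testid locator instead of breaking.
dom_after_redesign = {"[data-testid=submit]": "<button>"}
element = find_element(dom_after_redesign, ["#submit-btn", "[data-testid=submit]"])
```

The same shape, plus smarter candidate generation and ranking, is what commercial self-healing features automate.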

[00:25:40] Joe Colantonio I love it. And I assume you've been using ChatGPT?

[00:25:45] Lisette ZOUNON A little bit. I'm skeptical. I'm like.

[00:25:48] Joe Colantonio Yeah.

[00:25:48] Lisette ZOUNON I'm a skeptical person. So yeah, a little bit, a little bit. I'm having my daughter play with it with me. For me, I look at it as a bigger Google, really. And there's a book I read a long, long time ago, I forget the title, but it's a sci-fi book. I was not a fan of the book, but it really talked a lot about this, and now I'm like, we are there. That was in college, about 25 years ago, when we were talking about everything AI could do, and that's what hooked me on AI. I feel like now, 25 years later, we are finally there. As of January 2023, we're really close. I'm really looking forward to the opportunity. I just feel like people hype it way too much, and some of us who are really close to it can look at it and see what it can actually help us do. I think the marketing is a little bit louder than what it can actually do right now. That's my personal opinion. I saw what you shared and I did sign up for it; I was like, I want to hear what this guy has to say, because I have not spent the time to do the due diligence and see how people can use it. But I'm curious.

[00:27:00] Joe Colantonio That's pretty wild. I mean, from what I've seen, and that's only an early version, I heard they're coming out with an even more powerful version in a few months. I was pretty impressed by what it can do. And like you said, it's almost like a better Google.

[00:27:12] Lisette ZOUNON Yes.

[00:27:13] Joe Colantonio And it can do all the things you mentioned. It can write a Selenium test for you; it can do all these things for you. But like you said, it's not going to replace you, because you still need to be able to look at it, review it, and modify it. It's just going to help you, I think, do your job better and quicker.

[00:27:30] Lisette ZOUNON Be more efficient at your job.

[00:27:32] Joe Colantonio Yes, exactly. Yes. I love how you're saying don't be afraid of it. You need to embrace it. Learn it because it is coming. It is here.

[00:27:39] Lisette ZOUNON It is here.

[00:27:40] Joe Colantonio It is here. ChatGPT, I don't know, I was really blown away. I know a lot of people are like, oh, it's just a toy. But I really see it as powerful, and getting even more powerful, for sure.

[00:27:50] Lisette ZOUNON Yeah. What I've been paying more attention to, interestingly, has been marketing. For some reason, I have a couple of marketing people in my feed, and they're talking about it really aggressively. I'm like, okay, I want to know. And everybody should do that exercise, because that's what got me interested in AI in QA. Automation is a problem to solve in QA; it has been a problem for the last decade on most teams I've been involved with. For me, it was a new quest to find a solution, and that's how AI became one of the solutions in my toolbox. If there's another one, I'll use ChatGPT, I'll use anything I can to solve the automation challenge my team is facing. I'm all for it.

[00:28:36] Joe Colantonio I love it. I love your attitude. Awesome. Before we go, is there one piece of actionable advice you can give to someone to help them with their testing efforts? And what's the best way to find or contact you?

[00:28:46] Lisette ZOUNON Oh, good advice. I think, you know.

[00:28:50] Lisette ZOUNON Stay curious. That's what I always say. Stay curious. When you go into the requirements meeting and you listen, don't just think, oh, I'm just a QA person, so I don't care about the requirements. Yes, you do. Be curious and ask all the right questions, because the best QA folks are actually subject matter experts for the product. And I've been one in the past. You are the go-to person. I've been on a team where, when a new person joined, they sent them to me, and I trained that person based on my test cases, because I'm close to everything. Back to that cross-pollination. I'm talking to everybody in the organization, across all the different components. It's the curiosity that drives you to get that level of detail on all these things. So stay curious. When you're talking to the developers, stay curious as well. Ask them about the code and why they're making those decisions, because when you're testing, you need to understand why those decisions were made in the algorithm or the way they implemented it. Stay curious, for me, is the golden rule. And that's actually the reason why I love my job, because your curiosity makes you want to continue to learn.

[00:29:54] Lisette ZOUNON And where to contact me? I think LinkedIn is the best way. I am not a Twitter person. I've started to stay away from Twitter. But I think LinkedIn is the best way.

[00:30:46] Joe Colantonio Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to the show notes, and while you're there, make sure to click on the "try it for free today" link under the exclusive sponsor's section to learn all about Sauce Labs' awesome products and services. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:30:46] Joe Colantonio Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over and let's make it happen.
