The Four Phases of Automation Testing Mastery with Jon Robinson

By Test Guild

About This Episode:

Welcome to the TestGuild Automation Podcast! I'm your host, Joe Colantonio, and I'm joined by Jon Robinson, Chief Storyteller at Nekst IT, to delve deep into the automation testing world. In today's episode, “The Four Phases of Automation Testing Mastery,” we'll debunk the myth that automation is a cure-all solution and explore the intricacies and careful planning needed to succeed in test automation.

Join us as we discuss practical strategic approaches, including the 4-phase model (Discovery, Design, Develop, Deliver) and the importance of a maturity model to ensure your automation aligns with CI/CD integration. We'll highlight how automation serves as the backbone for regression testing, providing substantial long-term benefits, and how pushing it later in the development process can minimize rework and costs.

We'll tackle the challenges of test management in an agile world, and Jon will share his insights on the importance of storytelling in QA and how it can revolutionize how we test and communicate the value of our work. Expect actionable tips on avoiding common pitfalls and why focusing on real-world impacts and user perspectives can significantly improve your automation efforts.

Prepare to elevate your testing strategy and learn why quality should be the focus rather than just hitting metrics. Tune in as we explore practical insights and real-life experiences that will empower you to enhance the success of your automation testing projects!

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step towards transforming your and our community's future. Check out our done-for-you services, awareness and lead generation demand packages, and let's explore the awesome possibilities together.

About Jon Robinson


Jon has helped build and lead a wide variety of teams across all aspects of QA, CX, and Marketing over the past 18 years, and currently runs Irrational Consulting, which specializes in helping organizations improve their overall quality. He is the former Chief Storyteller at Provar and has worked with organizations like Nissan, Scripps Networks, and Victoria's Secret to battle-test his methods. From transforming QA processes at The QA Consultancy to building high-performing teams at HomeAdvisor and MEDHOST, Jon's track record speaks volumes about his knack for quality and customer-centric approaches.

Connect with Jon Robinson

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:24] Hey. It's Joe, and welcome to another episode of the Test Guild Automation podcast. And today, we'll be talking all about preparing for automation testing success and a bunch more with Jon Robinson. And if you don't know, Jon has helped build and lead a ton of different teams across all aspects of QA, CX, and marketing over the past 18 years, and he's currently the Chief Storyteller for Nekst IT, which specializes in helping organizations improve their overall quality. He is the former Chief Storyteller at Provar and he has worked with organizations like Nissan, Scripps Networks, and Victoria's Secret to battle test his methods. He really knows his stuff. If you don't know also, Jon gave an awesome workshop at this year's Automation Guild. I think it was the highest-rated session, or he was tied with someone else, so I thought, hey, let's get him on the podcast and pick his brain more about how to prepare for automation testing and a bunch more. You don't want to miss this episode, check it out.

[00:01:19] Joe Colantonio Hey, Jon, welcome to the Guild.

[00:01:24] Jon Robinson Hey, Joe. Thanks for having me man.

[00:01:25] Joe Colantonio Awesome to have you. Hey, what the heck is a Chief Storyteller?

[00:01:29] Jon Robinson You're not the first person to ask that question, obviously. I got the idea from Microsoft a number of years ago, and it came about because, when I was at Provar, we really struggled to figure out how, as you know, testing, QA, test automation, it's a very hard topic to describe to people, to take to market and kind of explain what it is you do. And so the thing that we kept coming back to was, how do we tell our story? How do we tell the story about what we provide value-wise and how we help people, etc.? In looking at it, storyteller seemed like a really, really good idea. I started looking into it and it really fit with what we were trying to accomplish, which was how do we have a singular go-to-market message that everybody can relate to and that applies to the entire organization. And it works really well. At home, too, my kids like to call me the chief liar. So I kind of just stick with it now. I like to tell stories. Those that know me know I like to talk a lot, but I like to wrap that into something that's fun and engaging and interesting.

[00:02:38] Joe Colantonio All right, already off the script. So if I'm a tester, it seems like this would be a skill that testers should be good at because they sometimes are overlooked. So how does a tester implement this?

[00:02:48] Jon Robinson Absolutely, this is the one thing that probably relates more than anything. I talked to a friend of mine right after we came up with the title, and he said it's the one thing that really hit home for him, because in a lot of cases, QA people and testers often have to tell a story about what the impact of this thing is or how we effectively test this thing. And sometimes user stories aren't enough, right? Sometimes you have to really put yourself into the persona of the person you're acting as from a test perspective to really, truly understand what it is you're trying to accomplish. And so it is a skill that I think a lot of testers should do more of, but I think a lot of them are really good at it already, and it's really just recognizing that a lot of what they're trying to do is tell a story. And the best way to communicate things sometimes is through story.

[00:03:50] Joe Colantonio How do we learn to tell stories? Say I'm in the retro and I want to highlight all the things automation has done. How do I do that?

[00:03:56] Jon Robinson A lot of it is starting with, in a story, you have the opening, you have the setup, you have the core plot, and you have the resolution. And it's really taking a lot of those same elements and applying them to what you're trying to accomplish. So in the case of an automation project, for example, you're going to go in and say, we went into this trying to get to this place, and this is the approach we were trying to take. And along the way, this is what we tried. These are the things that went right. These are the things that went wrong. And just kind of giving some context to the answers you're bringing to the table. A lot of times it really boils down to just explaining context. So often we drop in and give the stats and the numbers, but the context behind them is often missing. And for people that don't know how to do it, the easiest way is to just tell a story around how they got there, like what was the reasoning behind what they did, and follow the hero arc, so to speak. And it works out better than you'd expect.

[00:05:05] Joe Colantonio Yeah, I think it's a valuable skill, especially now with AI. It's going to be the human aspect of being able to tell a story that's going to resonate with another human being, I think. For sure.

[00:05:14] Jon Robinson Yep, absolutely. Stories have been a part of human culture for as long as we've been around and have had written and spoken language. That's how we communicate things, and the people that do it effectively are the ones that tend to be the most successful when it comes to delivering on their projects, because it's not just the wrap-up. Part of it is telling the story to get people engaged and involved to begin with. Why do they want to get involved? Why are they interested in doing this thing? Selling your story up to management and leadership: why should they invest in this thing? That context is so important and we need to do more of it.

[00:05:55] Joe Colantonio All right. How do you battle a story against another story? A lot of times a business has been sold a story by a vendor about automation. This is what automation is. This is its purpose.

[00:06:05] Jon Robinson Yeah.

[00:06:05] Joe Colantonio And then you have to come back with another story. Well, all right, this is actually what automation testing is and its purpose. How do you tell that story? How do you define what automation testing is and its purpose?

[00:06:16] Jon Robinson The ones that are most effective are the ones that are grounded in reality and can be tied back to specific successes or failures, or point to details. Everybody on the planet has been sold a story or a bag of goods from a vendor that promised X, Y, or Z, and it was exactly that. It was a story. It was a made-up, fictional thing. But when you're trying to tell that story internally and you can wrap personal commentary around it, around when we did this, this is what we did, and this is the outcome that we saw, and give specific examples where you've seen success or you've seen failure, those tend to resonate more than the marketing speak, so to say. They connect better because people relate to things that people have done directly and can speak to with firsthand knowledge. When you ask me a question about something that I just told you a story about, I should be able to give you more detail about it. And if I can't, then, well, maybe there's not much to that story.

[00:07:23] Joe Colantonio Absolutely. Love it. Are there any misconceptions you've come across about what automation is, or how to think about automation?

[00:07:30] Jon Robinson Oh. Oh yeah, lots of those. I would think the biggest one is the misconception that automation is this magic bullet and it's going to solve a lot of things overnight. That was one of the biggest challenges when I was at Provar, especially because we were selling an automation tool. One of the biggest roadblocks and barriers we had to get over was this idea that you'd buy the tool and you'd be effective immediately, and that's not really reality. No tool can do that, right? It doesn't matter what your solution is, there's always a ramp-up time to get to where you want to be. And part of that is establishing where we want to go. Where are we now? What are we trying to accomplish? Where do we want to get to? Then those tools can have a fighting chance of success. But the number of tools and solutions or automation initiatives that I have seen fail because there was absolutely no planning that went into them. There was a "we're going to automate this thing," and we got a bunch of either ex-developers or really smart guys with Selenium, and we're going to build this new framework and we're going to do amazing things with it. And six months later you have four scripts. Sure, you can do a lot more than that, but if you don't know where you're going, you're going to flounder a lot.

[00:08:59] Joe Colantonio I'm sorry. And that's why I loved your session at this year's Automation Guild. The session was, I think, 90 minutes, and part of it was a strategy for test automation. And you had some key components that you went over, like scope, risk analysis, test coverage, and techniques.

[00:09:14] Jon Robinson Yeah. We basically break it down into four chunks: discovery, design, develop, and deliver. We call it our 4D model, but you can call it whatever you want. There's really those four phases, where you do a lot of your information gathering up front. You understand the story of where you're at, and when you get there, you decide, okay, where do we want to go? What do we want to try and get out of this? What's our goal? Believe it or not, this is one of the things that even well-seasoned veterans in the QA and testing space don't really grasp, the concept of maturing along the way, because they're like, oh, we're doing automation now, and we're just going to continue doing automation. Well, there's a starting point, there's a continuum, and there's different things that you can do to improve along the way. And so we put together a maturity model that just kind of helps people understand where they're at, both from a general QA standpoint, but also in the test automation space. How are you moving along that continuum? Are you improving? And are you moving toward what your goals and objectives are as an organization? Because if you're not furthering those, why are you doing it?

[00:10:37] Joe Colantonio So I guess, what is that maturity model then? What do you mean, like, people aren't growing?

[00:10:43] Jon Robinson Let's walk through it. We broke it down into six phases. You're not doing any QA or testing at all; you've basically just started some development projects. Now you've done enough work that you need to do some testing. You don't know what that testing necessarily looks like, but you're doing some manual testing. It might be developers, it might be product owners, it might be Joe Blow down the road, who knows. You realize at a certain point that's not going to cut it, and so you have to do something more, and that's when you start to think about and look at automation. What is this thing called automation? How do we get value out of it? And this is the point where you get the first misunderstanding with automation, which is that we're going to start automation and that's going to allow us to get rid of all of our manual testers. Anybody that's done automation-based and manual testing knows very clearly, no, that's not a reality, because there are things that just don't make sense to automate. But you'll start there, and that's the first proof point that you'll have to get past. You'll start to split: now you're doing some automation, but it's probably still 80% manual for the most part. As you move forward, you might get that to more of a 50-50 split. Then what does it look like from there? What are we trying to accomplish now? Is it just so that we don't have to spend the time doing the testing? Is it so that other areas of the business can take advantage of the test automation that we have in place, so that the development processes are a lot simpler and go a lot quicker? How do we enable other people to contribute to this automation suite and those initiatives? And now you're starting to get into some of your CI/CD tooling, and how do you work with your DevOps teams and start to flesh that out more widely, until you get to the point where you're completely integrated in CI/CD: you check a new test in, it gets run when relevant, you've got it tagged according to when it should run based on which feature is being developed and which areas of the application are going to be impacted. All of those things are now taken into consideration. A lot of people get into that continuum and don't understand there are things further down the road that need to be considered. One of the things we talk about is your short-term versus long-term objectives. You go into it and you start building all of this work and doing all this automation, but you're not really thinking about the long-term goals for it, and then all of a sudden you realize, oh, well, we need to do this. Well, that means we have to start over, because we weren't considering that when we went into a lot of this effort. You might have chosen the tool, and you did your automation based on that tool, as opposed to choosing a tool for the automation you need. Well, that tool might not integrate with your DevOps tooling. Most tools are not openly compatible with other platforms. And so all the work you just did, whether it was a year or five years, you're going to have to start over because that wasn't a consideration for where you were going to go long term. Because the thing was, our company bought a tool, so you use that tool for your automation.
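To make the CI/CD end of that continuum concrete, here is a minimal sketch, assuming pytest; the marker names and the place_order() helper are illustrative placeholders, not anything from the episode. The idea is simply that tests carry tags so a pipeline can run only what is relevant to a given change.

```python
# tests/test_checkout.py -- a minimal sketch, assuming pytest.
# Tests are tagged by suite and feature area so a CI job can select only the
# relevant subset, e.g. `pytest -m "regression and checkout"`.
# Marker names and the place_order() helper are hypothetical.
import pytest


def place_order(customer_type: str) -> str:
    # Stand-in for a call into the application under test.
    return "CONFIRMED" if customer_type == "existing-customer" else "REJECTED"


@pytest.mark.regression
@pytest.mark.checkout
def test_existing_customer_can_complete_checkout():
    assert place_order("existing-customer") == "CONFIRMED"


@pytest.mark.smoke
def test_guest_order_is_rejected():
    assert place_order("guest") == "REJECTED"
```

Registering the markers in pytest.ini (under `markers =`) keeps pytest from warning about unknown marks, and the same `-m` expression can live in whichever CI job fires when that feature area changes.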

[00:13:57] Joe Colantonio That's a great, great point. I don't know, do you recommend people look at roadmaps? Because I worked for a health care company and they started off greenfield.

[00:14:05] Jon Robinson I'm sorry about that.

[00:14:07] Joe Colantonio Yeah, it was brutal. They got 90 more. But we start with the greenfield application. It was a web app. And then a year down the road like, oh, now we need integrate with the thick client application. So a doctor could dictate notes to this app. And you're like, gosh, this framework was built around totally web, totally threw us off. So it was on the design phase we messed up?

[00:14:26] Jon Robinson Yeah. No, I mean, I think one of the things that I highlight in my process is, as you go through it, A, yes, look at your roadmap, look at the things that are known. What are the things coming down the pipe? Understand, okay, this is where we ultimately want to get to, but we have to do some stuff right now. What are our options for the short term? Do we preclude any of that stuff? But then as you're going through it, like you mentioned, you got midway into this process and you realized, oh, they're getting ready to introduce this new thing that we're not capable of integrating with right now, like we can't support that. So then you have another question that you need to ask. Is that something that's required, do we need to do that? If we need to do that, what is the impact of it? And can we pivot the stuff we've already done, or is this a new initiative? And reevaluating as you're going, as new information becomes available, always do a reevaluation, because one of the things that you should always consider is that sometimes automation is not an effective use of your resources. It is not always the right answer. Sometimes it is more overhead, because with that thick client, since you've already got this well-established platform, the work to either integrate them or do a new initiative might be more than the total time it takes you to test it, period. If you're not going to use that for a long stretch of time and for multiple purposes, it may not be worth doing. So then the question is, why are you considering automation for this? Is it just because you've got a well-established automation process, so we automate, and therefore any new thing introduced has to be automated? That isn't always the right answer. And I think that's a reevaluation that people need to do as they go through and as new things get introduced.

[00:16:23] Joe Colantonio That's a great point. And once again, there was a team that created this thick client application, just like you said. And we're like, do we even need to test it? It's already been tested. So we got to a point, after multiple stories going back and forth, where we realized, look, we just need to test up to the point where we send a signal and make sure it's received. They created a little API that would test it. We didn't have to interact with the thick client application because it was already there.
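That boundary-level check can be surprisingly small. Here is a minimal sketch of the idea, assuming a hypothetical verification endpoint like the one Joe's team exposed; the URL, payload, and status values are all placeholders. Send the signal the web app would normally emit, then confirm the other side reports receipt.

```python
# A minimal sketch of "test to the boundary": emit the signal, then verify
# receipt through a small verification API instead of driving the thick client.
# The service URL, payload fields, and states are hypothetical.
import time

import requests

BASE_URL = "https://dictation.example.internal"  # hypothetical service


def test_signal_is_sent_and_acknowledged():
    # Send the signal the web app would normally emit.
    resp = requests.post(
        f"{BASE_URL}/signals",
        json={"patient_id": "123", "note": "dictation test"},
        timeout=10,
    )
    assert resp.status_code == 202
    signal_id = resp.json()["id"]

    # Poll the verification API until the downstream side reports receipt.
    for _ in range(10):
        status = requests.get(f"{BASE_URL}/signals/{signal_id}", timeout=10).json()
        if status["state"] == "RECEIVED":
            return
        time.sleep(1)
    raise AssertionError("signal was never acknowledged as received")
```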

[00:16:44] Jon Robinson What you just described is 100% what I wish more people would do: look at the actual requirements for what needs to be tested, as opposed to "I have an application that I have to test through the UI," because a lot of times an API call from one end to the other is all you need. I work a lot in the Salesforce space, and the number of times people will try and test the Salesforce functions and capabilities, the things that Salesforce has built and tested as part of their workflows. Guys, you don't need to test how to create a new lead or a new opportunity. Salesforce has done that six ways from Sunday. There's absolutely no reason to do that. Use the API, make a call, create a new object, and then move on. That's all you do.
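For the Salesforce example, that "make a call, create a new object" step can be a single REST request rather than a UI flow. A minimal sketch, assuming you already have an instance URL and access token from your own auth flow; both values below are placeholders.

```python
# A minimal sketch of creating test data through the Salesforce REST API
# instead of driving the UI. Instance URL and access token are placeholders
# that would come from your own authentication setup.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # hypothetical
ACCESS_TOKEN = "00D...your_token..."                      # hypothetical


def create_lead(last_name: str, company: str) -> str:
    """Create a Lead via the REST API and return its record id."""
    resp = requests.post(
        f"{INSTANCE_URL}/services/data/v59.0/sobjects/Lead/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"LastName": last_name, "Company": company},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]


# Usage in a setup step: create the lead here, then point the UI test only
# at the feature you actually need to verify.
# lead_id = create_lead("Smith", "Acme Corp")
```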

[00:17:39] Joe Colantonio But how do you not get sucked into the "that's not how a real user would use our application" objection? It's like, oh my gosh, it's maddening. How do you handle that objection?

[00:17:48] Jon Robinson I think it's understanding what you're actually testing. Look at it through this lens: if you look at most test cases that are put together, and I challenge you to do this exercise on any test case in your suite, 60% on average of that test case is not the actual test. It's just getting ready for the test. It's all the data prep that you have to do. It's navigating to the right screen. It's all of this stuff that has to happen to get there. So A, you're doing a lot of things that really aren't the actual testing. And if you're going to treat it as a real user would treat it, you have to start splitting your test cases into two different things. One, you have the functional testing, which is: this is how the thing was built, the developer said they were building this thing, and I'm testing to make sure that they built that thing. That is one set of test cases based on user stories, functional requirements, etc., and that totally makes sense. But the other side of that is how a user actually uses it, that regression suite that a lot of people just tend to migrate their functional tests into. Really, if you stop and think about what you're trying to accomplish, it should be a list of all of the functionality that users are supposed to be able to do within your system. It's a pretty static test suite. As new things get added, you add new functionality. As things get deprecated, you take them out. But those things should align with how the user uses the system. Sometimes those will align with your functional test cases, not often, but that test suite is static, and you should look at it through that lens. That's a different set of testing, though, than the functional tests that a lot of people say, oh, we do in-sprint automation. Really? Because how's that working out for you? Because I bet about 70-80% of that's maintenance; your actual testing is probably pretty low.
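One way to act on that 60% observation is to push the setup out of the test body entirely. A minimal sketch, assuming pytest and a hypothetical internal API for data prep; the endpoint and fields are placeholders. The fixture does the preparation, so the test itself covers only the behavior being verified.

```python
# A minimal sketch of keeping the setup (the ~60%) out of the test itself:
# data prep happens through an API in a fixture; the test body only exercises
# the behavior under verification. Endpoints and fields are hypothetical.
import pytest
import requests

API = "https://app.example.internal/api"  # hypothetical


@pytest.fixture
def existing_order():
    # Setup: create the data the scenario needs via the API, not the UI.
    resp = requests.post(f"{API}/orders", json={"sku": "ABC-1", "qty": 2}, timeout=10)
    resp.raise_for_status()
    return resp.json()["id"]


def test_user_can_cancel_an_order(existing_order):
    # The actual test: only the cancel behavior is being verified here.
    resp = requests.post(f"{API}/orders/{existing_order}/cancel", timeout=10)
    assert resp.status_code == 200

    order = requests.get(f"{API}/orders/{existing_order}", timeout=10).json()
    assert order["status"] == "CANCELLED"
```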

[00:19:43] Joe Colantonio What do you mean by that? So you don't recommend in-sprint automation, you recommend N plus one?

[00:19:50] Jon Robinson I like N plus one, or even later for automation purposes, depending on what you're trying to accomplish.

[00:19:56] Joe Colantonio Okay.

[00:19:56] Jon Robinson Right. So if you've got an application or system that lends itself very nicely, and you've got a strong development team that can hand off things to you that are not going to change, then great, in-sprint automation is a noble goal. I think it's where we should desire to get to. The realities of it, though, are: think of how many projects you've been on where you come in and in sprint one, you develop a feature, and in sprint two, you realize they left out a bunch of pieces of the user story or the functional requirements. So now they have to make adjustments and changes to that, and now your test data changes. And then you come back into it, and in sprint three they're like, you know what, that actually doesn't work the way we thought it was going to work, we need to make a change. So over the course of 4 or 5 sprints, every single time, that functionality changes. How much actual net-new automation are you doing versus just maintenance in that scenario? And that happens on so many user stories and so many pieces of functionality that, in isolation, you don't realize it's happening. But step back and look at how much time you're actually spending doing new things and creating new tests as part of that in-sprint automation versus redoing work because those things weren't done. It happens all the time.

[00:21:11] Joe Colantonio Yeah, absolutely. So it's hard to balance between the optimal and where you're at. How do you avoid tech debt in that case? That's what people will come back with: well, now test debt will build up, and once it's time to automate, they're going to be like, forget about it, it's too much. They're not going to focus on testing that debt. It's going to get pushed down and down and down until you're kind of in a jam.

[00:21:33] Jon Robinson This is where I like to focus. If you're going to start on automation, focus on regression and focus on building test suites that test the functionality from the end user's perspective, as if it is all said and done. That means, often, unfortunately, waiting until the end to build some of that automation. But, and here's another fallacy or misunderstanding, about automation's value to a development project: its value is not to the version that you're pushing out right now. That is not your value. The value of automation in regression, in the long run, is the next version, so that it cuts down on the time you have to spend on the next go-around doing all the work that you did this time. The work and effort you're doing now, sure, it might save you a little bit during this release or this particular version, but the reality is its real value is the next one. And that's where a lot of people misunderstand where they're getting the actual value from. I would rather push the automation to later in the process where things are more stable, with less maintenance cost and less rework cost, so that when I'm running it the next time, I know I'm actually running a valuable test, which allows me to skip the tech debt part, because I'm not worried about testing the individual changes that they're making. I'm worried about testing the functionality that is supposed to be there and in place. So when you make a change the next time, does it still work? Did you break anything?

[00:23:13] Joe Colantonio The benefit, more so, is that it's a safety net that developers have confidence in, knowing: if I push code, I know I have a strong test set that will give me some sort of confidence, okay.

[00:23:23] Jon Robinson Yeah. Because you're right, tech debt in and of itself, nobody goes back and tests that. It's stuff that developers do on their own, off to the side, when they have the bandwidth to do it, and it's not something that normally gets justified. The best way to test it is with your regression tests. But the regression tests are only as good as the regression tests you have previously built, which doesn't help if you don't already have them.

[00:23:48] Joe Colantonio Absolutely. How do we measure how we're doing? Like, how do we have a pulse on it? Do you have, I know it's a buzzword, ROI, but everyone wants to tell a story about ROI, or what they saved, or what they did for the company. How does that work?

[00:24:05] Jon Robinson One thing I would say don't do is try and automate to a metric or a number, because no matter how noble your intentions are, that kind of approach is always subject to being gamed. You need to make things look a little bit better, so it's easier to bump up the number of something so that the percentages look better, and you don't even necessarily realize you're doing it. It's just the nature of how things are built and done, because you're not moving toward an overall goal of how do we improve the overall quality. This is one of my biggest challenges with QA in general: how do you measure quality? How do you actually give something so that you can say, we are improving? One thing that I've done in the past is I've taken the approach of, forget it, I don't care about bugs during the development process. I don't care how many bugs you come up with. In fact, we went so far with the development-type bugs, the things that come up during the development process all the way up and through staging, that if something came up through that point in time, I just called it an issue that the developer, the development team, and the QA team fixed together, and we called it a day. What I cared about, and the only number that really mattered, is how many things escaped into production. If it's out in the wild, that's a bug. And that allowed us to keep track of whether our ability to minimize the number of things getting into production was improving. I don't care whether the volume is going up or down before production. What I cared about was once we got to production, because you're never going to catch every bug, right? Anybody that says we're going to have a perfect release, and you should have caught that bug because you were responsible, that's a pipe dream. You're never going to do that. But are we catching more or less of what gets out? That's the important thing. How severe were the issues that were getting out? What was the impact on the business? Those were the kinds of metrics I cared about, because those told me whether the things that we're doing before we get to production are working. If I'm finding a bunch of stuff before, but I'm still getting a bunch of P1-level bugs reaching production, then I'm not really improving all that much before I get there. But if I find 100 different issues in the development process and I get one that escapes versus ten last time, that's a 90% improvement. I'll take that.
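The escape-focused arithmetic Jon describes is simple enough to codify. A tiny sketch; the function name and inputs are illustrative, not from the episode.

```python
# A tiny sketch of the escape-focused metric described above. Only defects
# that reach production count; the improvement compares escapes release over
# release (10 last time, 1 this time -> 90% improvement).
def escape_improvement(previous_escapes: int, current_escapes: int) -> float:
    """Percent reduction in production escapes versus the prior release."""
    if previous_escapes == 0:
        return 0.0  # nothing to improve against
    return (previous_escapes - current_escapes) / previous_escapes * 100


print(escape_improvement(previous_escapes=10, current_escapes=1))  # 90.0
```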

[00:26:54] Joe Colantonio Absolutely love that. Besides being able to measure and know how you're doing better, I find that sometimes actually managing the whole testing process can be kind of daunting, especially with a lot of open source tools. It's almost like back in the day when I had a vendor tool, I had an end-to-end test management suite. It was able to keep track of all the requirements. It was all seamless. Any thoughts on test management nowadays in that kind of environment?

[00:27:20] Jon Robinson I wish this was a better-developed space, I guess is the best way to put it. TestRail did a good job on it and made some good strides, then kind of stagnated since being taken over. I think those great guys went and did TestMo and built a pretty good platform there for test management. But to your point about multiple open source tools and all these different things, one of the things that I found to be a successful way of looking at it is to treat the different tools that you use as their own development projects in and of themselves, and give responsibility to the teams to build and own that platform, because at the end of the day, whether you're doing manual or automation, it's still a creation-type project. It's still a development project. A lot of automation is very code-heavy, a lot of manual is not, but the basic principles are the same. Treat it as its own project and its own thing. And so you manage and track those things and have owners for each of them, the same as you would a development project, because we tend to just bring in a bunch of tools and say, here, use this, and because of the high turnover, there's no ownership of them. Nobody is really making sure that the test cases are up to date. Where are all the test results stored? How do we run the right set of tests at the right time? What is automated? What's not automated? What tools are we using to do performance testing versus security testing versus functional, regression, smoke, whatever? Those things should be defined and have owners, so when you do have change, you can manage it.

[00:29:04] Joe Colantonio Love it. As I mentioned, Jon did a 90-minute session at this year's Automation Guild, and a lot of people don't know you can still get access to all the recordings after the fact, instant access. So if this sounds like something you want to take a deeper dive on, definitely head on over to automationguild.com, get a ticket, and you can actually get Jon's video. It was very interactive and had a lot of questions from the folks there. And you get access to the 24/7 community, and Jon's in there, so you can ask him questions directly in there as well. Okay, Jon, before we go, any parting words of wisdom or one piece of actionable advice you can give to someone to help them with their automation testing efforts, and the best way to find and contact you?

[00:29:40] Jon Robinson I would say the best piece of advice is don't just automate for the sake of automating. Go into it with a plan. Take the time upfront to really think about what you're trying to accomplish. A little bit of time and energy spent trying to establish a plan is way more effective than just blindly jumping in. You'll save yourself a lot of heartache. But beyond that, you can find me on the Heartbeat community. You can find me on whatever they're calling what used to be called Twitter these days, @jumpmancol. Find me on LinkedIn under Jon Robinson, Nekst IT. And yeah, I'd love to chat about this stuff. So if you are interested, just holler and we'll talk for days on end, I promise.

[00:30:23] Thanks again for your automation awesomeness. For links to everything of value we covered in this episode, head on over to testguild.com/a494. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:30:59] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at Testguild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
A person is speaking into a microphone on the "TestGuild News Show" with topics including weekly DevOps, automation, performance, and security testing. "Breaking News" is highlighted at the bottom.

SimpleQA, Playwright in DevOps, Testing too big? TGNS140

Posted on 11/04/2024

About This Episode: Are your tests too big? How can you use AI-powered ...

Mudit Singh TestGuild Automation Feature

AI as Your Testing Assistant with Mudit Singh

Posted on 11/03/2024

About This Episode: In this episode, we explore the future of automation, where ...

Eli Farhood TestGuild DevOps Toolchain

The Emerging Threats of AI with Eli Farhood

Posted on 10/30/2024

About this DevOps Toolchain Episode: Today, you're in for a treat with Eli ...