Creating Processes to Enable Quality Automation with Ben Oconis

By Test Guild

About This Episode:

Welcome to the TestGuild Automation Podcast! In this episode titled “Ben Creating Processes to Enable Quality Automation,” our host Joe Colantonio sits down with Ben Oconis, the head of QA at Storyblocks. Join us as Ben reflects on his journey in developing tools for observability and shares valuable insights into promoting quality throughout the organization. Ben sheds light on his challenges, from experiencing both an excess and a lack of tools at different times to prioritizing data and observability to understand and address issues. Throughout the episode, Ben emphasizes the importance of proactive monitoring, retroactive analysis, and having conversations with customers to understand their behavior and make necessary adjustments. He shares how his direct relationship with the research team ensures that implemented features are based on customer wants and needs.
Additionally, Ben delves into the significance of collaboration, effective testing strategies, and using the right tools for the job. This episode is a must-listen if you want to optimize your automation processes and enhance overall product quality. So tune in as Joe and Ben dive deep into the world of QA and share invaluable insights to help you succeed in automation. Let's get started!

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step towards transforming your and our community's future. Check out our done-for-you awareness and lead-generation demand packages, and let's explore the awesome possibilities together.

About Ben Oconis


Ben Oconis is the head of QA at Storyblocks, a stock subscription content provider. He created the QA team five years ago after moving over from the customer support team. He focuses on optimizing inclusive processes that promote quality throughout the organization.

Connect with Ben Oconis

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:25] Hey, it's Joe, and welcome to another episode of the Test Guild Automation Podcast. Today we'll be talking with Ben all about creating processes to enable quality automation. I'm really excited about this show because I actually met Ben face-to-face at QA or the Highway, and as we were talking I thought he'd make a great guest for the show. He took me up on it, so I'm really excited about it. If you don't know, Ben is the head of QA at Storyblocks, a stock subscription content provider, and I actually use Storyblocks myself. That's cool. He has a lot of experience in that area as well, and he created the QA team five years ago after moving from the customer support team. So I'm a bit curious to know how that transition went. He focuses on optimizing inclusive processes to promote quality throughout the organization. I think a lot of people, when they think of automation, just focus on tools and forget about the quality. Ben's here to bring back the quality by creating processes to enable quality automation. You don't want to miss this episode. Check it out.

[00:01:19] This episode of the TestGuild Automation Podcast is sponsored by the Test Guild. Test Guild offers amazing partnership plans that cater to your brand awareness, lead generation, and thought leadership goals to get your products and services in front of your ideal target audience. Our satisfied clients rave about the results they've seen from partnering with us, from boosted event attendance to impressive ROI. Visit our website and let's talk about how Test Guild could take your brand to the next level. Head on over to TestGuild.info and let's talk.

[00:01:54] Joe Colantonio Hey, Ben. Welcome to the Guild.

[00:01:58] Ben Oconis Thanks. I'm excited to be here.

[00:02:00] Joe Colantonio Great to have you. I love real-world experience, especially someone who's built a team. So I love to dive in. Before we do, though, is there anything I missed in the bio that you want the Guild to know more about? I usually botch people's intros.

[00:02:11] Ben Oconis Yeah, I think that's a good general overall summary of where I am and kind of a quick look at my experience, for sure.

[00:02:19] Joe Colantonio Okay, awesome. I mentioned you started off in customer support. How did you make that transition?

[00:02:25] Ben Oconis So in our organization, we were a relatively small organization at the time, kind of a startup. And at that time, we had an external team that we were sending QA projects out to when they needed to be done. As part of that, our engineering team was going over to them and saying, here are the requirements we think need to be met, please check it out. But we kept finding that all of these bugs kept coming out. I was leading the customer support team, and so I would hear from customers who would reach out and say, Hey, this is broken, or this didn't load properly, or I'm seeing an issue on your site. Effectively, I was the person who kept reaching out to the engineers and saying, Something's not happening here. We're missing this. The quality of the product the customers are seeing when you deploy these features isn't what we expect. What's going on? And so, as we were expanding our engineering team, the organization said, Hey, we have this opportunity. We want to bring our quality work in-house versus having an external company we hire. Would you like to join? And at the time, I had no experience with anything related to software testing or QA. But I was like, This seems like a really cool idea. I'm good at finding these bugs. Maybe I can help them find the issues. And so it became an open role. I started out as a team of one, and literally, they just sat me down and said, okay, go. So I started by looking at a bunch of different resources. Some of the ones I started out with were things like Test Guild and the Ministry of Testing. I was looking for resources to say, What should you do? How should I do it? And what processes should I have? And it kind of developed from there.

[00:04:02] Joe Colantonio So I guess where did you start then? I mean, you had to figure it out, you just jumped into it, and you had all these options. Was there one area where you said, okay, let me at least start here as the building block and build off that?

[00:04:14] Ben Oconis Yes. So at the time, as I said, we were receiving a lot of issues and bugs. And what I found was we didn't have a process in place for reporting bugs, for managing bugs, or a way to easily follow up on them. I started out by first identifying and defining what types of bugs we had. What are the common bugs we have? What are the issues? And I assigned priorities and SLAs to each of those. I met with all of the individual team leaders and sat down and said, okay, look, these are the types of bugs we see coming in. Here are our internal and external stakeholders. And here is how I believe, based on what I'm seeing and what I've heard from customers, we should map out the criticals, the highs, the mediums, and so on. And then I started tracking those and asking, are we seeing a reduction in bugs by introducing some form of tracking of issues as we began to do the manual QA and give some insight into the requirements? Previously, a lot of the requirements were fairly blank tickets: do this action. And when I pulled that up, it was basically like, Well, what does that mean? What do you want us to do here? And so my push for a lot of the teams was, we need more detail, we need acceptance criteria, we need user stories. We need to have some form of information so that when an engineer goes to create this feature, they're able to look at that and say, okay, I've met the definition of done, I've met the criteria, and then when I get it as a tester, I'm able to say, okay, you've added the correct automation here, the manual testing looks good, everything looks okay. Now we can ship it to the customer.
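To picture the kind of priority-and-SLA mapping Ben describes, here is a minimal TypeScript sketch. The severity names, SLA hours, and classification rules are illustrative assumptions for this example, not Storyblocks' actual values.

```typescript
// Hypothetical sketch only: severities, SLA targets, and classification rules
// are assumptions for illustration, not any team's real policy.
type Severity = "critical" | "high" | "medium" | "low";

interface TriagePolicy {
  slaHours: number; // time allowed before a fix (or a decision) is due
  notify: string[]; // stakeholders to loop in when a bug lands here
}

const policies: Record<Severity, TriagePolicy> = {
  critical: { slaHours: 4, notify: ["on-call-engineer", "product-lead"] },
  high: { slaHours: 24, notify: ["team-pm"] },
  medium: { slaHours: 72, notify: ["team-pm"] },
  low: { slaHours: 336, notify: [] },
};

// Map a reported issue onto a severity so the SLA clock can start.
function classify(affectsCriticalFlow: boolean, customerReports: number): Severity {
  if (affectsCriticalFlow) return "critical";
  if (customerReports > 10) return "high";
  if (customerReports > 0) return "medium";
  return "low";
}

const severity = classify(false, 3);
console.log(severity, policies[severity].slaHours); // "medium" 72
```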

[00:05:47] Joe Colantonio Love it. What I really like is, once again, you started off in customer support, and that's what I think a lot of people miss out on. They forget about the customer, and that's where it all starts. I love how you were actually influenced by what actually happened to your customers in production, and you can do a 1 to 1 map and probably say, we're not just a cost center, we've increased customer satisfaction by 10% or something. How do you maintain that, though? It's been five years since you've been in customer support. How do you still get that customer feedback to re-educate what's going on with your quality process?

[00:06:20] Ben Oconis Great question. So since that time, we've added a research team, and I have a really direct relationship with them. A lot of times when we have features, we're making sure that the features are implementing the things that the customers want and need. And from a quality perspective, we're making sure that we're discussing it. Oftentimes in the research, we find that customers will bring up wants, desires, and needs, which often translate for me into some form of an issue or something they experience on our site that hints at a need for something they want. So they might be going in looking for a piece of stock content, and in the search they say, I don't see this functionality here. And to me, quality is beyond the scope of just the testing. It's also the product itself. Are we meeting the needs and the whys of the customer when they're coming to our product? So keeping in touch with my research team is super helpful. They keep me in the loop on the things they hear. I also spend a few hours every other week sitting with the customer support team and talking to customers. That, I think, has been the number one thing to keep me in the loop. I think often testers, especially if they're siloed in their team, sometimes never talk to the people they're delivering the product for. And when I sit down with the customers, I can get an idea of what things are actually bugging customers. Because I found when we started reporting on bugs, we'd often have these bugs that were reported internally. Some engineer or product person goes to use the product, pulls something up, and says, Oh, this is broken. But customers never reported that issue. Customers didn't care about that specific thing. But the thing on the page that they're using most frequently, the tool they're using, for us it's things like adding items to folders or the assignment flow. When that's broken, when that's not clear, when the design isn't laid out properly, that's when I think it has a huge impact on the customers. And it helps educate, well, what should we have testing for? What should we be looking for? What is the high-risk stuff that actually brings business value and thus allows us to have high quality?

[00:08:15] Joe Colantonio I love that approach. I used to work for a healthcare company, and we used to make radiology software. They used to have almost like field trips to hospitals to actually watch radiologists using our application, and you gain so much from doing that. So that's really a great tip that I think a lot of people miss out on. How do you get that buy-in, though? It seems like you almost have a culture that accepts that this is the way. It's not just a checkmark on the definition of done; it's, is this actually going to help the customer? And then you're able to measure whether it actually helped the customer. How did you go from just a one-person show to pushing that throughout the organization?

[00:08:49] Ben Oconis I think you have to be okay with allowing people to ask the why. Why are we doing this? What is the purpose of this? And within our organization, we're fortunate in that if I go to my product person and say, what's up with this? Does this really meet the needs of the customer? Is this something we want to do? They're willing to accept that. I think it has a cultural aspect, and it did take time to build. When I started, I had no experience, and sometimes I'd be like, I don't think this makes sense, and there was pushback. I'd be like, okay, what's going on? But I was able to use data and kind of prove the case out. Now, I use the research team to say, okay, well, I don't think this is necessarily something we should be doing. I don't think this is the best practice. Here's some information showing that's the case. It's more data-driven in that sense, and that allows us to look at it and say, does this make sense? And as such, it kind of builds over time. When one team starts to do it, we show that it was successful, that it had some positive impact on the organization. And then other teams start to say, Oh, what's going on there? I see there's higher quality, you have more automation, something is happening on the practice side. We explain what it is, and it kind of takes an A/B test philosophy that expands to the entire organization.

[00:10:02] Joe Colantonio Nice. So besides listening to customers, are there any tools or techniques you use in production? Do you look at Google Analytics or heat maps or anything like that to help you get insight before people actually complain?

[00:10:14] Ben Oconis Yeah, so it's been a journey. Starting out, we didn't have anything. Over time, we've developed tools, and there have been ups and downs. There have been times when we've had too many tools; there was a time when we had about four observability tools. All of them said something slightly different, but none of them did the thing we actually wanted. And there have been times when we didn't have enough tools and issues occurred and we were like, What's happening? So for us, I really try to lean on the data and the observability. We look at things like events, so we can have anomaly detection in place and say, is something happening that isn't expected? We also try to see if something happens on the site, like if we get an alert and we have an incident. Sometimes the cause is, for us, we have a product on the site, but we also have an API product. We've had issues, for example, where a customer hits our site really hard. They download a bunch of content, they search for a bunch of content, or something like that, and it causes latency on the actual product that our non-API customers are using. As part of that, I think you have to take a step back and retroactively say, if I wasn't aware of this, if I'm hearing about this first from the customer or seeing it causing issues on the website, why was I not aware? And as part of that, we try to retroactively add some form of monitoring or observability in place to say, this is a specific case that we saw, but we may be seeing it elsewhere. Why don't we add something in place to make sure that we get a report and an alert when we see a spike? It also allows us to have conversations with those customers about, well, what are they doing? Should they be doing this? What's happening? And so we can monitor the behavior and go from there.
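As a rough illustration of the spike alerting Ben describes, here is a minimal TypeScript sketch of threshold-based anomaly detection on per-minute event counts. The window, z-score threshold, and sample numbers are assumptions for illustration; a real setup would normally live in an observability platform rather than hand-rolled code.

```typescript
// Minimal sketch of spike detection on event counts (e.g., API requests per
// minute). Window size and z-score threshold are illustrative assumptions.
function isSpike(history: number[], latest: number, zThreshold = 3): boolean {
  if (history.length < 2) return false; // not enough data to judge
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((sum, x) => sum + (x - mean) ** 2, 0) / history.length;
  const stdDev = Math.sqrt(variance);
  if (stdDev === 0) return latest > mean; // flat history: any increase stands out
  return (latest - mean) / stdDev > zThreshold;
}

// Example: steady traffic of ~100 requests/min, then a burst of 450.
const recentMinutes = [98, 102, 101, 97, 105, 99, 100, 103];
if (isSpike(recentMinutes, 450)) {
  console.log("Alert: request volume spike detected, investigate API usage");
}
```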

[00:11:47] Joe Colantonio So once you find the issue then, based on the customer feedback, how do you fit it back into your development system? Does it get opened up as a bug, or do you start a sprint and say, here are some common issues we probably want to get better at? What does that look like?

[00:12:01] Ben Oconis So right now my team is doing a lot of triage for these bugs that come in. Previously, it was me. I was getting bugs as they came in and sending them right to the PMs and the teams. But oftentimes the details weren't there; it wasn't sufficient. And so we found we had to have a layer in between. When the bugs come in, my team will do some degree of triage and we determine, is this actually a problem that is going to be worth addressing? Is the cost to fix the issue worth the effort we have to put into it? From a business perspective, it's expensive to fix bugs. If you're throwing engineering effort and design effort at it, you have to factor that in. And then from the perspective of risk, is this something that is happening to a single customer? Is it something that's going to happen to a bunch more customers? And so we triage those bugs, we put them into the team's backlog, and then the product manager for that given team takes them and puts them into the next sprint. We generally try to address as many bugs as we can in a given sprint. That has been a bit of a struggle. For example, when I started, I was reporting bugs and just putting them there, and there was a backlog of hundreds of bugs that we'll never get to. Now we try to have more of a mentality of, if this is truly an issue that we determine is worth investigating and putting the effort toward, it's much better to do that while it's fresh, while you have the customer data and the information, rather than sit on it and then either never get to it or end up with something that, when you go to address it, you lack the appropriate information for.

[00:13:34] Joe Colantonio Love it. I'm just curious to know, when people start a quality process or a quality organization, I've seen some people assume that that team is responsible for quality, they're responsible for testing, and we can just focus on development. Who does the testing in your organization now? Once you identify the bug and put it on the sprint, is there a tester on the sprint team, or do developers know that they have to test this feature before it's released or put into the code?

[00:14:01] Ben Oconis Right now my team is myself and two other testers, and we have those testers embedded on the major product teams. We have three major product teams. I'm on one, and then the other testers are on the other two. When we started out, we found that often I, and then my team, was the group doing a lot of the testing, in that we were helping write the tests, we were doing the manual testing, we were making sure everything was in place and checking all the boxes. So we were kind of a bottleneck in that regard. Over time, what I've done is, as I've gotten requests, I've said to the engineers, I'd like you to show me what you tested. And what's interesting is it started as a conversation of, What do you mean? Okay, I did this and this. Okay, but did you do this as well? I then pair with them and I walk through: this is what I do, this is how I go about making sure that we check the quality boxes. What's been really interesting is that over time, I found that when they approach me and say, Hey, I need something to be tested, or I need some testing to be done here, whether manual or automated, they've already done a lot more. They've tested the majority of things, and so often my testing is more exploratory. I can go ahead and explore the product and the edge cases that they don't necessarily think of because of my product knowledge. And then I've also done a lot more group QA. I lean heavily into ensemble testing and pairing. A lot of the times when we have a feature we need to test, we have a group session. If it's a bug that is really impactful, we'll throw together a quick meeting, we'll get a PM, we'll get an engineer, we'll get any additional stakeholders where it may be relevant, and we'll just make sure we check the boxes and that everything is good before it goes out. It's kind of a mixture of others in the organization testing, with me being the one to help move that process along, so that in the end I can do more of the types of testing that we don't have in place now.

[00:15:48] Joe Colantonio Nice. A lot of times people say, Oh, developers can't test, but I love the idea that with pair programming and pair testing, over time you're educating the team to fish for themselves. I love that approach. Any tips for pair programming or pair testing? Do you have a script, or is there something you've seen work better than others?

[00:16:08] Ben Oconis Yeah, I think the two big things I would say are, one, ask what their thinking is when they're writing the code they're writing and how they're developing it. Sometimes I think if you don't express the why, an engineer could be developing something in a way that meets the requirements but doesn't necessarily make sense. Maybe it's a really roundabout way of doing it, or maybe it's something where you look at it and you're like, okay, this doesn't function like I think it should. And so I think asking why gets into the thought of, this is why I'm developing it in this way. And sometimes you find things that help move that along and help you learn and understand, which you can then approach and say, okay, I would do this differently; this is how I was thinking this feature should act. What's cool about that is I think sometimes it highlights that there's a gap, that maybe you don't have all the requirements you need. Maybe there's something that isn't stated or that you need in addition to what you currently have for that particular feature. And therefore you can highlight it: Hey, if we want to do this, we should be adding X, Y, and Z. And then, two, not being afraid to say, I don't know. I think there are a lot of cases where something comes up. I have examples myself. I'll be working with an engineer on a feature. They'll ask how something is supposed to work, or they'll ask, from an automation perspective, how I would handle it. And I'm not sure. Especially with new tools that you might have, or new engineers who come in from a different organization. They might have experience where they do things differently. Granted, we try to have a general process that is similar across all teams, but they may bring their own blend to it. And if I'm not sure, instead of trying to work it out and, I guess, pretending that I know what I'm doing, I like to just stop there and say, I'm not sure. Let's figure it out together. And so carrying on from that and figuring out, is the solution what we think it is? How should it behave? Let's go check it out and ask the PM, or let's go look at the product ourselves and say, What's the current behavior? What should this be when it's implemented? Or let's talk to customers, let's talk to the research team, and say, Hey, we don't know how this should work. Do you have any input? Was there something that a customer said before that might give us some feedback on this?

[00:18:04] Joe Colantonio That's awesome advice. You mentioned automation. I know automation isn't necessarily your area of expertise, but how do you work to support maybe the teams in this area?

[00:18:13] Ben Oconis I think the biggest thing I've found is having some form of a strategy in place. I think having good documentation that gives insight into what types of tests your organization uses, with good examples. What should a good unit test look like? When should they be applied? What is the difference between using a unit test and an end-to-end test, for example? I found that often engineers just aren't sure. We have all of our engineers writing their own automated tests, but we were finding a gap. When I started, it was like, I'm writing a test, but I'm not sure at which level I should do it or what type of test, so I'm going to take my stab at it. And sometimes that can result in a test that doesn't actually do what you intended it to, or frankly, it's not testing anything at all. We've had tests where it's just, look at this page, and it's like, well, that can be done at a different level. I think pairing with engineers and reviewing the requirements we have has been really beneficial. If you have some degree of understanding of the acceptance criteria for the ticket, if you have a session where you're looking at the ticket and figuring out what's supposed to be included, and you review it as a group, you have a clear requirement of what's expected. As part of that, we have a conversation around, what is the critical part of this product where, if this broke and I went over to the PM and said, Hey, this is broken, they would say we should fix that right away? Having that conversation is really beneficial because it allows us to sit down and say, Here's what needs to be tested, here's what needs to be done from an automation perspective. And then, having tools in place that engineers find easy to use. For example, when I started, we were using Selenium for all of our end-to-end testing. Selenium is a great product, but because of the work it took to create the tests and the knowledge it takes to use it to some degree, a lot of our developers were taking a very, very long time to write tests, and so there would be cases where they just didn't write a test because it was too hard. And so about a year or two ago, we moved to Cypress because we found that the developers were able to write the tests a lot more efficiently and had a better understanding of them. And while it's not necessarily the tool I would love to use myself, I found it was a tool that was efficient in getting the job done for the audience that was using it, for the developers. So I think thinking about what you're using and the audience who's using it can sometimes help you realize, okay, I might have a preference for this product, but if I'm not necessarily the one in the weeds day to day getting in there and writing all the automation, it might be better to find a product that works for the group that's going to be using it.
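For readers unfamiliar with why developers often find Cypress quick to pick up, here is a minimal sketch of the kind of critical-path end-to-end test discussed above, written in TypeScript. The route and data-testid selectors are hypothetical and only for illustration; they are not Storyblocks' actual markup.

```typescript
// Hypothetical Cypress end-to-end sketch of a "critical path" check: search
// returns results and a user can open an item. Selectors and URLs are assumed.
describe("stock content search", () => {
  it("returns results and lets a user open an item", () => {
    cy.visit("/video/search");
    cy.get("[data-testid=search-input]").type("ocean sunset{enter}");
    cy.get("[data-testid=search-result]").should("have.length.greaterThan", 0);
    cy.get("[data-testid=search-result]").first().click();
    cy.url().should("include", "/video/");
  });
});
```

The appeal for developers is that the whole flow reads as a short chain of commands with built-in waiting and assertions, so there is less boilerplate to learn before a first useful test exists.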

[00:20:54] Joe Colantonio That's Guild Gold right there. That one tip from the show could save a lot of people a lot of headaches. I love that. How do you keep learning, then? I met you at a conference. How do you keep up with the latest trends and developments in quality and process management? Are there any specific things that help you stay up to date on what's happening?

[00:21:11] Ben Oconis Yeah, I try to follow a lot of people on LinkedIn and Twitter and the like. I like to see what they're talking about, what they're putting talks out about. But I also look at Slack forums and things of that nature to see what feedback people give when an article comes up and they comment. I think there are a lot of people at conferences, but sometimes those are just the people who are willing to go out and talk. Not everyone's like that; there are a lot of introverts I've met in these situations who have great ideas. And so I think just opening the channels and seeing the people who are giving feedback on things. They might not always be the loudest voices or the people presenting at those events, but they often have great ideas. I try to talk to fellow testers. For me, the Ministry of Testing was quite possibly the biggest jump. When I started testing, as I said, I had no idea what to do. I literally googled, how do I do software testing? I knew how to find bugs, I knew how to do manual testing, but that was about it. And I found the community there really expanded my knowledge. And that set me up with podcasts like Test Guild. I was like, Oh, cool, okay, there's this really interesting thing I want to listen to. I try to take the ideas I find from that and share them with my community in my organization and get their feedback. I'm also a big fan of things that may not necessarily be the latest trend. I find a lot of people who come into my organization have worked at organizations of different sizes and scopes and have used a variety of tools. So one thing we like to do is sit down with our new hires. You're an engineer who just started; we have a quick half-hour meeting. We introduce our ideas, our principles, and what we do on quality, how we go about handling quality assurance for our organization. But we also ask the question, Tell me what you did. How did you make sure that you checked all the quality boxes in your previous organization? Walk me through your feature development flow. What's cool about that is sometimes I'll find a tool I had no idea existed, or I'll find a technique where I'm like, Oh, okay, cool. For example, risk storming was something that I didn't know about right away. Someone mentioned it when I first met them, when they joined the organization, and I was like, Oh, this is really neat. Ensemble testing, same thing. Someone brought it up as a technique that they used in their org to expand their testing scope. When I investigated it, I was like, This is really cool, let's try it out.

[00:23:21] Joe Colantonio Onboarding. I love that. A lot of people probably miss out on that; I haven't heard of anyone that really does that. So that's a great piece of advice as well. So how do you come up with this? Have you just been experimenting? You just try things and then iterate? Because a lot of times I see people say, this is the process and this is what I do, and they never deviate from it.

[00:23:40] Ben Oconis Yeah, I think a lot of times in organizations we like to A/B test for the customers, but we don't like A/B testing for ourselves. I'm a big fan of just trying something out. What I like to do is approach a team who I think might be open to the idea. I like to take something that I think is an interesting tool that doesn't have too much of a lift or level of effort, and I like to say, I'm not going to ask you to commit to this. What I'd like to do is see if there's a problem we have. It could be anything from issues with automation to our velocity for deployment. And I'd like to try this tool out for a few sprints, and I want to see if it improves things and whether you have positive feedback. If you have positive feedback, maybe this is something that we should use across the organization. And so I think you just have to not get too married to one tool, not get stuck on one process, and be willing to try something out. I think it's also awareness of the problems that you have in your organization. If you're seeing that issues are occurring, take a step back and figure out what's happening. The way I put it is, I try to take the perspective of, if I came into this organization right now and someone said, You're the director of quality, what would I do? How would I do things differently? What tools would I introduce? What processes would I try out? If I have those, what's the blocker to me trying them out? In a lot of cases, teams are open to them because they feel the problem that that tool or that new process would solve. And so we try them out as an A/B test, we see if it works and whether it has improvements, and if it does, great, we find ways to implement that tool as much as we can.

[00:25:14] Joe Colantonio Okay, Ben, before we go, is there one piece of actionable advice you can give to someone to help with their quality testing efforts, and what's the best way to find or contact you?

[00:25:23] Ben Oconis Yeah. So I think the biggest thing I would say is to overcommunicate and communicate often. I think a lot of times testers get siloed, and they become the voice that doesn't really speak up very often, but they also then sit and say, Oh man, I'm feeling all the pain from this thing that's happening, this process or this tool or this thing that engineering did. I think you should speak up and communicate about what's happening, and this could be anything from ideas you have, to problems you're feeling on your team, to issues that you're seeing on the site. Sometimes others also feel those issues, and sometimes you might find a common bond with someone you didn't realize you had. I also like to say, if you are ever communicating, especially if you're pointing out a problem, try to come with a solution in mind. If you've ever been in those situations where someone comes to you and says, this thing is broken. Okay, what would you like me to do with it? I don't know, I'm just letting you know it's broken. But if someone comes to you and says, this thing is broken and I would like to try X, it has a lot more of an impact. And so I think if you're communicating about problems, you have to think about what solutions you might be able to use. And that doesn't necessarily have to be something that is fully fleshed out. It could just be an idea: I want to try this tool, I want to change the process a little bit. Minor changes go a long way. And then to contact me, LinkedIn or Twitter, or X, would be the best way. I think LinkedIn is the primary one.

[00:26:52] Thanks again for your automation awesomeness. For links to everything of value we covered in this episode, head on over to testguild.com/a460. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:27:26] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider, and you want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over to TestGuild.info and let's make it happen.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}