Degrees of Automation with Paul Grizzaffi

By Test Guild

About This Episode:

In this episode, Paul Grizzaffi, a senior principal automation architect at Nerdery, discusses his role and how he guides teams to effectively promote test automation and quality.

We'll explore the importance of aligning testing and automation with each client's unique business goals and how Paul identifies and addresses their pain points. Paul also shares valuable insights into avoiding common pitfalls, such as excessive tests and data dependency, and the significance of treating automation projects like software projects through continual and periodic evaluations. We touch on the current state of AI in testing, its potential applications, and the critical need for human oversight.

Discover how to help your teams make informed decisions about automation tools, evaluate opportunity costs, and stress the importance of quality over quantity in testing, all of which are directly relevant to your work in software development and automation.

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step towards transforming your and our community's future. Check out our done-for-you awareness and lead-generation service packages, and let's explore the awesome possibilities together.

About Paul Grizzaffi

Paul Grizzaffi

Paul Grizzaffi is a QE Solutions Architect at Nerdery, where he is following his passion for providing technology solutions to testing, QE, and QA organizations, including automation assessments and implementations, as well as activities benefiting the broader testing community. An accomplished keynote speaker and writer, Paul has spoken at both local and national conferences and meetings. He is a member of the Industry Advisory Board of the Advanced Research Center for Software Testing and Quality Assurance (STQA) at UT Dallas, where he is a frequent guest lecturer. When not spouting 80s metal lyrics or helping raise his twins, Paul enjoys sharing his experiences with, and learning from, other testing professionals; his mostly cogent thoughts can be read on his blog at https://responsibleautomation.wordpress.com/.

Connect with Paul Grizzaffi

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.


[00:00:00] In a land of testers, far and wide they journeyed. Seeking answers, seeking skills, seeking a better way. Through the hills they wandered, through treacherous terrain. But then they heard a tale, a podcast they had to obey. Oh, the Test Guild Automation Testing podcast. Guiding testers with automation awesomeness. From ancient realms to modern days, they lead the way. Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold. Oh, the Test Guild Automation Testing podcast. Guiding testers with automation awesomeness. From ancient realms to modern days, they lead the way. Oh, the Test Guild Automation Testing podcast. Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.

[00:00:34] Joe Colantonio What does degrees of automation mean, and what are the top three questions you need to ask to find out? That's what we'll be talking all about today with the one and only Paul Grizzaffi. If you don't know, Paul is a Senior Principal Automation Architect at Nerdery, where he is following his passion for providing technology solutions to testing, QE, and QA organizations, including automation assessments, implementations, and activities benefiting the broader testing community, like interviews like this. He's also an accomplished keynote speaker and writer, and has spoken at multiple local and national conferences and meetings, including Automation Guild. He also frequently shares his thoughts on his blog, responsibleautomation.wordpress.com. We'll have a link for that in the show notes down below. This episode is based around a blog post he recently wrote called Automation State: Answer Me These Questions, and we're going to go over a bunch of other things as well. Really excited to have Paul back on the show. You don't want to miss it. Check it out.

[00:01:34] Joe Colantonio Hey, Paul, welcome back to the Guild.

[00:01:38] Paul Grizzaffi Hey, I'm always happy to be here, man. Thank you.

[00:01:40] Joe Colantonio Awesome. I just checked, and officially it's been a little over a year since the last time you were on the show. So what have you been up to since then?

[00:01:47] Paul Grizzaffi Jobs. So I started a new job at Nerdery, and I am basically helping the sales team learn how to help sell test automation and quality in general, but also helping everyone else on the team work at that higher level. And I'm blessed to have several really high-functioning people on the team, so that's super cool. And I will also be working on delivery as soon as there are some additional clients for me to work on.

[00:02:18] Joe Colantonio Nice. All right, I'm already off topic, but what does that mean, you're helping sales? Is it helping with the messaging, where we add benefit but maybe the way they're talking about it doesn't necessarily jibe with what people are struggling with? What is that? How does that work?

[00:02:31] Paul Grizzaffi Well, you know what? It's really more about where the value for testing, and the extended value for automation, is, right? Automation is expensive, let's not lie. Automation is expensive, but there's a value proposition for most organizations to do some amount of automation. Different organizations will require different types of automation, and I'm helping those clients, and our salespeople who talk to those clients, see where the value proposition is for them, not just what you read off of a general thing on the web, even the general stuff that I write. Something specific for them, with a specific value proposition, to help them meet their business goals. That's what I'm helping to do.

[00:03:13] Joe Colantonio Nice. I think that leads in nicely to the main topic of this episode, and that is the degrees of automation. I assume, since you work with a lot of different clients and a lot of different organizations, you need to get a pulse on what's going on. In your recent blog post, you have three main questions that you use, so I thought we'd jump off with those to start. The first one in the blog post is: do I have the right automation? What does that mean, and how does that help determine where an organization needs more investment, or whether it has the right automation mix?

[00:03:45] Paul Grizzaffi Part of what I do is to go in and look at what they're doing and say, are you getting value from what you're doing here? And how is it directly, or perhaps indirectly, helping you meet one of your business goals? Because the way I look at it, if there's a business goal, pretty much every line of code that gets written in an app or a website or whatever should be helping meet one of those business goals. Testing needs to help that business goal, and automation is direct support for testing, helping it be more effective or more efficient. So, do you have the right automation? Are you doing the right thing for your value proposition? There's a second part to that, which is: what you're doing, or what you're considering doing, may be valuable, but if you did this other thing first, it would be more valuable, and then free up more effort, time, or money to do other things, whether that's other automation, other testing, or other development, from a total cost of ownership and an opportunity cost standpoint.

[00:04:50] Joe Colantonio Awesome. When you're talking about this question, is it more around risk, or is it about money, or areas of the application?

[00:04:56] Paul Grizzaffi It depends.

[00:04:58] Joe Colantonio How do you know where to focus, and how do you know whether you do have the right automation or not?

[00:05:01] Paul Grizzaffi These are the conversations we have to have with organizations, whether it's client organizations now that I'm a consultant, or, previously, the companies where I was a full-time employee. It's: what are we trying to do? How are we trying to get there? Are we trying to make more money, or are we trying to lose less money? What are our problems? If you want a faster time to market, I might suggest something different than if you have quality problems, where you have bugs in production. So that's the kind of conversation, and then I'd make recommendations based on that.

[00:05:34] Joe Colantonio How hard is that conversation to have? Do people get offended, or is it eye-opening? I guess it depends on the situation?

[00:05:44] Paul Grizzaffi So yeah, it's the age-old "it depends" answer, Joe. It really does depend. However, one of the things you do as a consultant when you come in, so you're not demonizing the client or saying they've done the wrong thing, is to partner with them and actually be their trusted business partner and say, all right, let's talk honestly about a few things here. What are you trying to get done? What are the problems you're having? And let's talk about how we might use testing to help facilitate that, and by extension, where mechanization, automation, can help you further that, again, for the effectiveness or the efficiency. And yes, sometimes that's the "your baby is ugly" conversation, but usually by the time they get to a consultant like myself, they know they have an issue; they just may not know what it is.

[00:06:37] Joe Colantonio How do you figure out what it is, though? Do you actually look at the code base after you have this discussion and say, okay, where are these tests running? How do you even map whether the automation they currently have is the right automation? How far do you have to dig in to find that?

[00:06:52] Paul Grizzaffi Funny you bring that up. Our leader has put in place this sort of scale from certainty to uncertainty, right? From an uncertainty standpoint, there are a lot of questions. It might be what's called an assessment, but assessment is kind of a dirty word for some clients. It's really: let's go in and see what you're doing, and then build you a high-level roadmap of how we can help you get to the next place. Then sometimes we have people who know what their problem is; they just don't know how to solve it. We come in, ask different questions, and approach it in a different way. And then there are the clients that say, look, I have an implementation issue, I have a bandwidth issue. Here's our backlog of technical debt, our backlog of test automation we want to have but don't; come in and help us build it appropriately. All of those conversations are different because they're different situations, on top of the fact that they're different humans and different companies and different business goals. How do we have the conversation? We start with: what hurts, right? What are your pain points? Where should we spray some medicine? And then we start digging through there.

[00:08:03] Joe Colantonio I assume that scale is proprietary, or is there anything online that you offer? If someone wants to do a self-assessment before they call you in, so they have a little more background research before they jump into consulting?

[00:08:16] Paul Grizzaffi I see what you're saying there. We don't have anything online; it's proprietary. Well, I just told you what it was, so I guess it's not too proprietary. Oops. But really, it is how we would approach potential clients and say, well, where are you on the certainty scale? And then where would we come out and help you evaluate yourself and get you to the next level? And when I say we, I mean me, the company, and all the companies I've worked at. I'm trying not to turn this into a Nerdery sales presentation, because nobody wants to hear that. But even if you're not looking for an external consultant, even if you have an in-house expert, these types of considerations are very valuable. Step back and say, let me ask these three questions that Paul put in this blog. It was going to be the six degrees of automation, and then that was just too many. Then it was three degrees, and that was wrong. So I just turned it into questions.

[00:09:14] Joe Colantonio Nice. So for that first question, do you ever delve into tooling, like, do I have the right automation and am I using the right tools? Or is that a separate conversation you would have?

[00:09:25] Paul Grizzaffi It's all part of the bigger conversation, but it's not where I jump in. A lot of times, potential clients or actual clients will say, we're unhappy with our tool. Okay, either the tool doesn't fit your need, or the way you've used the tool doesn't fit your need; let's figure out which it is. That would all be part of that investigation and recommendation. We'd have all the conversations about what you're trying to do, what you're not trying to do, what your pain points are, what your culture is like. But we would also, and I keep saying we, whoever's doing this, should look at the tooling as well to see if it is appropriate to meet those goals. And if it's not, perhaps a pivot is necessary, perhaps a supplement is necessary.

[00:10:09] Joe Colantonio Nice. The first question is: do I have the right automation? The second question is: do I need more automation? I guess the answer is always yes, right, Paul? How does that question usually go?

[00:10:19] Paul Grizzaffi Well, the answer is obviously always yes, I'm an automation guy. But in reality, like I said in the article: okay, great, we've taken, for example, our smoke suite, and it took a human 30 minutes to run it. Now it takes a computer 8 minutes. Sweet. T-shirts for everyone. But what do you do with that gain? Was your smoke test suite constrained, in quantity, by the 30 minutes? We can only wait 30 minutes; we can only afford that. Now it's 8. Would you add extra things in, or would you say, no, 8 is awesome, now we can do more deployments or more releases or whatever? How you use what you have saved from automation is, again, context dependent. There are all sorts of ideas and ways to do that. I worked with a team one time, I was an employee there, and it took them 8 hours to do what we might call a regression for this one component. I watched what they were doing, and I said, let's figure out how to automate this. It took a few weeks to do it, here and there, with little updates to the app and updates to the automation, and then it could do the checking in one minute. Once we got it done, hey, what do you do with the other 7 hours and 59 minutes? Well, what happened was the delivery team said, wait, now we can give you more releases, so we can get stuff out the door faster. Time to market went down because that initial regression went from a day of two people to a minute of one person. What do you do with that savings? It's kind of up to you. How do we help you meet your business goals? That's how. Then you figure what you did with the savings, the economy, the efficiency, back into your larger business plan.

[00:12:06] Joe Colantonio What are some examples of that, of how they would invest that extra time savings?

[00:12:11] Paul Grizzaffi Right. That's part of what I guess I didn't say very well before: okay, now, instead of taking two releases every three weeks to test, we can take multiple a week. Potentially, that gives us either faster feedback, or it allows us to deploy more frequently, depending on what you want to get out the door. In our case, it was: oh, cool, I can give you more releases to test incrementally, so that when it goes out the door, we've got more things covered. Other companies might say: oh, cool, now I can release three times a week as opposed to twice every other week.

[00:12:50] Joe Colantonio In this step, or in this question, are you also trying to determine what kind of automation they have, or are we just talking about functional automation?

[00:12:56] Paul Grizzaffi Anything. In my world, automation is the judicious use of technology to help people be more effective or more efficient. And again, in my world, those people are generally testers, but I'm also looking at delivery teams as a whole. So if you write a script that does some data generation or data checking for you, that's automation. You did something by hand, and now a computer does it for you. That's automation. Is it traditional test automation? Nah, it falls into that category that I call automation assist. It's stuff that helps you be more effective and more efficient.
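
To make Paul's "automation assist" idea concrete, here is a minimal sketch of that kind of helper: a small, throwaway data-generation script rather than traditional test automation. The record fields, record count, and file name are hypothetical illustrations, not anything from the episode.

```python
# A minimal "automation assist" sketch: generate fake test users to a CSV
# instead of building them by hand. Fields and file name are hypothetical.
import csv
import random
import string

def random_user(i: int) -> dict:
    """Build one fake user record for test setup."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {
        "id": i,
        "username": name,
        "email": f"{name}@example.com",
        "balance": round(random.uniform(0, 1000), 2),
    }

with open("test_users.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "username", "email", "balance"])
    writer.writeheader()
    for i in range(100):
        writer.writerow(random_user(i))

print("Generated 100 test users in test_users.csv")
```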

[00:13:32] Joe Colantonio And I love that type of automation. Do you think people are missing out on that type of automation? When you go in, are you like, oh my gosh, there are so many things they could automate, not necessarily functional automation, that would help them determine whether or not they need more automation?

[00:13:45] Paul Grizzaffi Finger in the breeze, I would say way over 50%.

[00:13:48] Joe Colantonio Wow. All right. Very cool. How do you get people to buy into that? Is that a hard sell, to say, hey, look, I know it's not functional automation, but you probably should automate this process with a Python script or whatever it is?

[00:14:02] Paul Grizzaffi There's a trust aspect there. And a lot of times you have to talk about what people want to hear. They want to hear about Selenium and Playwright and Appium and the name of your other vendor: I want to come in, I'm going to automate my clicking and my pressing and all the other things. Cool. Nothing wrong with that. However, where are you spending your time? When you start asking where you're spending your time, and where help along your pipeline and your process would make things faster, better, higher quality, the conversation becomes easier to have, because you're not coming in to say, dude, you should automate this other thing, or you should use a script, and you shouldn't be using whatever UI testing tool because that's a waste of your time. Look at what they're doing. Really be that trusted business partner, again, whether you're a consultant like me or a full-time employee whose job is helping get that software out the door in the most effective way. Ask the questions, think about how it works, and say, you know what, that's valuable; however, this other thing might be more valuable if we did it first. Then you start talking about opportunity cost, which is a very deep concept, but it's very simple to talk about in the abstract: if I'm doing thing A, I cannot do thing B; is thing B a higher value than thing A? Now, it gets really complicated from there. The opportunity cost is that you don't get the cost savings or the revenue from thing B because you're working on A, which is not necessarily a bad thing; it might be that A is the better value. But you have to think about it that way: what are we doing now versus what are we going to do later, and where's our savings? It's dangerous as well, because you can start sliding down that pit of always doing the thing that gets us the money now, or saves us the money now, with long-term penalties of, oh, we should have upgraded our version of the framework, and now we're kind of screwed for a little bit. You have to keep a bigger picture of what's going on and moderate it. But yeah, the core of opportunity cost is: if you're doing A, you can't do B at the same time.
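
As a back-of-the-envelope illustration of that trade-off, here is a tiny sketch comparing two candidate automation efforts. Every number is made up; the point is the shape of the comparison, not the figures.

```python
# Back-of-the-envelope opportunity-cost comparison. All numbers are
# invented for illustration; plug in your own estimates.
def net_value(hours_saved_per_week: float, hourly_cost: float,
              build_hours: float, weeks: int = 52) -> float:
    """Estimated first-year value of an automation effort minus its build cost."""
    return hours_saved_per_week * weeks * hourly_cost - build_hours * hourly_cost

# Thing A: automate UI smoke checks. Thing B: automate test-data setup.
thing_a = net_value(hours_saved_per_week=3, hourly_cost=75, build_hours=120)
thing_b = net_value(hours_saved_per_week=5, hourly_cost=75, build_hours=40)

print(f"Thing A net value: ${thing_a:,.0f}")   # $2,700
print(f"Thing B net value: ${thing_b:,.0f}")   # $16,500

# If you spend this quarter on A, the opportunity cost is the value of B
# you didn't capture (and vice versa); with these numbers, B goes first.
print(f"Opportunity cost of doing A first: ${thing_b:,.0f} deferred")
```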

[00:16:27] Joe Colantonio Got you. In this step, is it ever the case that you have a team that's not efficient, where maybe they don't have the skills to get enough automation in place, like, hey, you have this big team and you only have a few automations when you could be leveraging it to get more? Do you ever get into those types of conversations? Or, hey, you're an enterprise company using this open source tool, and your team is better suited for, say, a vendor tool, or vice versa. Do you ever get into that type of discussion?

[00:16:53] Paul Grizzaffi If you've seen it, I've seen it. So yes, I guess I've seen everything. But man, I've seen a lot of things. And yes, I would have those conversations about the appropriateness of what you're doing with automation. Sometimes it's: nope, sunk cost, we have to keep with this because the cost to change is so high. Or: oh, hey, I hadn't thought of that; what if we supplemented with this, or we switched over to that for these certain types of activities? Because I've worked in that world as well, where we say, all right, for this, the tool is doing a really good job, let's keep using it; but for that, the tool is not doing the job we need, let's switch. Yes, now we have two tools. All right. In a company of any size, and when I say size I don't necessarily mean people, I mean the number of programs, applications, and services you provide, at some point you're probably going to need more than one tool. Do you need ten? Probably not, unless you're Google or Amazon or whatever, where you have thousands and thousands of people and hundreds of products. Raise your hand if you work for Google or Microsoft; most people don't. So we're really looking at somewhere sub ten products in most people's world. When I worked at the medical company, we had closer to 30, but that's because we were merged and acquired into a sort of smaller conglomerate; we had different technologies, and we needed different things to handle that stuff. But on average, one is probably not enough, and five probably starts getting into too many.

[00:18:31] Joe Colantonio Nice. And Paul, this reminds me of a conversation we had a while ago, I think, where you said companies shouldn't measure themselves against other companies. You're not a Google if you're, say, a health care company, so you shouldn't be trying to do what Google's doing. And I guess that goes to the first of your three questions: do I have the right automation? Am I understanding that correctly? Do you remember that conversation at all? Is that ringing a bell?

[00:18:51] Paul Grizzaffi Oh, at every conference where I talk, when we get to the Q&A part, I get to that. So yes: if you don't work for Google, don't do what Google does. It won't work for you, or it won't necessarily work for you. Go and look at what they're doing and adapt the parts that are appropriate for you. Then go look at what Microsoft is doing and adapt those parts to you as well. There's nothing wrong with looking at what other people are doing. There's not even anything wrong with copying what they're doing, as long as you sat down and had a reasonable thought process and discussion about: we're going to do this because of these reasons, and here are the risks that we are prepared to undertake. Really, Joe, if you look at it, at the core of what we're doing with testing is: how do we identify risk? We're not in the risk reduction world; we're in the risk reporting world. How do we explain what we've done and, just as importantly, what we have not done? Here's what we have found, and here are the things we think we should do next. Are you okay releasing software today, Mr. Senior Leadership?

[00:20:02] Joe Colantonio Love it. Absolutely. All right, Paul, so we went over: do I have the right automation, and do I need more automation? The third one, which you probably see a lot, is: do I have too much automation? So what is this question all about?

[00:20:16] Paul Grizzaffi Of course not, how can you have too much automation, right? Well, there are a lot of different ways. Automation is an expense. Think about it as insurance. You have an insurance premium. Do you want your deductible to be $1,000 on your car or $250 on your car? What's your financial situation? Are you in a position to pay a higher deductible if you have a wreck? Or is it better for you to pay it out slowly, with a lower deductible and a higher premium, monthly or yearly or whatever it is? Different people have different situations. Do I have too much automation? Probably. Most people out here have too much automation. There's redundancy. There's the stuff that used to matter but doesn't anymore, but nobody really goes back and looks and says: we have 5,000 test scripts, very proud of you, but are they all valuable? How many test scripts should I have for this app? I don't know. 1? 5,000? 100,000? It's all about where the value is in the risk reporting, and then how we get the machines to help us be more effective or more efficient at that risk reporting, and going through your automation periodically. Either inorganically, where you say, okay, once a quarter we're going to go audit this stuff, or organically, where we say, hey, we've got these test cases that periodically fail, or that always pass. Well, if they always pass, then we should go look at them and make sure they're actually checking the things we want them to check. So yes, most of us have too much automation, simply because we haven't turned around to say, hey, we have this giant trail of cool stuff; is it all still helping us do our jobs?
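
One way to mechanize that always-passing check is sketched below: scan a run history and flag tests that have never failed as candidates for a closer look. The results format here is hypothetical; in practice you would load it from your test-report storage.

```python
# Audit sketch: flag tests that have never failed across a run history,
# since "always green" tests may not be checking what you think they are.
from collections import defaultdict

# Hypothetical run history (test name -> "pass"/"fail" per run);
# in practice, load this from your test-report storage.
run_history = [
    {"login_test": "pass", "checkout_test": "fail", "search_test": "pass"},
    {"login_test": "pass", "checkout_test": "pass", "search_test": "pass"},
    {"login_test": "pass", "checkout_test": "pass", "search_test": "pass"},
]

outcomes = defaultdict(list)
for run in run_history:
    for test, result in run.items():
        outcomes[test].append(result)

always_pass = [t for t, results in outcomes.items()
               if all(r == "pass" for r in results)]
print("Never-failing tests worth auditing:", always_pass)
```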

[00:22:00] Joe Colantonio This is a step I see a lot of people struggle with, especially if you start deleting things, because it's almost impressive: oh, we have 5,000 tests. Okay, but if they're all just testing the same area, that's not getting you anything. How do you break through that type of culture? Is that something you've seen or heard teams struggle with?

[00:22:16] Paul Grizzaffi I have seen it. One of the companies I worked at, we had a problem with that, where there were so many tests and so much data dependency that nobody wanted to touch it. It was only adding; it was never deleting. And then we did sort of an analysis of: what if we did delete some stuff? Then the financial guy came in and said, every outage we have, we actually lose money, not like lose effort or whatever; we actually pay thousands of dollars back to the client. We're going to keep doing what we're doing. So that was a business decision. It was a terrible technology decision, maybe a terrible logic decision, but from a business standpoint it was hard to attack, because we were still doing well. Now, at some point that table's probably going to turn the other direction, and we'll have to have a different conversation. But yeah, that's how you approach it: you approach it with business. It's taking you four hours to run your smoke test suite in automation; what could we cut out of there? Well, we can't cut anything out. Well, we probably could. What if we got it down to two hours? Oh, well, then we'd be able to do more of X or less of Y, or whatever it is, once I start looking at it. Every line of code you don't write is a line of code you don't have to test and support. It's similar with the code that exists: every line you delete is one that you don't have to test and support. And make no mistake, test automation is programming, whether you're using AI or drag and drop or whatever; there's a programming aspect to it. Everything that's less, that doesn't diminish your value, is more time to go do other things. Talking about it from a value standpoint, in terms of helping people deliver their business goals, takes the sting out of the cuts, out of deleting things. And not everybody's prepared to do it. So if you're not prepared to do it, I understand; we'll try to help you find efficiencies and effectiveness elsewhere.

[00:24:15] Joe Colantonio Paul, these questions, I don't think you just ask them once and you're done. Do you put anything in place to monitor quarter over quarter, to ask the questions again and see where you stand in relation to these degrees-of-automation insights?

[00:24:30] Paul Grizzaffi Right. I wrote a blog a while back about stale automation and stuff like that. The idea is you've got to treat your automation like a software project, because, again, that's really what it is. When you look at your app, your product, the thing that you either give to your clients or sell to your customers, you look at it and go: we're having some efficiency problems here, or some code bloat there, or we need to port to this other framework over here. Treat your automation the same way. Either organically, where every release, every push, every whatever, you evaluate some portion of it, or inorganically, like I said, where once a quarter we're going to pick, say, 6,000 scripts randomly and go see if they're doing the right thing. And 6,000 is a high number, but whatever works for you and your organization and your team. I keep saying this, but it goes back to your business goals, and how you can be more effective and more efficient, because really, that's what the automation is about. Where's the time crunch? Where's the effort crunch in any sort of activity? We're humans, right? We have to sleep, we have to go to the bathroom, we have to eat, we get sick. Computers don't do that. Anything you can mechanize, mechanize it responsibly, and then let the humans do the things that humans are better suited to, or at least the stuff that computers are not, and go on from there.

[00:25:56] Joe Colantonio All right, Paul, I can't end this without a question about AI. I've been asking everyone this lately: AI in testing, or AI in general, overhyped, underhyped, or properly hyped?

[00:26:08] Paul Grizzaffi Ah, AI. So it is currently, in my opinion, overhyped. I mentioned this with Scott Moore; Scott Moore and I did a thing late last year or early this year on what the effect of AI on software, and testing specifically, is going to be. It's going to be disproportionate, much like automation was, right? Oh wait, now the computers can do all the testing, and a bunch of people get disproportionately affected; they get laid off, they have to go do other things, whatever. AI is going to be the same thing. We are very early in AI. We should not be scared of it; we should be judicious with it. And I kick myself every day for not remembering who said this, but somebody posted that you should treat AI like a new hire or an intern: get them to do things for you, but you have to check their work before you put it into production. Same thing with AI. You should not blindly take code and go, sweet, it works, unless it makes sense, right? If I get a new hire to go write me a prototype and it does what I expect it to do, cool. I got ChatGPT to do that for me a couple of months ago. Just: hey, I need a TCP responder. I could have written it myself in a couple of hours; it wrote it in five minutes. So why wouldn't I use it for stuff like that? But for anything that's deeper and longer ranging, especially when you start thinking about maintenance, unless you feed your entire system continually into whatever AI you're using, it's missing the larger context. Maybe Copilot does this; I haven't spent a lot of time with Copilot, but if it does, that's awesome, that is thinking about your bigger picture. A lot of times, though, it can only respond to what you put in, and if you only put in a subset, it's going to respond with that subset, possibly irresponsibly, because it's missing the larger context you didn't provide. So it's a little overhyped, but it's going to be really cool eventually, and I don't know when that eventually is.
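
For flavor, here is a minimal sketch of the kind of TCP responder Paul describes asking ChatGPT for. This is our own illustration, not his generated code; the port and canned reply are arbitrary choices. It accepts connections on a local port and answers each one with a fixed payload.

```python
# A toy TCP responder: listen on a port and answer every connection
# with a canned reply. Port and payload are arbitrary illustrations.
import socketserver

class CannedResponder(socketserver.StreamRequestHandler):
    def handle(self):
        request = self.rfile.readline().strip()  # read one line from the client
        print(f"Received: {request!r}")
        self.wfile.write(b"OK\r\n")  # always respond with the same payload

if __name__ == "__main__":
    # 127.0.0.1:9999 is an arbitrary choice for local testing.
    with socketserver.TCPServer(("127.0.0.1", 9999), CannedResponder) as server:
        print("Responder listening on 127.0.0.1:9999 (Ctrl+C to stop)")
        server.serve_forever()
```

You could exercise it with something like `echo hello | nc 127.0.0.1 9999` from another terminal.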

[00:28:05] Joe Colantonio I think it's pretty cool, actually. It wrote the theme song for me. I just said, write a song about Automation Guild, and I think it's pretty good.

[00:28:11] Paul Grizzaffi Oh no, totally. Like I said, it is cool, especially when you're doing stuff that's not mission critical, right? That song, cool as it was, is not going to help operate SpaceX; it's not going to help the rocket go up. I would not let AI do the rocket things without a human checking it. But building a song, building an image, all that's awesome, because I don't have those talents. Okay.

[00:28:37] Joe Colantonio Awesome. Okay, Paul, before we go, is there one piece of actionable advice you can give to someone to help them with their degrees of automation testing, and what's the best way to find or contact you?

[00:28:46] Paul Grizzaffi Absolutely. It's the same thing I always say at the end of the show, or some variation of it: be responsible. Don't do automation just because something or someone said you should automate. Look at what value it's going to provide to you, and then decide what and how to automate based on that value. Do you want to find me? You can find me on Twitter @PGrizzaffi. You can find me on LinkedIn. I'm on Mastodon and BlueSky as well, and obviously right here on the Test Guild. And I'm actually doing a show for Test Guild later in October. I don't know what the topic is yet, but it'll be something, so watch this space.

[00:29:24] Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to testguild.com/a501. And if the show has helped you in any way, shape, or form, why not rate and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:29:59] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve the skills of, or solve a problem for, the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.
