About This Episode:
Are your automation goals an illusion? In this episode with Paul Grizzaffi, a prominent software testing and automation figure, we delved into the intriguing concept of the “Test Automation Illusion,” using Styx's classic album “The Grand Illusion” as a thematic backdrop. Grizzaffi shared his insights on the misconceptions surrounding automation in testing, the potential pitfalls of blindly following big-name companies' strategies, and the importance of striking a balance between automation and human expertise. We also explored the critical aspects of weighing the costs and benefits when adopting new automation tools and understanding the actual value of automation when placed in the hands of experienced professionals. This engaging conversation sheds light on the realities of test automation and offers valuable guidance for those looking to navigate the ever-evolving world of software testing. I also pit ChatGPT's knowledge of Styx songs and testing against Paul's. Who wins? Listen in to find out!
Exclusive Sponsor
The Test Guild Automation Podcast is sponsored by the fantastic folks at Sauce Labs. Try it for free today!
About Paul Grizzaffi
As a Senior Automation Architect at Vaco, Paul Grizzaffi is following his passion for providing technology solutions to testing, QE, and QA organizations, including automation assessments, implementations, and activities benefiting the broader testing community. An accomplished keynote speaker and writer, Paul has spoken at both local and national conferences and meetings. He is an advisor to Software Test Professionals and STPCon, as well as a member of the Industry Advisory Board of the Advanced Research Center for Software Testing and Quality Assurance (STQA) at UT Dallas, where he is a frequent guest lecturer. When not spouting 80s metal lyrics, Paul enjoys sharing his experiences and learning from other testing professionals; his mostly cogent thoughts can be read on his blog, www.responsibleautomation.wordpress.com.
Connect with Paul Grizzaffi
- Company: www.vaco.com
- Blog: www.responsibleautomation.wordpress.com
- LinkedIn: www.linkedin.com/in/paulgrizzaffi/
- Twitter: www.twitter.com/pgrizzaffi
- YouTube: www.youtube.com/@PaulGrizzaffi/playlists
Rate and Review TestGuild
Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.
[00:00:20] Joe Colantonio Hey, it's Joe. Welcome to another episode of the TestGuild Automation Podcast. Today, we have the awesome Paul Grizzaffi joining The Guild again to talk all about Welcome to the Test Automation Illusion, and also a new development in his career that I think you'll definitely want to stick around to hear about. If you don't know, Paul is a Senior Automation Architect at Vaco. He is following his passion for providing technology solutions to testing, QE, and QA organizations, including automation assessments, implementations, and activities benefiting the broader testing community. He talks everywhere, he shares his knowledge everywhere. He's an accomplished keynote speaker and writer, and Paul has spoken at both local and national conferences and meetings. He's an advisor to Software Test Professionals and STPCon, as well as a member of the Industry Advisory Board of the Advanced Research Center for Software Testing and Quality Assurance at UT Dallas, where he is a frequent guest lecturer. When not spouting 80s metal lyrics, which Paul is known for, he enjoys sharing his experiences and learning from other testing professionals. The place to go to hear all his thoughts is his blog, which I'll have a link for in the show notes: responsibleautomation.wordpress.com. Really excited to have Paul back on the show. You don't want to miss it. Check it out.
[00:01:42] This episode of the TestGuild Automation Podcast is sponsored by the Test Guild. Test Guild offers amazing partnership plans that cater to your brand awareness, lead generation, and thought leadership goals to get your products and services in front of your ideal target audience. Our satisfied clients rave about the results they've seen from partnering with us from boosted event attendance to impressive ROI. Visit our website and let's talk about how Test Guild could take your brand to the next level. Head on over to TestGuild.info and let's talk.
[00:02:14] Joe Colantonio Hey, Paul. Welcome back to the Guild.
[00:02:21] Paul Grizzaffi Hey, Joe. Thanks for having me. I always enjoy coming on to talk with you.
[00:02:24] Joe Colantonio Love having you. The Guild loved you as well. Now, I know I did a pretty extensive bio, but I tend to botch things. Did I miss anything?
[00:02:31] Paul Grizzaffi No, you hit all the good stuff there. And Vaco is the career thing, right? About two and a half weeks ago, I started a new gig at a company called Vaco, where I'll be doing a lot of automation architecture and helping to grow the QA organization along with the director of QA that's already there. I'm really excited about that.
[00:02:49] Joe Colantonio That's awesome. I know a lot of people right now are going through transitions in their careers, switching jobs, with a lot of things going on in the industry. Any advice? I know we didn't plan this, but any advice for people switching to new jobs, since you're pretty fresh? How can people hit the ground running, maybe?
[00:03:04] Paul Grizzaffi It's interesting. There are a lot of different approaches for different jobs. What I did was network. I used my network, and then people that knew people got me in touch with other people. I talked to a whole bunch of people. And then finally, I found a role that was a good match for me and, as it turned out, a good match for Vaco. We're just a good match for each other, and it worked out that way. Networking, for me, was number one. But the other piece of advice that I see a lot of other people give, and I agree with it, is when you look at that list of job qualifications, this many years of that and you can do this and this other thing, you don't necessarily have to check all the boxes, right? It's not an all-or-nothing thing. If you have most of those skills, or you have most of the core skills and maybe not all the desired skills, go ahead and apply, because you never know. You might get picked up because someone says, Hey, they don't have this one skill, but they've got these 12 other things that I might really be able to find some use for as well.
[00:04:02] Joe Colantonio Absolutely. And I like how you said it was a good fit for you as well. A lot of times people don't take their own needs into consideration; they're just trying to get a job. I know it's hard for some people. They need a job, obviously, but I think making sure it's a good fit for yourself, not just the employer, is something people should look for as well. Do you agree with that?
[00:04:19] Paul Grizzaffi Absolutely. You spend a lot of time at the job. There's got to be at least somewhat of a fit for you. I guess every job is going to have some part of it where you go, this isn't my favorite, but overall, it has to be more positive than negative, more positives than challenges along the way.
[00:04:35] Joe Colantonio Awesome. All right. I'd like to dive into your latest article, The Grand Illusion. It's based off the Styx album The Grand Illusion. I'll be honest with you, I was never a fan of Styx. They were a little before my time. Then I saw them in concert opening up for Def Leppard and Tesla, and I love Tesla. They were better than both those bands. I guess that's an illusion. Don't judge a book by its cover if you don't really know them or haven't seen them live. But why pick this particular album, or this band, to make a point for this particular blog post?
[00:05:05] Paul Grizzaffi A lot of things sort of went into me getting onto that. I was having an online conversation with somebody about some particular aspects of test automation, and their opinion was, first of all, a very apt opinion. For their situation, with what they're doing and what they have at their disposal, I would do it that way as well. But most of the clients that I work with and have worked with don't have those means; either they don't have the budget for it or they don't have the expertise to work with that toolset. We have to do other things to help them be more effective or more efficient at their jobs. I kind of looked at that and went, if everybody's looking at what everybody else is doing and thinking, we need to do what they're doing, Google's doing it, Facebook and Amazon and whoever else, we're missing one of the real key parts of what the automation is going to bring to us, and that is the value for our context, for whatever that company or that team is trying to do. Because everywhere you look, you think your neighbor's got it made. And that tripped my memory over to the lyric from the Styx song. And then I looked at it and went, wow, a lot of this really would be an illusion, because it really looks like one day Google twinkled their nose, right, and boom, they had all this automation up and going. And it didn't happen like that. It doesn't happen like that because it can't.
[00:06:25] Joe Colantonio Absolutely. I love that. I thought we'd dive into each point of the article on Welcome to the Test Automation Illusion. First one, which I'm sure we've both seen since the beginning of our careers: as I think you mentioned, automation is often sold as a grand illusion. I don't know if you see a rise in this now with ChatGPT and AI making it even more pervasive within the industry, but can you do 100% automation testing, or is that a grand illusion?
[00:06:49] Paul Grizzaffi I believe it is a grand illusion, and it will be, certainly, for the rest of my career and probably for my kids' careers as well, whatever they wind up going into. Because if you think about it, directly or indirectly, who is using the software? Who's consuming the software? Humans. So at some point, we have to acknowledge the fact that it has to be human-worthy, human-usable. Some eyes have to be on this software at some point. And then, if you're going to do it with a human, a tester, does it make sense to also automate that? Maybe it does. Maybe it doesn't. But you have to get the right value out of it. You'll hear me say it all the time: value, value, value, value, because that's really what it is. Is it valuable to 100% automate something? Never mind the fact of 100% of what? 100% of what is automatable? 100% of everything you can think of? Are you going to point ChatGPT at it to get the things that maybe you didn't think of and say, cool, now I've thought of all the things? No, it's just not feasible at this point. And introducing things like these large language models, ChatGPT and all the other ones that are out there, is going to muddy the waters, because what you're going to see is a lot of people taking exactly what these models and these tools give you and saying, cool, that is all I need to do. The AI told me so. A lot of things are going to be missed just because they're imperfect anyway. But then, what was their training data like? What are their heuristics like? What is what's called the confidence factor? There's a mathematical thing about when it decides if something's right or wrong, or in or outside of that data set. It's going to be a while before any of that is really going to be trustworthy, such that you can say, okay, it's helped me with this section of what I do, and I don't have to go and double-check it, or I only have to spot-check it, because I know statistically it's right more than it's wrong.
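To make that "confidence factor" idea concrete, here is a minimal sketch, not from Paul, with all names and numbers hypothetical, of the gating he describes: take the model's verdict as-is only when its reported confidence clears a bar you have tuned, and route everything else to a human tester.

```python
# Hypothetical illustration of gating on a model's reported confidence.
CONFIDENCE_THRESHOLD = 0.90  # assumption: tuned per team and per context


def triage(verdict: str, confidence: float) -> str:
    """Accept a model's verdict only when it is statistically trustworthy."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"accept '{verdict}' (spot-check periodically)"
    return f"escalate '{verdict}' for human review"


print(triage("looks correct", 0.97))  # accepted: right more often than wrong
print(triage("looks correct", 0.62))  # escalated: too uncertain to trust
```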
[00:08:50] Joe Colantonio How does someone as a tester avoid the illusion if everyone else has that illusion and you're the only one that sees it clearly? Like, at your company, everyone has bought in: wow, AI is going to do everything. You're the only one that's like, hey. How do you resist buying into it, saying, okay, let's just go with the flow here?
[00:09:12] Paul Grizzaffi First, I'm not against it, right? I'm all for it: oh, we have this new capability, let's test it out. But in the same way, you wouldn't take a new piece of software, build it from scratch, throw it out, and say, now it works, would you? You're going to fool with it. You're going to see what it can do, what it can't do. A lot of people out there are working on what it can do. Myself, in the little bit of time that I've spent with these tools, and some other people with a similar mindset, it's, okay, let's see what they can't do. Right? Let's ask it. I did this: I said, write a paragraph about test automation in the style of Paul Grizzaffi, and it said, nope, I don't have access to the Internet, I don't know who that is, and I can't write in the style of other people. I said, okay, write a paragraph about test automation in the style of William Shakespeare, and it did it. It's like, okay, were you wrong? Did you tell a lie? Did I just start personifying the AI, right? So right there, we know that there are some conflicting things it will tell us about what it thinks it can and can't do. I also did some tests where I plopped in some code that had a bug, where the name of the method did not match what the loop in the method was doing. A code review would catch this, but ChatGPT did not catch it the first time. Then I put it in again after asking some other questions, and it caught it that time. It was weird about what it can and can't catch. But remember, these are relatively early days for this sort of consumer-available command prompt AI.
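For readers who want to try Paul's experiment themselves, here is a minimal sketch, hypothetical and not the actual code he used, of that class of bug: the method's name promises one behavior and the loop inside quietly does the opposite, which a human reviewer usually spots from the mismatch alone.

```python
def sum_positive_numbers(values):
    """Claims to sum the positive numbers in `values`."""
    total = 0
    for v in values:
        if v < 0:       # Bug: inverted condition, so this loop actually
            total += v  # sums the *negative* numbers instead.
    return total


# A reviewer reading the name expects 7 here; the code prints -3.
print(sum_positive_numbers([3, -1, 4, -2]))
```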
[00:10:48] Joe Colantonio Awesome. So I guess even more controversial: when I started my career, it was mostly, you need testing skills. Then it became, no, you're a developer, you need developer skills, testers are developers, yeah, let's learn all the algorithms and everything. And now it almost seems like it's swinging back, because we're going to have all the heavy lifting done for us with some of these tools, but we need to use our brains to say as a tester, is this correct? In my context, is this correct? Does this finding represent a risk? So do you see a swing back to people having to focus more on testing skills, or am I overthinking it?
[00:11:23] Paul Grizzaffi So I think in general, whether it's testing skills or development skills or whatever, these types of tools are going to help us, and are going to force us, by and large, to work differently. If there's less typing and more thinking, does that make you any more or less of a developer or a tester? Probably not. It just means that over time, as the tools have gotten better, you've built a different skill set. By and large, the average developer does not write in assembly language anymore. There's no need to. We have all these higher-level languages that, in most cases, for most applications, do exactly what we need them to do, give or take. We don't have to program with stone knives and bearskins anymore, right? The AI tools are going to help us, hopefully, deal with the rote stuff, the dreck of programming, and potentially allow us to be more expressive, more creative, to focus more on problem-solving and less on key typing. At least, that's the hope, that we start finding a sweet spot here. Now, does that make us more or less of a developer, or more or less of a tester? No, it's a different skill set, and maybe over time we get different titles, different role names, for what we do. But that's going to matter to some people and not matter to other people.
[00:12:43] Joe Colantonio Absolutely. Great advice. Like you said, even developers themselves don't really go low-level anymore. It's mostly cutting and pasting, putting libraries together. I think testers need to understand development concepts just like developers do, so yeah, definitely great advice. I guess the next point, then: I think you mentioned in the article how a lot of people may look to a larger company and think, oh wow, this company is automating everything and eliminating testers, and that sometimes creates unrealistic expectations for other companies. Can you expand a little bit on this and how people can avoid it as well?
[00:13:21] Paul Grizzaffi Oh, I think it's less about avoiding it and more about avoiding the illusion that that is what you must do because those famous companies do it. A lot of times, back in the old times when we had a lot of in-person conferences, that question would come up almost every time in one of my sessions. And I would always ask the attendees, raise your hand if you work for Google, Microsoft, or Facebook. Zero people raised their hands in any of my sessions. Now, does that just mean none of those people have come to my sessions? Maybe, but anyway, nobody did. I said, okay, so you don't work for those companies, so you have different needs. You're shipping a different product. You certainly have a different budget, you certainly have a different tolerance for risk, and you certainly have a probably more conservative approach to software development, simply because you're not them and you don't have pockets that are that deep. Now, should you be more like them? That's a context question. Do you want to evolve and pivot and behave like them because you've done your homework and found that you are sufficiently like them, or want to be sufficiently like them, that adopting their practices as-is is something that's going to be valuable to you? Or should you look at what they're doing and take the pieces that make sense for you, standing on the shoulders of giants, so to speak, but not necessarily going in blindly and saying, I'm going to copy this template and we're going to do exactly that? Because that's probably a bad idea. You're probably not going to get the value you're looking for out of your automation, and either your automation is going to fail or it's going to look like it failed. Right? Another illusion: automation doesn't work here. Well, maybe it does, and you just went about it in a way that wasn't context-sensitive.
[00:15:10] Joe Colantonio I worked for a big healthcare company, and they were like, we're going to do continuous delivery because that's what Google does. I'm like, okay, we need to work with regulations. We can't just do continuous delivery. They still went forward with it, and it was a disaster, not because continuous delivery is bad, but, like you said, because for our context, we were trying to copy someone else, and it didn't fit. Is that an example of what you would see?
[00:15:32] Paul Grizzaffi Yeah, that's a prime example of, you tried to do it exactly the way they did it and it didn't work, instead of saying, oh, I see what they're doing, let me do some homework on that and change the model to fit what we're doing here at large healthcare company. You might have been more successful that way.
[00:15:50] Joe Colantonio It's almost like, use them for inspiration, but don't blindly just copy them outright.
[00:15:55] Paul Grizzaffi Exactly. Inspiration is great. Again, that standing-on-the-shoulders-of-giants thing: I don't see any reason to reinvent something that is working, or at least could work for you, if it's appropriate for you.
[00:16:09] Joe Colantonio Absolutely. I guess another grand illusion that people get sucked into is tools. I'm one of them. I love tools, but there's always, like you said, a shiny new automation tool available. When someone sees a new tool, say Playwright, which seems to be the hottest one right now, they feel like, oh, I'll just learn Playwright, and they automatically think they need to switch. When do you know the right time to switch to a new tool, if there is one? What are your thoughts on transitioning from one to the other, or even seeing if it makes sense to transition from one to the other?
[00:16:38] Paul Grizzaffi So it's a delicate balancing act between not going too early, right, not being on the bleeding edge where you're taking all the bumps and bruises of learning it and hitting the bugs and all the things in the early days, and not waiting too long. Now, if you're just starting off on a brand-new greenfield automation initiative, you've got a lot more choices, right? Because you don't have the sunk cost already. You don't have the, what do I do with my existing stuff? How do I train my staff on the other thing too? I've already eaten these costs; do I really need to eat some more costs? It's a different financial model. And I'm no financial expert, but I know enough to say, yes, there's a cost here, and that's the opportunity cost, because if you're taking time to learn and build a new tool into your toolchain, there are other things you're not doing. You're not extending your existing automation, you're not doing additional testing. There's something you're not doing, because you can't do two things at once, right? There's a cost. But there's a cost the other way around too. If you stick with your old tool too long, it's going to get more and more expensive to make a switch when it is time to switch. That's a different opportunity cost, and that's where you really have to sit down with your leadership: your dev leadership, your pipeline and infrastructure leadership, your product leadership, and your automation leadership. Everybody has to sit down and go, what does the product roadmap look like, as far as we can tell? And I don't mean features. I mean, okay, we're using React. What's the React roadmap? When is the new React going to come out, and is it going to break our existing automation? Or is it going to have stuff in it that the existing tool won't ever be able to handle, or something like that? Those things will start impacting and crafting your solution and your answer to, when do I make a switch? An answer might be never. It might be that you've got so much invested in what you have that it is a better financial approach for you to make do with it, knowing that there are gaps you can fill in differently. Or, and not everybody wants to hear this, sometimes two tools is the right answer. Hey, for 90% of what we're doing, we're using our existing tool, even going forward. But for the subset of stuff that tool can't do, we're going to supplement with this other tool, knowing that we're going to have to re-implement some things, and there will be some duplication and some additional maintenance there. But it may be worth it if you know you're going to get additional gains in efficiency or coverage or time to market that will realize some real financial gains for your company. And yeah, you eroded that a little bit because you had two tools, but the delta is so large that it's well worth it. So there's really no stock answer. The right thing to do is keep an eye out, look at what's coming out, and look at what the vendors are offering, even if you're open source only, we-do-everything-open-source-because-reasons: look at what the vendors are doing, because at some point that may tip over as well, maybe not as a tool replacement, but perhaps as a tool supplement. And never be caught without an opinion on the current tools that are out there.
Maybe not all of them, but at least the ones that would likely be good candidates for you in the future. Have a couple of paragraphs, a little white paper, for each of them, where yes, you took some time and had one of your experts download it, prototype with it, and give an opinion on it, so that you're prepared if something comes along and you say, ooh, this might be a good time to change. What would we change to?
[00:20:18] Joe Colantonio I love that. Almost like a file of research that you've already done, so when the time comes, you can say right away, wait a minute, here are some pros and cons, I've actually looked at this. Such a great approach, I think. I guess there's also another illusion, I don't know if you covered this: a lot of times people act like open source is the only way, when in fact, if you do the research, sometimes a vendor tool may fit your particular situation better. I don't know if you see that as an illusion or not.
[00:20:42] Paul Grizzaffi No, I did not cover that in there, but maybe I'll write an addendum about that one. I find it a lot as a consultant: you're going in to help them with this, but they've already made the decision. Well, you have to go open source. Why? Because it's free. Okay, yes, but let me explain actual costs and effort costs and all of this from a conceptual standpoint for you. And there may be things you can't do because these open-source tools can't handle X, Y, Z, but this vendor-supplied tool can. Just various things like that. And it's funny, and I grudgingly say this: development departments and organizations will grit their teeth, but they will get that software developer a Visual Studio Enterprise license to do their job, yet they're not going to spend a few hundred bucks on a license for a test tool. Can't Selenium do that? No, Selenium cannot do thick clients. All right, so now we have to go to Appium, right? Okay, but with Appium, now we're going to introduce WinAppDriver, right? Or we could go with this vendor-supplied tool that's going to cover some of that for you. There are no right or wrong answers. You've got to sit down and understand what your cost and your value propositions are.
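As a rough illustration of the Appium-plus-WinAppDriver route Paul mentions, here is a minimal sketch, assuming WinAppDriver is installed and listening on its default local endpoint and an Appium Python client version that still accepts desired_capabilities (newer clients take an options object instead); the app path and control locator are placeholders, not a recommended setup.

```python
from appium import webdriver  # Appium Python client (pre-2.x style shown)

# Assumed capabilities for driving a Windows thick client via WinAppDriver.
caps = {
    "platformName": "Windows",
    "deviceName": "WindowsPC",
    "app": r"C:\Windows\System32\notepad.exe",  # placeholder app under test
}

driver = webdriver.Remote("http://127.0.0.1:4723", desired_capabilities=caps)
try:
    # Once connected, native controls are located much like web elements.
    edit = driver.find_element("class name", "Edit")  # Notepad's text area
    edit.send_keys("typed by automation")
finally:
    driver.quit()
```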
[00:21:52] Joe Colantonio 100%. I have another example. As you mentioned, you need to do a cost analysis of what makes sense. Sometimes people think Selenium is the right answer. This was a while ago: we used to use QTP, and they said, oh, we're going to just go open source. But we worked, once again, for a hospital, and everyone was on older versions of IE, which Selenium was really not good with. They made the switch over, and it was terrible, not because Selenium was terrible, but because QTP handled the older browsers much better. And as you said, if someone had done the research, they could have said, yes, Selenium maybe would be the right choice for this other project, but for this current one, it's not going to work.
[00:22:26] Paul Grizzaffi Right. And that's another thing. I don't remember if I've written about it, but I've talked about it a few times: in a company of any sufficiently large size, with a sufficiently large number of disparate products, you're probably going to have more than one test tool, more than one automation tool, simply because you're going to have so many different things that one tool and one framework may not be enough. You may wind up losing more value than if you brought in a second tool.
[00:22:58] Joe Colantonio Absolutely. I guess another illusion is that because an automation tool is so user-friendly, we can have anyone use it. Do you see that as an illusion: giving someone what's considered a powerful automation tool and assuming they'll just be able to get up and running with it without any issues?
[00:23:15] Paul Grizzaffi Yes, I find that is an illusion, and it's a rather dangerous one, because every time we make a good stride toward saving effort with automation, it comes back up again: oh, now I can have anybody write these tests. Well, you can have anybody click the keys, right? You can have anybody drag and drop, or record, or tell whatever GPT that you want a test case to do this. But it can't be just anybody, right? First of all, they have to have domain knowledge. Second of all, they have to at least sort of know how to test. Okay, maybe they don't need to know how to write in a programming language, but no matter what, under the hood somewhere, behind the curtain with the wizard and all that, it's software, and we're developing software. We're sequencing instructions. We have to save those artifacts. We have to look at the results of those artifacts. We have to maintain those artifacts. And yes, AI and machine learning change the dynamic of what it means to maintain those things, but those operations, those activities, still have to be there. So if you say, hey, person that doesn't know anything about testing, but you worked in the medical industry for a while, so we hired you in and you're going to test this revenue cycle management tool for this healthcare company? That's just not the way it works.
[00:24:40] Joe Colantonio Sure. So, Paul, since you're a music expert and I chose Styx, I thought we'd have you rate ChatGPT's understanding of Styx and software testing. What I did was, I went to ChatGPT and said, give me analogies, other than Paul's, on how testing relates to Styx songs. So here are the top five; I'll have you rate how ChatGPT did.
[00:25:04] Paul Grizzaffi All right, let's go.
[00:25:06] Joe Colantonio First one. It came up with Come Sail Away as exploratory testing: the song Come Sail Away can be seen as an analogy for exploratory testing, where testers embark on a journey to discover the application's behavior without a predefined test plan. In context, how would you rate that?
[00:25:24] Paul Grizzaffi Rating based on title? I'd give it an 8. Rating based on song content? It's like a 2, right? Because the song really is about aliens.
[00:25:36] Joe Colantonio ChatGPT once again failed in that context; you need to think about it. Awesome. The second one was Renegade as penetration testing: Renegade can be compared to penetration testing, where testers act as renegades, or ethical hackers, to identify vulnerabilities in the application or system.
[00:25:54] Paul Grizzaffi So again, based on the title, that's a good 8 or 9. If we're talking about lyrical content, that's another 2, because that one's about a guy who killed a dude and now he's on the run from the law. That's more like a black hat, right? Now, if you tell me it's about a black hat, I'm going to go, all right, yeah. All right, I'll give that, like, a 7.
[00:26:12] Joe Colantonio Awesome. All right. Now, this one might be a stretch: Too Much Time On My Hands as it relates to performance testing, in that the song Too Much Time On My Hands could be seen as a metaphor for performance testing, as it emphasizes the importance of time in software testing.
[00:26:28] Paul Grizzaffi I'll give it a 5. That shows a little more introspection into what Too Much Time On My Hands means, and it sort of matches some of the lyrics.
[00:26:36] Joe Colantonio I don't know if you know the song Show Me The Way. ChatGPT says it can be associated with usability testing, where testers evaluate how user-friendly and intuitive an application is.
[00:26:49] Paul Grizzaffi Yeah, Show Me The Way was the first single off the first record after Tommy Shaw left, so the one after Kilroy Was Here. Again, I'm going to give that another 5, because I don't really remember the lyrics of Show Me The Way.
[00:27:03] Joe Colantonio Absolutely. And the last one is Blue Collar Man (Long Nights) as regression testing: it can be likened to regression testing, which involves the repetitive process of retesting previously tested features, just as the blue-collar worker endures long nights of hard work.
[00:27:18] Paul Grizzaffi Okay, I'm going to give this one a good solid 7, because, yeah, it is a lot of work, and it's thankless a lot of times, and it's working into the night. Yeah, I'll give that one a good solid 7. But it missed the most obvious one, right? It missed Mr. Roboto.
[00:27:35] Joe Colantonio Oh yeah, for sure.
[00:27:37] Paul Grizzaffi That would be the most obvious.
[00:27:40] Joe Colantonio All right, so that was just a quick test of ChatGPT, showing how you still need a tester who knows the context of the situation, not just something that spits out stuff because it sounds right. So what did you think of that, Paul? I just thought it would be a kind of fun segment to sprinkle in to make things a little more engaging.
[00:27:56] Paul Grizzaffi Absolutely. I think it's great, because you've heard some of my stuff before. Music inspires me, automation inspires me, and a lot of times I find where these join, and I include that in my blogs or keynotes or whatever.
[00:28:09] Joe Colantonio All right, Paul, as I mentioned, you speak at a lot of different events. You're well known for speaking. Any upcoming events you'll be speaking at that people should check out?
[00:28:18] Paul Grizzaffi Yes. This coming week, I will be speaking at InflectraCon in Washington, D.C. I believe there are still tickets available; you can check it out at InflectraCon.com. They have taken over part of the old Software Test Professionals conference, the STPCon track. So you can go and get all your Inflectra knowledge, but you can also get a non-vendor, general testing, automation, and DevOps track as well. You've got choices there. I think I'm speaking on the 21st. And then in May, I'm going to be speaking at the QA Global Summit, which is online. The basic track is free; the senior track does have a cost associated with it. You can check that out at Geekle.
[00:29:03] Joe Colantonio Awesome. All right, Paul, before we go, is there one piece of actionable advice you can give to someone to help them avoid any type of automation testing illusion, and what's the best way to find or contact you?
[00:29:14] Paul Grizzaffi It's the same actionable advice I give every time I'm on the show, and that is: be responsible with your automation. Don't do stuff just because somebody else did it. Do it because it's going to provide value to your team, your company, your organization, whatever the value target is. And the best ways to reach me are on Twitter @pgrizzaffi, on LinkedIn, or through my blog, I've got all that information there, and that is responsibleautomation.wordpress.com.
[00:29:44] Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to testguild.com/a444. And if the show has helped you in any way, why not rate and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe. My mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.
[00:30:20] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider, and want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over to TestGuild.info and let's make it happen.