About This Episode:
In this episode of the TestGuild Podcast, Joe Colantonio chats with veteran QA leader Jim Trentadue about how artificial intelligence is reshaping software testing—and what testers can do to stay ahead. With nearly three decades in the industry, Jim shares hard-earned lessons from his own career, insights into balancing speed and quality in agile teams, and a candid look at AI’s potential risks and rewards for testers. You’ll hear why embracing AI tools, from agentic workflows to prompt engineering, is no longer optional, and why deep testing expertise is still your most valuable asset. Whether you’re a manual tester, automation engineer, or QA leader, this conversation will help you future-proof your career in the rapidly evolving world of software quality.
Exclusive Sponsor
TestGuild community – I'm doing something crazy. Our first-ever IRL event, September 18th in Chicagoland. Just 75 seats, and honestly? This might be the only time I do this.
Jim Trentadue and I are sharing AI-powered testing strategies you can use immediately. Free event, real networking, live Q&A panel. It's your only chance to meet me face-to-face and be part of TestGuild history.
Grab your seat at https://testguild.me/irl – 75 spots only. Don't make me regret taking this leap!
About Jim Trentadue
Jim Trentadue has more than two decades of experience as a director and manager in the software testing field. Throughout his testing career, Jim has focused on test execution, automation, management, environment management, standards deployment, and test tool implementation. In the area of offshore testing, Jim has worked with multiple large firms to develop and coordinate cohesive relationships. As a guest speaker at the University of South Florida’s software testing class, Jim mentors students on the testing industry and on trends for establishing future job searches and continued training.
Connect with Jim Trentadue
- LinkedIn: jitrentadue
- Email: jim.trentadue@outlook.com
Rate and Review TestGuild
Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:35] Joe Colantonio Hey, what if I told you that the biggest threat to your QA career right now isn't AI replacing you? It's you ignoring AI. In today's episode, I'm sitting down with Jim Trentadue, a 28-year veteran of the testing world, to uncover exactly how testers can thrive in this new AI-powered era. We're going to talk all about why the testers who survive and succeed are the ones who actually adapt, what skills you must add to your testing toolkit today, and how to balance quality with release speed in the age of automation. And we'll also dig into a controversial question: did we make a mistake turning testers into developers? This is a conversation packed with career-changing insights. Stick around all the way to the end because we have a huge announcement as well. You won't want to miss it. Check it out.
[00:01:19] Hey Test Guild Community, before we get into today's episode, I've got something special to share with you. After years of connecting with you through podcasts and online content, I'm finally doing something I never thought I would do: hosting our first-ever Test Guild IRL event. That's right, in real life. But here's the thing, this might be the first and last time I do this, depending on how it goes. If you've ever wanted to meet me face to face, network with other top QA professionals, or learn cutting-edge AI testing strategies in person, this is literally your only chance. We're keeping it intimate, just 75 seats in Chicago on September 18th. I've got Jim Trentadue joining us to share how to balance sprint delivery speed with deep, effective testing using AI-powered techniques. We're also going to talk real strategies you can use the very next day. No fluff. Plus, there's going to be a live expert panel where you can ask those tough questions that keep you up at night. And yes, there will also be food, networking, and even a raffle. But you have to be there to win. Look, being an introvert, I'm really putting myself out there with this event. It's free. It's exclusive. And honestly, I'm nervous about whether people will actually show up. But that's exactly why you should be there, to be part of Test Guild history, whether it's the beginning of something bigger or a beautiful one-time experience. Head on over to TestGuild.me/IRL and grab your seat. Only 75 available. And when they're gone, they're gone. September 18th, Chicago. Don't let me be that guy talking to an empty room. But seriously, I can't wait to meet you in person. Let's make this one count. Register now using that link down below. Hope to see you there.
[00:03:01] Joe Colantonio Hey Jim, welcome to The Guild.
[00:03:05] Jim Trentadue Hey Joe, thanks for having me, I'm a big fan, and I'm excited to be on for the first time, so thank you!
[00:03:13] Joe Colantonio Yeah, I'm excited to have you. I've known of you for a while, I've heard your name for a while, but we actually first physically met, I think, at Star East this year. You've been around a while though, Jim. I'm surprised our paths haven't really crossed before. Tell us a little bit more, maybe, about how you got into testing.
[00:03:30] Jim Trentadue I was a marketing analyst. I started messing around with an application because I was annoyed at how much it was breaking. And I talked to the developers, this is the late '90s, I started talking to developers and I'm like, why don't you just turn it over to me before you go to production? I don't know, let's do like a little testing phase, and just give it to me and let me play around with it before you actually go to production. Oh, so like a little testing phase? I'm like, yeah, I guess that's what you call it. And they had no idea it was a formality of an industry. Then I started and I'm like, I kind of like this. I kind of looked around and saw some contracting jobs. I'm like, people actually pay people just to sit and break things and test? I'm like, that's amazing. It was an amazing concept to me when I got into it, that people were paying for me to actually test. And then I walked into it and I didn't know how big of a career it was. All I thought was give them an app, click around, no formality of test plans, test gates. I knew nothing of that. So it was just fun playing around.
[00:04:37] Joe Colantonio Nice. How'd you learn?
[00:04:39] Jim Trentadue When I left my job, I went over to a company called A.C. Nielsen at the time. And they had a really formal group of folks. I learned from two ladies and one of my managers who taught me everything. And I realized what I didn't know. And so I learned about some automation. One of the things they did with me that I loved, and I recommend for people coming in, I knew nothing about automation. They sat me down in the lab to just analyze the automation runs from the night before. That was every day, that was half my time. Just sitting and analyzing any of the API runs, any of the UI runs, and seeing where the failures were. Go talk to the developers, see if it's a bug. I started just playing around like that, and then they showed me the formal way to write test cases, to write some test plans, and how to go about a given release. I had some really good mentors that really helped show me the way, because I knew nothing about nothing. This is going on about 28 years ago, but yeah, I knew about nothing. I think I was fortunate enough to have a few really good mentors that took me under their wing. They saw potential in me, and they kind of helped mold it.
[00:06:07] Joe Colantonio Love it. I think you've been giving back as well. I think I've read that you mentor students at the University of South Florida. So what advice do you give them from your perspective, now that the roles have changed and you're the mentor?
[00:06:20] Jim Trentadue There's a software testing class taught by a professor named Alan Hevner, who's a real thought leader, and now he's doing an AI class. And it's a lot of kids where maybe it's their intro into the industry, they may have a job as an analyst or senior analyst, and they're taking a master's class in software testing. It's really helping them figure out, hey, what tools, what kind of open source can I start looking at? What other skills can I harness to really bolster my marketability? It's been a lot of fun with that. And I see the same with a lot of the consulting companies that I work with, some of the Infosys, TCS, Wipro, Cognizant folks. They'll bring on some junior-level people and we try to help give them some guidance as well.
[00:07:18] Joe Colantonio All right, Jim, so I'm just curious to know, are there any trends you see, especially for the younger generation coming up now, especially with the role of AI being hyped more and more around the world?
[00:07:29] Jim Trentadue I think this newer generation of testers coming out, they've got huge potential here. And I think it's learning AI, both agentic and prompting, to bolster their software development and software testing capabilities. They have to have that as part of their toolkit. It's a must now for them to have it as at least part of their toolkit. I think this is where they're going to be able to really shine. To show, hey, you know what, look, this is what I've done with Selenium, I've done this in Cypress or Playwright, but AI has also helped me build some framework components here. And it's also helped me identify some other areas that I need to be testing. So they're learning that alongside whatever automation skills they have. It's funny as we say this, because with some of this AI, we're talking about coding. I don't think coding is going away, right? And this vibe coding, I don't know if it's ever going to replace the need for strong coding skills and understanding, because I think more than ever, some of your peer reviews and code reviews need to be stronger. You've got to understand the output your AI is going to be generating. Is this a good structure? Is this a good foundation? Is this class good? If it's UI automation, does it actually utilize my page object model well? Is it doing things as efficiently as possible?
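To make that review checklist concrete, here is a minimal sketch of the kind of page object structure you'd want AI-generated UI automation to follow. It assumes Selenium's Python bindings; the LoginPage class, its selectors, and the URL are hypothetical placeholders, not anything from the episode.

```python
# Minimal page object sketch (hypothetical page and selectors).
# When reviewing AI-generated UI code, check that locators and actions
# live in the page object rather than being scattered across tests.
from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginPage:
    """Encapsulates locators and actions for a hypothetical login screen."""

    URL = "https://example.com/login"  # placeholder URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    def error_message(self):
        return self.driver.find_element(By.CSS_SELECTOR, ".error").text


def test_invalid_login_shows_error():
    driver = webdriver.Chrome()
    try:
        page = LoginPage(driver).open()
        page.login("not-a-user", "wrong-password")
        assert "invalid" in page.error_message().lower()
    finally:
        driver.quit()
```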
[00:09:07] Joe Colantonio Now, I definitely agree. There's probably going to be a lot more AI slop generated, and I think it's going to make, maybe I'm wrong, applications less secure and less performant. It's almost like testers need to expand their role into knowing more about security and performance as well. I'm not sure if you see it similarly or not.
[00:09:26] Jim Trentadue 100%. You know, I've taken on some work with security teams to learn about penetration testing. This is prior to AI. Then you learn about the security vulnerabilities that are stored in things like Docker containers and Kubernetes, how they can mask themselves in just a container and kind of ride along, almost take the bus ride into your code. With AI, it's one of those things where I don't even know where the AI security starts. And I think that's a lot of us, we're still trying to learn where those security layers go in. Is it something that you're building into your RAG? Is it something that's a kind of firewall for your LLM? I'm like, where is that? I think we're still learning that. But I did work for a couple of companies where, if you wanted to do any kind of AI, even a POC with a vendor, with ChatGPT or Copilot or Gemini, you had to have such security clearance, because it's much more than onboarding a new tool. A new tool is like, hey, all right, what kind of ports do I have to open? What kind of firewall exceptions do I have to have? What kind of VPN access? Very standard and vanilla stuff. But with this AI, I think we're still trying to figure out where the security vulnerabilities are. And that's a little scary, but I am confident, there are so many certified security analysts out there, they're coming out in droves, and they're going to learn this and we'll get ahead of it. I just don't know if we're there yet, but I'm sure as an industry we'll get there.
[00:11:05] Joe Colantonio Nice. This might be controversial. I think we got things wrong. I don't know if you've heard that mantra, oh, testers are developers too. It almost took the focus off of testers being subject matter experts and made them more like developers. And a lot of these AI tools are doing a lot of the things that you would do as a developer, but not necessarily the testing expertise. Any thoughts on that? I think we'll need to become subject matter experts again, or business analysts again, because that's really the value that we bring if AI is really going to live up to the hype of creating all this code for us too.
[00:11:40] Jim Trentadue 100%. Remember automation itself? Let's step back a minute. Automation was going to replace all the manual testing jobs. That hasn't happened. And it shows the need. Now, there are people that have helped really lessen the gap. Just take one step back in historical time. There were tools out there that you and I grew up with: Rational Robot, WinRunner, Empirix, Segue. With some of those tools, you needed to have a coding background. One of my favorites, and I got to meet her at Star East, one of my favorite heroes in the QA industry was Linda Hayes, who came up with a whole keyword-driven approach. There were models that brought it along to still bring manual testers in, to say, even though I'm a manual tester, I don't know code, but I can contribute to automation by learning these keywords. And we helped bridge that gap. Even with automation engineers, I'd say most organizations that I've seen at least still require some level of manual tests or scenarios to be available before going right into developing a coded solution for their tests. Do we need that all the time? I don't know if we do, but I do think it's helpful to have some kind of framework that shows, these are at least the scenarios that you think you want to test, from a negative case, a boundary, an equivalence class. These are the ones, and I can hand them over to an automation engineer and say, these are the scenarios I want to do, help me build a framework that can support this.
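As a small illustration of the scenario categories Jim lists, here is a hedged pytest sketch. The validate_age rule and its limits are hypothetical, invented purely to show negative, boundary, and equivalence-class cases side by side.

```python
# Illustrative pytest sketch of negative, boundary, and equivalence-class
# scenarios for a made-up rule (valid ages are 18 through 65, inclusive).
import pytest


def validate_age(age: int) -> bool:
    """Hypothetical rule under test."""
    return 18 <= age <= 65


@pytest.mark.parametrize(
    "age, expected",
    [
        (17, False),  # boundary: just below the valid range
        (18, True),   # boundary: lower edge
        (65, True),   # boundary: upper edge
        (66, False),  # boundary: just above the valid range
        (40, True),   # equivalence class: any typical valid value
        (-5, False),  # negative scenario: nonsensical input
    ],
)
def test_validate_age(age, expected):
    assert validate_age(age) == expected
```

Scenarios laid out like this are the kind of artifact a manual tester can hand to an automation engineer as the starting point for a framework.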
[00:13:27] Joe Colantonio Absolutely. You did say the promise of automation, that it was going to replace manual testers, never happened. But I did see it replace testers that didn't embrace it, who just dug their heels in and said, I'm not learning any automation tool, and it slowed them down. Someone that knew testing and could at least use the tools seemed to have better opportunities. I think it's the same exact thing with AI. I know a lot of people are saying, no, this is a different case, but I don't really think it is. Maybe I'm wrong. I think testers that don't embrace AI, or at least learn what the hype is so they can talk intelligently to management about what it can and can't do, are probably in jeopardy.
[00:14:09] Jim Trentadue I'm with you 100%. It's a newer kind of technology that could threaten people who want to dig in their heels. I couldn't agree with you more, because with automation it was the same thing, people dug in their heels. And I'll even split it off to say, with API automation, people said, well, you're not exercising the UI, but your business logic is stored at the API layer. And those who dug in their heels to say, no, I always need a visual representation of the test, it continued. Yeah, this has a little different feel, but it's in the same family as what you're saying, the same concerns. If you dig in your heels and say, I'm not going to use it, then I think you're going to find yourself in a different industry. I ask those folks, why wouldn't we want that? Why wouldn't we want to see technology evolve us into something better? I challenge those testers: do you really want to do regression testing, even if you haven't automated it? Do you really want to do just basic user story implementation all the time? Or, if you embrace this, do you want to put true creativity in the testing aspect on display?
[00:15:22] Joe Colantonio Yeah, it makes you be a true tester, because it's generating all this stuff, but you need to have the knowledge of the application and context and actually challenge what it's giving you. It's almost like you said, it can come up with all these ideas, give me this, but you're the thinker that has to say, okay, is this right? Is this wrong? And I've been doing some vibe coding just to make sure I'm staying up to speed. And it's great until it's not. It gives the illusion of being awesome until you start digging deep into it. I wouldn't have that confidence now if I hadn't started messing around with it. And I think testers need to at least start doing that, so they can say, what's it doing? What's it not doing? Rather than just taking a guru's word that it's going to replace you, or even gurus that say it's garbage. You have to try it yourself, I think.
[00:16:10] Jim Trentadue Joe, 100%. Yeah, try it. Look, you probably come from a similar background to me, where at one point in time, object-oriented programming was new. And I say this to the, I don't want to say the more experienced: those who are coming in now have a chance to put AI in their toolkit right away. But for those who have been around the industry for a while, this is a great opportunity as well, because you've seen this. You've seen the shift from waterfall to agile. That was a big change, right? If we really think about it, what is this agile thing? What are you talking about, sprints, Kanban? What were these things? We've seen it, and this is another evolution. And I think the experienced, I don't want to say the older generation, I'll say the experienced generation, they can embrace this because they've gone through so many changes with automation, with software development, they've seen all this. And even the whole concept of going from procedural to object-oriented programming, that was a big shift too. This is just another big shift, and the experienced generation is already used to those.
[00:17:31] Joe Colantonio Absolutely. I shouldn't say this, but it's almost like people that have experience have a leg up now, I think, because we understand testing. I think AI could probably replace the really low-level things that an intern or a fresher could have done, and then they don't get the chance to build that experience, where we've already gone over that hump. Once again, not sure if that's true or not.
[00:17:58] Jim Trentadue No, I agree. I think it's going to be a bit of a combination. Like if you're coming out of college, honestly, you're graduating with a computer science degree, a 4.0 GPA, right? Regardless of where you go, as soon as you come out of college, you're probably going to be put on the prod support team to help fix these medium-priority defects, not even tier one or tier two, right? Fix these medium-priority defects, and then we get to see your coding style and everything to make sure that you're following our framework. Okay, now we're going to start to give you higher-level work, and then maybe we'll put you on the sprint teams. And that's okay. You've got to earn your stripes. I do think that the experienced generation is going to have a little bit of a leg up on that, only because they've seen all the dynamics and shifts of the industry so far, and this is just one more. But, and I think you're saying the same thing, we're advising this younger community to come in and say, just walk alongside it, hand in hand with what you're learning. If you're learning Python, if you're learning PyTest or Playwright, whatever IDE you're really comfortable in, marry that with your AI capability.
[00:19:16] Joe Colantonio Yeah, they should be embracing it, because then they can go to management and say, hey, not only do I know testing, I got ..., but I know AI, I'm like an AI-first generation, I really embrace it full force. I think that'd be helpful as well.
[00:19:30] Jim Trentadue I think there's some of it, just speaking on AI, and look, I've watched webinars, I've watched people talk about AI. When we talk about operationalizing it, I really don't think we've truly operationalized it. I'm sure there are people listening to this podcast saying, no, no, no, I have. Well, okay. To me, when we say operationalizing AI, that means, and I'll focus it at a sprint level because that's where all the work is done, every sprint it is clear how much velocity we have, how much AI tuning is going to be needed, how much re-editing and how much prompting is going to be needed. We know that velocity every single sprint. If not, I don't know if we can truly say it's operationalized. You know what I mean? Right now, using a standard Fibonacci sequence, you can estimate story points, like the 1, 3, 8, 13. You have a good idea of how to estimate your story points to give an idea of how big that is to test, how big that is, so your scrum master knows. I don't know if we've done that from an AI perspective. I don't know if you've had any other guests that have said that, because talking around the industry, I haven't seen it yet. It's coming.
[00:20:56] Joe Colantonio What do you mean? Like, because we don't have enough knowledge of AI and how much effort it's going to take, we can't really estimate it in our sprints?
[00:21:05] Jim Trentadue Yes. On a regular basis, with you as my scrum master, I don't think I can tell you, every sprint: I know how long the AI is going to take to generate; I know how much editing and reviewing my AI output is going to take to make sure it's good; I know how long it's going to take to add in the additional scope that maybe the AI missed; and I can deliver on this sprint. I haven't seen anyone who can answer those three questions seamlessly. I don't know, I haven't seen that we're there yet.
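As a rough, purely illustrative way to picture those three questions, here is a small Python sketch that tracks them per story. The class, field names, and hour values are all invented for the example, not anything Jim or Joe described.

```python
# Hypothetical sketch: tracking AI-aware effort per user story, split along
# the three questions Jim raises (generation, review/editing, missed scope).
from dataclasses import dataclass


@dataclass
class StoryEstimate:
    name: str
    ai_generation_hours: float  # time for the AI to generate tests or code
    review_edit_hours: float    # time for a human to review and edit that output
    missed_scope_hours: float   # time to cover the scope the AI missed

    @property
    def total_hours(self) -> float:
        return (self.ai_generation_hours
                + self.review_edit_hours
                + self.missed_scope_hours)


stories = [
    StoryEstimate("Login regression pack", 0.5, 2.0, 1.5),
    StoryEstimate("Checkout boundary tests", 0.5, 3.0, 2.0),
]

for story in stories:
    print(f"{story.name}: {story.total_hours:.1f}h "
          f"(generate {story.ai_generation_hours}h, "
          f"review {story.review_edit_hours}h, "
          f"gaps {story.missed_scope_hours}h)")
```

Only once numbers like these become predictable sprint over sprint could a team claim, in Jim's sense, that its AI use is operationalized.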
[00:21:40] Joe Colantonio Interesting, Jim. I don't know if this is related. I saw an article, I wish I'd prepared it for this, that said we're almost going back to waterfall, because in order to use AI, we need to have the requirements up front; everything needs to be stated effectively so everyone knows what they're building. It's not just these little sprints, at least until we get AI-savvy enough to go back to Agile. I don't know if that's the same thing or not.
[00:22:03] Jim Trentadue I'm sure I'm going to have people think I'm insane, but I never minded waterfall. I thought we could have handled the change control aspects better, but it really gave us a mission. And I think there are a lot of companies that still do Agile-ish and still have waterfall components. And how long has Agile been here, what, 20 years?
[00:22:31] Joe Colantonio 25 to 20.
[00:22:33] Jim Trentadue 25 years, yeah, exactly. So it's been around, but yeah, you're right. And I think there are a lot of experienced folks that are feeling, I'm not there with AI, I'm just dabbling around, and a lot of people are feeling behind. I almost have to tell people, you're not behind, we're all learning this together. And I think it's just interesting, we have a lot of people presenting how they do this, and I'm like, you have a field study of about two months and we're doing presentations on how good we are with AI? I've talked to some real thought leaders and big names in performance testing, I won't name names, but they've been guests on your show multiple times, and even when I just sit and talk with them, it's, I haven't really spent a lot of time with it yet. And I have a ton of respect for folks like that, automation architects and performance architects, they're folks I watch as well. Just like all of us, I think we're learning this together.
[00:23:41] Joe Colantonio Yeah, no, I agree. I mean, once again, I shouldn't say this, but I think it was almost better when it was waterfall and I was the only person doing automation, and we had all these business analysts and testers who knew the application and just focused on testing, and then I took whatever test cases they had and automated them. Much better than doing it in a sprint. I worked at a company that had 8 sprint teams, 6 to 7 people on each team, everyone contributing to the effort of automation. It was all garbage, it was all baloney, and it was in a regulated environment. So I can almost see AI taking this approach where we kind of use that old model to start off again.
[00:24:23] Jim Trentadue Yeah, agreed. And you know what, let me add another complexity on top of that. When we're talking AI, what are we talking about? Should we go agentic? Should we go prompting? That leads into a whole other field. In my mind, I've been evaluating, and I'm working on different projects where I'm looking at both. And then I finally saw the light. I'm like, wait a minute, why not marry the two? I'm kind of picturing it where the agentic AI is building like 80% of what I need, the agentic takes over exactly what you were saying, the intern-level, low-level stuff, and runs right through it. And maybe there's stuff it missed; I'm not going to blame it or criticize it for missing 20%, because it got 80. And now I can use my thought process. My brain is the one putting it into the prompt engineering: what is some good context? How can I build that out a little better? Basically, it's relying on my knowledge, and that should give comfort to people. Prompt engineering is relying on your knowledge of the system, of the application. It's just clearly articulating it in a prompting fashion. And then if we can take that prompt output and feed it back into the agentic AI, this is all theoretical. I feel like Sheldon Cooper on The Big Bang Theory, this is theoretical physics. This is all theoretical in my mind, but that's what I'm going to try: a model that says, agentic, do the heavy lifting; prompting, for whatever was missed, let me put my key thoughts in there; then get that prompting output back into my agentic to build, and that should be mostly it. I should have a pretty good feel for quality, because not only did it read and try to translate all those stories, it also took my input in a prompting fashion. I don't know. We'll see if it works, like all of this.
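For anyone trying to picture that loop, here is a deliberately simplified Python sketch of the agentic-drafts, prompting-fills-the-gaps idea. run_agent and ask_llm are stand-ins, not real library calls; swap in whatever agentic framework and LLM client you actually use.

```python
# Hypothetical sketch of the "agentic does ~80%, prompting fills the gaps" loop.
# run_agent() and ask_llm() are placeholders for your own tooling.
from typing import List


def run_agent(user_stories: List[str]) -> List[str]:
    """Stand-in: an agentic workflow drafts test scenarios from user stories."""
    return [f"Verify happy path for: {story}" for story in user_stories]


def ask_llm(prompt: str) -> List[str]:
    """Stand-in: a prompt written with your domain knowledge fills the gaps."""
    return [
        "Verify boundary values for discount codes",
        "Verify error handling when the payment gateway times out",
    ]


def build_test_suite(user_stories: List[str]) -> List[str]:
    # Step 1: let the agentic AI do the heavy lifting (the ~80%).
    drafted = run_agent(user_stories)

    # Step 2: use your knowledge of the system to prompt for what was missed.
    gap_prompt = (
        "Given these drafted scenarios, list the negative, boundary, and "
        f"integration cases that are missing: {drafted}"
    )
    gap_fill = ask_llm(gap_prompt)

    # Step 3: feed the combined output back to the agent as the suite to build.
    return drafted + gap_fill


if __name__ == "__main__":
    for scenario in build_test_suite(["Checkout with a discount code"]):
        print("-", scenario)
```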
[00:26:39] Joe Colantonio Absolutely. So Jim, it's a lot to talk about and get into in a half hour. That's why I'm excited to make this huge announcement. I think AI has changed a lot of things, and one thing, maybe I'm wrong, I think it's going to push people more toward craving real-world experiences again. And this leads into the big announcement. I saw you at Star East and said, hey Jim, I've been thinking of doing this on-site thing, I've probably been thinking about it for a year and a half, I'm not going to do it. And you actually forced me to do it. So you actually helped me put together the first-ever Test Guild in-real-life event in Chicago, happening next month. I'm really excited about this idea. Why do you think real QA community events can help testers nowadays, especially in the age of AI?
[00:27:25] Jim Trentadue We're back. And Joe, your community and what you provide to the community, everyone, this is an unpaid plug, but I've been following Test Guild for a long time. And with what Joe has provided to the community, I pulled Joe aside at the back table: we have to talk. I'm like, Joe, it's time to take this live, people want to see this. You bringing this to the community, I think, is going to be outstanding. Now we get to talk non-AI for a second, which is great. People are ready for live events again. I can't remember anyone else pre-COVID who had a successful online event like you did with Automation Guild. Now everyone's done it because of COVID, but you were already saying, hey, you know what, I can do this, you can do this on your own time. But I think this is branching out. We're going to be in Chicago, September 18th, bringing a live, in-person Test Guild event. I have a presentation to do. I'm going to talk about balancing thorough testing with the speed of releases and the speed of sprints. How do we capture that? Joe's going to be talking about the future of the tester. Do you want to talk about what you're going to discuss?
[00:28:49] Joe Colantonio Yeah, I'm going to bring up some observations I've made. I speak with a lot of people on my podcast about what AI is, what it's really doing, and how people can prepare for it. I'm going to give some key insights on maybe three or four key areas they should be focusing on to really help them with their careers going forward in the age of AI.
[00:29:07] Jim Trentadue And you talk to so many vendors, you talk with so many practitioners. I give a lot of credit to these vendors, because they're figuring out how to build this, so they're having to do a lot of research, and you're working with these vendors to get some insight into that. I think you're going to offer this Chicago community, or anyone ready to fly in, insights that you're not going to hear anywhere else. I encourage everyone to get out there. Get to this event. You're going to start to see it promoted, and the webpage is going to be available very soon, hopefully later today. We're very excited about doing this. Like I said, we're going to have a couple of sponsors, and it's also going to be in association with the Chicago Quality Assurance Association. We're going to be very, very happy with this, and it's just going to be a great event for Test Guild.
[00:30:07] Joe Colantonio And I just want to remind people, I'm still a big, big online guy. We will always do Automation Guild; that will be the flagship, all online, for around the world. I'm just dipping my toe in here to see if I can expand a little, maybe meet people a little more face to face, because I've been doing this for 10 years with Automation Guild. I'm very much an introvert. This could be the last or the first of many, depending on how many people I can get to turn out and how well I actually do in the real world. We'll see.
[00:30:36] Jim Trentadue This'll be the first of many, and Joe will do great. And I'm the other side: I'll get a wall to start talking back to me, I'll strike up a conversation with anything. So I think this is going to be a great opportunity just to bring in the community. And to the Test Guild community, I ask you to show up, show out, show Joe that this is worthwhile doing, because I'm convincing Joe that we want to do this again, and the only way I can do that is if the proof is in the pudding. I ask you, community, please come on out and support this first one. It is a free event. There will be food. This first one is sponsored by LambdaTest, and we want to thank them as well for providing some of the insights. We're going to have a few different things going on there, guys: my presentation, Joe's presentation, and a panel discussion where we're going to have some key leaders up there taking questions. We're going to have some food, and we're going to have a raffle giveaway. And it's free.
[00:31:40] Joe Colantonio And Jim, I think it's important to mention, yeah, it's free, and you also created some space for community so people can interact with one another as well. So it's not going to be bang, bang, bang, pitch, pitch, pitch. Me and Jim aren't pitching anything. We are sponsored by an awesome vendor, and like Jim said, I love vendors because they see a lot of companies, a lot of different verticals, a lot of testers. They have a lot of knowledge that you may not have, because you have blinders on in your little whatever you're working on. They bring a lot of value as well, and they're not going to be very pitchy either; Jim made sure of that also.
[00:32:12] Jim Trentadue Right, exactly. We will invite our vendors and our sponsors to just tell us two minutes about themselves. But the presentations themselves are tool-agnostic. This is industry, just thought leaders, regardless of tools.
[00:32:27] Joe Colantonio So I have a bet going with Jim. He thinks it's going to do well. I'm like, oh, we'll see. We have a goal of 75 seats to fill. So we'll see; let us know if I'm right or if I'm wrong and Jim's right. I'd love to know. And you can find a link down below to register right now. If you're in the Chicago area, I highly recommend you do. And depending on that, you get to vote on whether or not you want to see us in other states, and you can probably even tell us, hey, you should go to Dallas next, or Nashville, which is close by. That would be cool also. All right, Jim, before we go though, is there one piece of actionable advice you can give to someone to help them with their automation efforts? And what's the best way to find or contact you?
[00:33:03] Jim Trentadue The best way, since I'm an independent consultant right now, is Jim.Trentadue@outlook.com. But as far as automation advice, guys, learn different frameworks and see how AI can help you with those frameworks, to kind of accelerate those learnings. I've seen a lot of people right now who are very happy with Selenium, but they also want to dabble in some Cypress, some PyTest, some Playwright. They want to experience what those are. Selenium is still a foundational tool, but they're looking at other areas as well. Let AI help you accelerate some of those learnings and even create some base-level tests.
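As one example of what a base-level test in a new framework looks like, here is a minimal Playwright smoke test in Python, the kind of starter an AI assistant can help scaffold while you're learning the tool. The target URL is a placeholder, and it assumes pytest plus Playwright are installed (pip install pytest playwright, then playwright install).

```python
# Minimal "base level" Playwright smoke test (placeholder URL).
# Run with: pytest test_smoke.py
from playwright.sync_api import sync_playwright


def test_homepage_title():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com")         # placeholder target
        assert "Example Domain" in page.title()  # basic smoke assertion
        browser.close()
```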
[00:33:48] Joe Colantonio Awesome. We'll have a link for that in the comments down below.
[00:33:52] Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to testguild.com/a557. And if the show has helped you in any way, why not rate and review it in iTunes? Reviews really help the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.
[00:34:27] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com, where you become part of our elite circle driving innovation in software testing and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real-world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.
[00:35:10] Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.