4 Reasons Enterprise Testing Fails with Autumn Ciliberto

By Test Guild

About This Episode:

Are you tired of enterprise testing failures? In this episode, Autumn Ciliberto, an Enterprise Account Director at Keysight Technologies, helps uncover the top four reasons why testing often falls short in large organizations: unrealistic time expectations, business misalignment, infrastructure complexity, and lack of resources and expertise. Don't miss this episode if you're ready to take your enterprise testing to the next level! You'll learn how to set realistic testing timelines, strategies to align your testing goals with your stakeholders, how to get buy-in from your management, and how to ensure that testing is an integral part of your business strategy. Tune in now and learn how to overcome these common obstacles and achieve testing success.

Exclusive Sponsor

The Test Guild Automation Podcast is sponsored by the fantastic folks at Sauce Labs. Try it for free today!

About Autumn Ciliberto


Autumn Ciliberto is an Enterprise Account Director at Keysight Technologies, supporting clients with an AI-powered test automation solution, Eggplant Software. Eggplant Software is a leading test automation solution specializing in digital transformation. Autumn has 10 years of experience selling into the enterprise space. She recently spoke at Keysight's corporate conference in Las Vegas on the subject of Identifying Key Personas and Delivering Testing Solutions to C-Level Executives. Autumn provides leading testing teams with a solution to support organizational objectives to 'shift left' and keep up with their ever-growing client demands in the digital world.

Connect with Autumn Ciliberto

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:00] Joe Hey, it's Joe. Welcome to another episode of the Test Guild Automation Podcast. Today we'll be talking all about four top reasons enterprise testing fails. I think enterprise testing is critical. A lot of times people look to smaller companies to try to implement techniques that may work there but don't work at the enterprise. So I'm really excited to have an enterprise expert joining us.

[00:00:18] Joe Joining us today is Autumn, who's an enterprise account director at Keysight Technologies, and she supports clients with AI-powered test automation solutions like Eggplant software. You've probably heard of Eggplant software; they've been around for a while, and they're one of the leading test automation solutions that specialize in digital transformation. Also, Autumn has over ten years of experience selling into the enterprise space, so she knows the types of things you're probably dealing with when you're trying to interact with your C-suite or executives, when you're trying to get buy-in for a tool. She also recently spoke at Keysight's corporate conference in Las Vegas on the subject of identifying key personas and delivering testing solutions to C-level executives. Autumn provides leading testing teams with a solution to support organizational objectives to shift left, which is what you need to do nowadays as we try to create software quicker, faster, and with higher quality, and to keep up with the ever-growing client demands in a digital world. You won't want to miss this episode. Check it out.

[00:01:11] Joe Hey, Autumn, welcome to the Guild.

[00:01:14] Autumn Hi, Joe. Great to be here. Thanks for having me.

[00:01:18] Joe Awesome. Great to have you. I'm always excited when I talk to folks that really specialize in enterprise solutions and problems. So I guess before we get into it, though, is there anything that I missed in your bio that you want the Guild to know more about?

[00:01:30] Autumn No, I think you nailed it. I'll just share my own personal tips and best practices to help get buy-in from executives when it comes to testing. So I think it's great.

[00:01:44] Joe Great, great. So as I mentioned, you know, you have a lot of experience with the C-level executive suite. I know there's sometimes a disconnect over whether tools bubble up from the bottom or come from the top down. So how do you get buy-in from, you know, high-level CTOs, C-level executives, to let them know, hey, here's a solution that's really going to help you? Is that something that usually is driven from the bottom up or from the top down?

[00:02:10] Autumn It's a great question, and it depends. I've seen it both ways. Depending on the organization and who you're working with, I've seen it where the direction comes from the C-level executives to let their team know, hey, test automation and these advancements in testing are a priority for our business; get on this, find the right tool. Whereas from the bottom up, it can be more challenging at times if you don't have that buy-in. So one way to help, if these QA teams or development teams are struggling, is making sure that their executive team is brought in early on; make them a part of that evaluation process. That's something I make sure to do with my clients, so it limits some of these roadblocks they'd hit later on. Because you go through all this work, go through an evaluation, spend months testing a tool, but if you don't have that initial buy-in from those key stakeholders, it will be very challenging. And a lot of times projects will be denied because they just don't see the actual purpose and vision that's staring them right in the face to help them with these testing challenges.

[00:03:17] Joe So what are they looking for? I know people laugh at ROI, but for an executive, what do you put in an executive summary that you think would really pop, to say, okay, this is important, let's look into this more?

[00:03:28] Autumn Yeah, well, something many executives care about is that there are ever-growing client demands, and if you're not keeping up with those, you're going to fail as an organization. Any C-level executive would agree that we need to get the products we need out to market faster. So it's through test automation, and identifying the best solutions to advance that testing or eliminate manual testing, that you get those things rolled out. By showing that and connecting the synergy between the two (hey, this tool will get us to the launch of a product faster, with fewer bugs and defects that can cost the organization millions of dollars at times), that's something that would really resonate with a C-level executive. Positioning it that way touches on what they care about.

[00:04:14] Joe Great advice. So, you know, I don't know if you've seen the shift; it goes back and forth. When I started, everything was enterprise-level software, especially when you worked in health care; software had to be vetted and proven. Then a lot of companies made the shift to open source, kind of mixing and matching and creating their own solutions. Now I'm seeing a swing back to more platform types of solutions. Are you seeing the same thing? Is that something common, or what are your thoughts on that?

[00:04:43] Autumn It is. It's coming up more and more in many discussions I've been having over the last couple of years with many of the enterprise organizations I've spoken to: they're noticing that open source just isn't a viable solution that can withstand these kinds of testing demands for the long term. So they're coming to solutions like Eggplant, or other leading test automation solutions out there, to ask what is a better way to do this that we can incorporate into our own testing. We're noticing the shift because open source isn't providing them the necessary means for complex end-to-end testing, or for supporting broader test coverage through automation, like parallel execution. A lot of that is limited with manual testing, or even with open source, when there are a lot of complex systems they're looking to test, and you're just not going to be able to do that with the wrong solution. And to your point, more unification is needed; having this disjointed set of, you know, seven different solutions across various departments is just not an optimal path forward. Again, going back to my earlier point, there needs to be that unification across the team around what's the best solution for us that all of these individuals are able to utilize in a way that will promote the best results for the business.

[00:06:08] Joe Absolutely. I also saw a recent study by Forrester, maybe a week ago, that talked about how enterprises are really looking for this, almost like an all-in-one platform to help bring everything together in one place. Collaboration seems like a big trend for sure. So when you talk to executives who have been around for a while, do they talk about things they've been burned by, like, oh, we tried automation before, it doesn't work? Are there other barriers a tester needs to overcome to convince them, when they've heard something before and maybe the executive is a little hesitant to go forward again with something?

[00:06:44] Autumn Joe, too often we hear that, right? You're met with a lot of executives who tried to go down this path, didn't properly plan for it, weren't on a unified front for these objectives, and it failed. And then they associate the failure with test automation. That's not the case; there are so many other factors that go into why it failed. One, and I think it goes back to even outside the testing tool, is: are you doing the proper planning up front to identify the primary objectives for this? Especially as test automation is newly adopted into an organization, you need to have the right individuals testing it, who have some sort of experience, or at least the proper training, because if you don't have that, it's going to fail. So if you're re-introducing test automation to an organization, make sure you know who you're speaking to. Are they already equipped to utilize test automation, and do they have experience with it? Do they have the necessary training, and is there a proper project plan and rollout plan to adopt this as one of the business objectives? If you don't set that core in concrete up front, it doesn't matter which solution you're adopting; it's most likely going to fail. So if you're empowering your QA team with that guidance, as the representatives bringing in these test automation tools, they'll be much more likely to have the buy-in from those key stakeholders to adopt a solution again.

[00:08:13] Joe Now I think that segues pretty well into the four top reasons enterprise testing fails. You mentioned lacking technical resources. Once again, this goes back to a shift I've seen back and forth: a tool, e.g., record and playback, gets pitched as being that easy, and that didn't work. Then it's, oh, testers are developers, they need to have hardcore development skills, and then you try to fill these roles with a tester who knows Java and it's almost like a unicorn; you never find them. So does that feed into the lack of technical resources, that type of approach?

[00:08:47] Autumn It does. And it's putting this pressure on these individuals that really sets them up for failure, because they don't have the necessary means for testing, or just the bandwidth. So many times I speak to clients who are like, we need support to roll this out, we just don't have enough resources to actually be scripting all of this or building out this tool. So one thing you really need to be assessing is: are there enough bodies to sustain these projects? And then, do they actually have the proper training for this? And going back to your question around the resources: if they're not equipped with any knowledge around test automation, how do we get them to accelerate that training and adopt those methodologies when it comes to new testing tools and platforms, and what's the best way to do that?

[00:09:37] Joe I think what adds even more difficulty is time. Time seems to be speeding up, and companies seem to be doing more pivots. Maybe a company was focusing on one approach, and then all of a sudden it's like, oh, the business is investing in this other area, let's go over here now. Do you see that as something that causes failures as well: a lack of time for automation, because efforts are all over the place rather than focused on a good process that's been in there for a while?

[00:10:04] Autumn Yeah, it's a great question and a good point you bring up, because again, not to beat a dead horse, but if you don't have a unified front around, here are the primary objectives, that falls apart; we see scope creep all the time, or other priorities will rise because there are already things in production. We have to keep in mind that when you're going through an evaluation to bring on a new solution, you have to keep supporting the existing testing that you have going on in production today. So there's a fine balance between the two: making those shifts and advancements while sustaining and supporting what already exists. A way to prevent this, or at least help when finding new solutions, is ensuring that those key stakeholders have a clear vision, that it is properly relayed to those teams, and that everyone is unified on the phase one, high-priority items that we need to get rolled out. A lot of times I've worked with clients where, through the initial discovery, they want to boil the ocean with everything they're looking to automate, or they're really overshooting what should be phase one when bringing on a new solution for testing. It's an investment of time and dollars, and the organization needs to understand that you're not going to be able to fix everything overnight. But if you do the most critical things early on, you'll have better success.

[00:11:32] Joe Great. So definitely time is one of the top reasons that enterprise testing fails. I guess another one, and you mentioned this, is end-to-end testing complexity. A lot of times at the enterprise level you need to test a mainframe (people don't realize there are still mainframes), you need to test a web browser, a mobile device, and sometimes hardware. And I think a lot of times with open source tools, it's fragmentation: you have Selenium that only does browsers, you have Appium that only does mobile, and then you get this kind of Frankenstein approach to testing, which adds more complexity because you need to do end-to-end testing, probably in one workflow. Is that something you see as an issue as well?

[00:12:09] Autumn We do, and I see this often because I specifically support our enterprise banking clients, but it can apply to all leading enterprise organizations, because a lot of times you have these legacy products, applications on mainframes and green screens that you need to be testing, and then a more modernized technology stack. So you have these applications where you're doing UX/UI testing across browser stacks, and again, going back to manual testing or something like open source, it's not robust enough to be testing all those complexities end to end. And so you're losing out on what the user experience could actually be. That's a gap, that's a miss when it comes to testing, and it can be a significant challenge. Working with a leading POS organization, we found that they weren't able to test across their POS systems. Without doing that, they weren't really seeing, from soup to nuts, here are some of the areas where we have gaps; here is how an individual is actually using this, from the hardware to the actual software, and what the inner workings of that are. With a more robust test automation tool, you can capture that more adequately. And I think it goes back to one size not fitting all at times. Again, it depends on the infrastructure that you're dealing with, what the preexisting tools are, what the complexity will be of replacing any of that existing architecture, and really understanding that plan up front to do the proper rollout.
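For readers who want to picture the stitched-together approach Joe describes, here is a rough, hypothetical sketch in Python: Selenium drives the browser leg of a workflow, and a plain REST call checks the back-end system. All URLs, element IDs, and field names are made up for illustration, and the mobile or mainframe legs would need yet another tool, which is exactly the fragmentation being discussed.

```python
# Hypothetical end-to-end check spanning a web UI and a back-end API.
# URLs and element IDs are invented for illustration only.
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Browser leg: place an order through the web UI with Selenium.
    driver.get("https://shop.example.test/checkout")
    driver.find_element(By.ID, "submit-order").click()
    order_id = driver.find_element(By.ID, "order-id").text

    # Back-end leg: confirm the order landed, via a REST call.
    # A mainframe or mobile leg would require a separate tool entirely.
    resp = requests.get(f"https://api.example.test/orders/{order_id}")
    assert resp.status_code == 200
    assert resp.json()["status"] == "RECEIVED"
finally:
    driver.quit()
```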

[00:13:50] Joe That's a great point. Especially as more companies go cloud native, it seems like you have to hop from one system to another and interact across systems, and that could cause issues as well, like being able to communicate with a third-party system, where you start off maybe in a web browser, then you have to do an authentication and then go somewhere else.

[00:14:07] Autumn And look at going from mobile to desktop and all of the different ways that, as users, we can interact with these organizations now. They need the proper way to test not just external but internal workflows as well, because those internal workflows matter for their own people, like their CRM systems, or anything from tracking to robotic process automation. So if someone is filling out a form externally and that data is coming in internally, is it properly getting transferred and loaded, and what is that process? At times that's not captured, or there are limitations when it comes to reporting insights, so you're not really sure which areas within the applications are getting the most use from the user. And so you're going in kind of blind and assuming, here is the way a user is using our application, but really you haven't tested so many other areas.

[00:15:02] Joe How do you help testers or companies get over this end-to-end testing complexity? This isn't a promotion; it's just that I've used Eggplant at a company I worked for. I don't know if Eggplant is still like this, but it used a really interesting approach where it was more image based, so it allowed you to have almost a discrete environment, and because it was image based, it was able to interact with all these types of technologies. Is that the type of approach you've seen work, or is that still the vision of Eggplant and what you all have been doing to help with end-to-end testing?

[00:15:32] Autumn It is. And I think that's been really beneficial, because the method we're testing with, through OCR (optical character recognition) and image recognition, allows us to test anything that's on the screen just like a user would, as opposed to something like open source, where there can be limitations because you're relying upon the DOM, and of course there are going to be changes. And again, if there's any disconnect between the dev team and product and something gets rolled out, instead of not being able to capture that or having to start from scratch, you have a more resilient way of testing. The other differentiator that we're seeing, not just through Eggplant but through other testing tools, is a model-based approach to testing: creating a digital twin, an emulation of the application, within your testing. What we'll find is that, with the complexity of end-to-end testing, you can create this primary model that tests the application, plus sub-models that then test the end-to-end workflow, and it helps these individuals run their directed tests but also exploratory testing. So it's opening up a greater scale of visibility into the actual coverage, interactions, and workflows across all of these different tools that make up an organization.
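To make the image-based idea concrete, here is a minimal sketch using the open-source pyautogui library (this is not Eggplant's actual scripting language, which isn't shown in this episode, and the reference screenshot file name is hypothetical). The point is that the script matches what is drawn on the screen rather than querying a DOM, so the same technique can drive a browser, a desktop app, or a remote green screen.

```python
# Minimal image-based UI automation sketch with pyautogui.
# "submit_button.png" is a hypothetical reference screenshot of the
# button we want to click.
import pyautogui

# Locate the button by matching the reference image against the screen,
# instead of looking it up in a DOM or object tree.
location = pyautogui.locateCenterOnScreen("submit_button.png")
if location is None:  # older pyautogui returns None; newer versions raise
    raise RuntimeError("Submit button not found on screen")

pyautogui.click(location)

# Type into whichever field now has focus, just as a user would.
pyautogui.typewrite("order-12345", interval=0.05)
```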

[00:16:50] Joe So can you talk a little bit more about this model-based approach? Does it sniff out the interactions of your application and then know the paths you need to test? Is it that smart, or is it more that you test once and then you can just add things visually, to add on to that model you already have?

[00:17:06] Autumn So once you create it and you've run it, individuals will go through and say, okay, we'll identify to the machine, here are the directed tests. But if you then let it run freely through exploratory testing, it will continue to build upon that intelligence through AI. So instead of manual testing, where it's, okay, here are 400 tests that we know we have to run and it's taken us hundreds of hours, if not more, by letting this run through exploratory testing, if you look at Eggplant, you're capturing 4,500 different user journeys that are identified. The ones you hadn't covered are no fault of the developers; it's just that there's not enough time in the day or manpower to be testing all of those possible user paths. So by letting the tool run, you're identifying all of the other possible user journeys, but also getting more intelligent testing, with a heat map to identify: hey, within this particular area of the application, you've actually not tested this at all, and we're finding that it's getting critical interactions from your users.
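As a rough illustration of the exploratory, model-based idea Autumn describes, here is a toy Python sketch: the application is represented as a small graph of screens and actions (all hypothetical), random walks over the graph generate user journeys, and a visit counter stands in for the coverage heat map. Real tools derive the model from the application itself and use far smarter exploration than a random walk.

```python
# Toy model-based exploratory testing sketch. The states and transitions
# below are invented; a real tool would build this model from the app.
import random
from collections import Counter

MODEL = {
    "home":      ["login", "search"],
    "login":     ["dashboard", "home"],
    "search":    ["results", "home"],
    "results":   ["product", "search"],
    "product":   ["cart", "results"],
    "dashboard": ["search", "home"],
    "cart":      ["checkout", "product"],
    "checkout":  ["home"],
}

def random_walk(start="home", steps=10):
    """Generate one exploratory user journey by walking the model."""
    path = [start]
    for _ in range(steps):
        path.append(random.choice(MODEL[path[-1]]))
    return path

# Generate many journeys and count screen visits: a crude stand-in for
# the coverage heat map that highlights untested or under-tested areas.
visits = Counter()
for _ in range(1000):
    visits.update(random_walk())
print(visits.most_common())
```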

[00:18:13] Joe I'd say it's almost risk-based testing. Like, you have a risk here you probably didn't even know about. Now you do, and you know it automatically because it bubbles up that insight, right?

[00:18:21] Autumn Right. So you're catching that even before you get to the next stage where it's actually a bug. And you may identify bugs you didn't even know about, because now you're able to test that area that was a miss before.

[00:18:34] Joe So we went over time as one of the killers of enterprise projects, and end-to-end testing complexity. But a lot of times I see companies, when they're starting and planning, don't look at their full roadmap, so they may start out with a greenfield application and not realize that two years down the line you're going to have to integrate with, I don't know, a mainframe or a third-party service. Is that something people should also look at in the planning phase to help with their testing capabilities?

[00:19:00] Autumn Of course. You have to be thinking about the current state for the most important projects you have at hand, but also how this will affect other areas within the business and other teams. A lot of times it will be siloed, so we'll be working with an initial team that's dealing with a desktop application or a new application, like a mobile-based application. But you have to be considering: if we're adopting this new tool within the organization, can it be utilized across all of these other areas within the business? So I think ensuring that you have a more robust solution, like an Eggplant, to be able to test all those complexities is so important.

[00:19:44] Joe So another missing piece, and I think it would be the third biggest killer of enterprise projects, is collaboration and visibility. Can you talk a little bit about how collaboration should work, especially nowadays with modern software, or how it helps ensure a project is successful?

[00:20:03] Autumn Well, I think what's so important up front is ensuring that as a team you're asking yourselves the right questions: things like, what sort of areas do we need to be testing? What are the highest-priority areas? What kind of skill set is required here to adopt this? There's a lot of pre-discovery that I think needs to go on as part of that planning to be successful, and really, as an organization, there has to be that willingness to invest in this new methodology, or in advancements when it comes to testing, or in the right testing tools. So when you're going through that project planning, all parties really need to be involved to ensure they're choosing the right test automation solution. Because what you want to prevent is getting into that testing phase and realizing you've selected the wrong tool, and you've been there, right? Unfortunately, so many teams have had to backtrack and do rework, and that's a huge undertaking for many teams. The way to prevent that, which can be painstakingly long, is really aggressive planning up front, and again, from the key stakeholders down.

[00:21:16] Joe So this may sound all gloom and doom, but, like in a bad economy, if testers are making what they're doing visible and showing how they're adding quality to the product, to give the C-suite insight into what's happening, do you see that as a key advantage? Testers using a tool that allows multiple people, developers, stakeholders, testers, and maybe C-level executives, to have different views into the data and get the right insight they need, so they can say, okay, I understand, I don't need to go into the nitty gritty, but I know that this team found four bugs before the product even went to production, so that's a win, or something like that.

[00:21:50] Autumn Absolutely. As a tester, I would want as much visibility as possible being rolled up, because it really shows all of the work that they've done: here's how many hours you've been putting into this testing, here's how many defects or bugs we've identified. A lot of times they don't have that reporting, so it's manual inputs and sometimes guesswork, and all of the effort and success they've had actually isn't visible to the executives. So, going back to collaboration, it's more: here is where we are; for the areas that are misses, how do we fix this as an organization versus continuing to repeat the same mistakes; and also for QA to have that visibility: here's all the success we've had rolling this out, let's repeat this, and here's how we do that.

[00:22:41] Joe Absolutely. You know, I worked with a team that used an Excel sheet, and every time something failed in CI, they'd list out the failures and try to find out why. But then every month they'd have to go in front of the executives to say, okay, this was not a real issue, this was a test script issue, these are the real issues, and it was just a mess.

[00:22:59] Autumn No one wants to be supporting an Excel doc these days, yet they do. And that goes back to the point that any kind of manual effort you can alleviate through automation needs to be adopted in today's fast-growing digital world. We have the tools available to us, so why not automate these processes? You want to take these manual efforts off of brilliant developers so they can focus on the more important projects at hand that, in the long run, will drive greater ROI for the business and greater innovation.

[00:23:31] Joe I love that. I say it all the time: you should be focusing beyond just functional testing. Anything you can do to automate a process to help deliver software better and faster is something you should be focusing on, and I think that takes you beyond just looking at that functional test. I think that's a great point. So we went over how time is a killer, end-to-end testing complexity can kill an enterprise project, and collaboration too. The fourth one, I would think, is resources and coverage: the challenge many enterprise organizations have in making sure they have skilled resources who are accustomed to agile testing and test automation. How do you handle the resource and coverage piece that probably causes a lot of these failures for these projects as well?

[00:24:16] Autumn We're seeing this too often. You nailed it: enterprise testing teams are struggling to have experienced test automation experts available, and this is one of the primary challenges. So it goes back to understanding the business. Who do we have in place today that can actually adopt and use these types of tools? Where is the gap, how do we train them properly, and what will those efforts take? You'd be surprised: we'll go through an evaluation, then we get to the implementation phase and ask, okay, how do you plan to implement this, and they're like, we have one guy, Gary. And so you need more support. It goes back to that collaborative team effort to identify where there will be gaps once we adopt this, and also what ways we have to acquire additional resources. Maybe it's through the team we're working with that's providing this test automation tool, or maybe it's outsourcing some resources to help fill those gaps, because there are a lot of resources out there to support these needs, but the maintenance alone will kill them if you don't have that in place. Even with what they're already supporting today, they're struggling, and then it's important to incorporate test automation. But again, if this is newly adopted into a team and they don't have the right resources, it can be a challenge up front. So a lot of that planning needs to be properly vetted so you can roll it out effectively.

[00:25:51] Joe I think testers need to take a little more ownership of this. A lot of times I was the automation guy, and I couldn't take any days off, or people would put me on code reviews and then mark me as a blocker because I was so busy. So it's in your best interest also to train up your team, to share the wealth, to get the whole team involved, it sounds like.

[00:26:08] Autumn I agree. I think it's our job, though, as the individuals who are experts within the industry at providing test automation solutions to our clients, to help them understand this. And I feel confident taking that responsibility, because typically the individual we're working with to adopt a solution would be a director or IT manager, and if they're not adequately aligned on this, it can be a challenge and you need to train them. So to your point, you might be leading a team, and you need to feel confident in the solution to pass those learnings on to your team. If you're effectively doing that, it will really help your team be successful in the long term.

[00:26:46] Joe Absolutely. Also, you mentioned training; product training is critical. Once again, I worked for a company that used an enterprise tool, and they didn't know all the functionality that was in it, so they would spend months creating their own methods to do stuff that was already built into the tool; they didn't know the tool already did it for them. Do you see that as a gap as well, where people buy a tool and it's, okay, here's a tool, go at it, versus, okay, let's roll this out so you know exactly what the tool does and how it can help you?

[00:27:09] Autumn I do. That's why it's important to ask not just, are we selecting the right tool, but are we investing in the right organization to help us prevent that? Because, going back to my earlier point, it's the vendor organization's responsibility to help those clients be successful: here's what rolling this out will look like, here's how we flesh out that plan, and here's how we continue to support you, because you're going to need that. And again, it's no fault of the teams, like what you were saying; you had such a full plate already, so bringing on a new test automation solution can be a heavy lift at times. You need to have the right resources available through those organizations, and it's our job to provide that. I think ensuring that it is available to the teams is pertinent.

[00:27:59] Joe Okay, Autumn, before we go, is there one piece of actionable advice you can give to someone to help them with their automation testing efforts, and what's the best way to contact you or learn more about Keysight's Eggplant or other products by Keysight?

[00:28:10] Autumn Joe, I think the most important piece of advice is ensuring that test automation is incorporated at the early stages of testing. So again, we talked about that shift-left mentality, and I think ensuring that it's brought in as early as possible, to help alleviate a lot of that manual testing, will greatly help so many teams and prevent them from making the same mistakes. And lastly, I would just say to any of these C-level execs or key stakeholders tuning in: make sure you are supporting these QA teams and developers so they can make the proper advancements when it comes to testing and alleviate these challenges for their team.

[00:28:51] Autumn And for more information or to contact us, you can go to Eggplant.com. Feel free to reach out if you have any questions.

[00:28:59] Joe Thanks again for your automation awesomeness. For links to everything of value we covered in this episode, head on over to testguild.com/a433. And if the show is helping you in any way, why not rate and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed in creating full-stack automation awesomeness. As always, test everything and keep the good. Cheers!

[00:29:35] Joe Thanks for listening to the Test Guild Automation Podcast. Head on over to TestGuild.com for full show notes, amazing blog articles, and online testing conferences. Don't forget to subscribe to the Guild to continue your testing journey.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
Nicola Lindgren Vernon Richards TestGuild Automation Feature

The Software Tester’s Journey with Nicola Lindgren and Vernon Richards

Posted on 12/22/2024

About This Episode: Today, we dive deep into how to advance your career ...

Alex Kearns TestGuild DevOps Toolchain

Leveraging GenAI to Accelerate Cloud Migration with Alex Kearns

Posted on 12/18/2024

About this DevOps Toolchain Episode: Today, we're diving deep into how you can ...

Three people are pictured on a graphic titled "AI Secrets You Should Know." Set against a striking red background, the image features the ZAPTALK logo in the top left corner, highlighting discussions on AI and automation.

The Secret to Embracing AI and Automation (ZAPTALK EP 02)

Posted on 12/17/2024

About Episode Join Alex (ZAP) Chernyak, Joe Colantonio, and David Moses in episode ...