Leveraging GenAI to Accelerate Cloud Migration with Alex Kearns

By Test Guild

About this DevOps Toolchain Episode:

Today, we're diving deep into how you can leverage GenAI to accelerate cloud migrations with our special guest, Alex Kearns from Ubertas Consulting.

As an AWS ambassador and cloud expert, Alex shares his insights on leveraging generative AI to help with cloud migrations, enhance security, and extend customer relationships.

We explore AI's role in modernizing deprecated software, automating system analysis, and the ethical considerations surrounding this powerful technology. We'll also touch on practical case studies and emerging trends, like the use of multi-agent workflows and the future of prompts as intellectual property.

Whether you're facing challenges in project estimation or looking to integrate AI into your CI/CD pipelines, this episode is packed with actionable techniques and tools for enhancing performance and reliability in your DevOps journey.

TestGuild DevOps Toolchain Exclusive Sponsor

Are you ready to level up your DevOps game and crush those quality challenges?

Whether you're dealing with flaky tests, scaling automation, or trying to integrate security into your pipeline, we've got something just for you.

Introducing the DevOps Quality Testing Playbook from TestGuild! 🚀 This isn't just another PDF—it’s your go-to guide packed with actionable insights, best practices, and strategies to help you create a bulletproof DevOps toolchain.

It’s built specifically for engineers, testers, and DevOps teams who want to optimize their workflow and drive continuous quality throughout the pipeline. The best part? It’s free and ready for download!

So, don’t miss out. Head over to https://testguild.me/devopsbook and grab your copy today.

Stay ahead in the game, optimize your pipeline, and let’s crush those quality challenges together.

About Alex Kearns


Alex leads the consulting function at Ubertas Consulting (part of Devoteam), responsible for the quality of advisory services and pre-sales activities. He is also an AWS Ambassador, one of around 300 individuals globally recognised for their thought leadership and advocacy of AWS.

Connect with Alex Kearns

Rate and Review TestGuild DevOps Toolchain Podcast

Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:00] Get ready to discover some of the most actionable DevOps techniques and tooling, including performance and reliability, from some of the world's smartest engineers. Hey, I'm Joe Colantonio, host of the DevOps Toolchain Podcast, and my goal is to help you create DevOps toolchain awesomeness.

[00:00:18] Hey, you've probably heard a ton about Gen AI. But how about Gen AI and how you can use it to help accelerate cloud migrations? Well, you're in for a treat, because that's what we're going to be talking all about today with Alex. Alex leads the consulting function at Ubertas Consulting, where he is responsible for the quality of advisory services and pre-sales activities. He's also an AWS Ambassador, one of around 300 individuals globally recognized for their thought leadership and advocacy of AWS. If you have anything to do with AWS and you want to learn more about Gen AI and how it can help you with DevOps, and especially cloud migrations, you don't want to miss this episode. Check it out.

[00:00:55] Are you ready to level up your DevOps game and crush those quality challenges? Whether you're dealing with flaky tests, scaling automation, or trying to integrate security into your pipelines, we've got something just for you. Introducing the DevOps Quality Testing Playbook from TestGuild. It isn't just another PDF. It's your go-to guide packed with actionable insights, best practices, and strategies to help you create a bulletproof DevOps toolchain. It's built specifically for engineers, testers, and DevOps teams who want to optimize their workflow and drive continuous quality throughout their pipelines. The best part? It's free and ready to download, so don't miss it. Head over to Testguild.me/DevOpsbook and grab your copy today. Stay ahead of the game, optimize your pipelines, and let's crush those quality challenges together.

[00:01:45] Joe Colantonio Hey Alex, welcome to the Guild.

[00:01:49] Alex Kearns Hey, great to be here. Thanks for having me.

[00:01:51] Joe Colantonio Great to have you. I guess before we get into it, is there anything in your bio that I missed that you want The Guild to know more about?

[00:01:57] Alex Kearns I tend to publish content across a number of mediums. Happy to share links to LinkedIn, but I also publish blog posts at wellarchitectedadvocate.com, focusing very much on how the latest technologies relate to the AWS Well-Architected Framework: how you can leverage them without sacrificing best practice around things like security, cost, reliability, and, with AI specifically, sustainability being a really important one now. But yeah, that's where to find me, and I'm always happy to chat if you drop me a message.

[00:02:31] Joe Colantonio Awesome. We'll have links for all of those in the comments below. All right, Alex, maybe give a quick rundown of Gen AI and its role in DevOps. I know when a lot of people hear Gen AI, from my point of view, it's mostly testing, but how does it relate to DevOps, or how do you see it growing in DevOps in general?

[00:02:50] Alex Kearns I really see generative AI as something that can touch on all elements of the system development lifecycle, all the way from the planning stage through to actual implementation, testing, and operation, and then the maintenance and modernization of applications as well. As you said, Joe, you've looked at testing before, and some of the capabilities generative AI can bring in testing, around writing tests and writing documentation to help people understand what tests need to be written, are really valuable. People will be aware of tools like GitHub Copilot and Amazon Q Developer for the actual implementation of systems: code assistants helping you auto-complete whilst having the context of what the application is doing, rather than just the syntax in your editor. For me, some of the less talked about areas where generative AI can help are those early and late stages. It's the planning; it's how you understand a system as a whole, especially when you've got organizations who built these systems a few years ago, maybe decades ago, where the human effort of analyzing a system is potentially weeks or months. If you can leverage generative AI to give you a high-level, high-quality analysis of those systems without needing to read through every single line of code, there are definitely ways it can accelerate that process.

[00:04:25] Joe Colantonio Well, that's a great point right there. I know when I was at a company, GE, they had a system that was literally over 30 years old. And you couldn't just look at the code and know what the heck was going on, because it was doing all kinds of nutty things. So you're saying, and that's a great point, you could use Gen AI to kind of summarize: all right, give me the background behind this code. What are the most important areas that I should know about? How does it interact with things? Is that what you're saying? Like someone onboarding: hey, use Gen AI to learn a little bit more about what you're going to be working on.

[00:04:53] Alex Kearns Exactly. So, one of the customers I've been working with recently, and I've given a talk about this a couple of times, we used generative AI for analysis of scripts. These were all kinds of intertwined scheduled tasks running on premises, written a number of years ago, and a lot of the people who had written them were no longer with the business, so there were silos of knowledge. And we did a bit of a sample: if we sit down and read through each of these scripts one by one, document it, understand where the dependencies are, all the important things for migration, thinking about what databases they interact with, what network connectivity is required, what does the script do, even the most basic things, as a human, I think we estimated the effort to go through all of them would have been around two weeks. Two weeks in the scheme of a multi-year migration isn't terrible, but in this case it was 250 scripts, so 250 scripts in two weeks. On other projects there might be a thousand scripts, two thousand scripts, as you get larger and larger. And we used generative AI to do analysis of those scripts, understand what those dependencies were, understand what connectivity was required, and output it in a format that was structured. Taking unstructured data, being source code, and turning it into structured analysis that was then presented in a spreadsheet and really easy to filter on things like: this script accesses this database, but it only accesses it with read-only access, for example. All things that are really valuable for planning how to migrate and in what order to migrate. An example being, if you've got a load of scripts that are only reading from a database, then you know there's a different approach to take to migrate those, because you haven't got to think about data consistency: they're going to read, they're going to come out with a result, and it's very easy to test whether that result is what it should be.
You could run them in parallel to the legacy system on premises. Those scripts that were writing to a database can't run in parallel; you have to think about replicas of databases that you can test against, shadow copies, very different approaches. But without that analysis, you'd never get there.
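The approach Alex describes, turning per-script model output into structured records and then filtering them like a spreadsheet, can be sketched as follows. The record fields, script names, and the read-only filter are illustrative assumptions, not the actual tooling from the project:

```python
# Hypothetical shape of the structured records a model could be asked to emit
# for each legacy script (all field names are made up for illustration).
SAMPLE_ANALYSES = [
    {
        "script": "nightly_report.sh",
        "databases": [{"name": "sales", "access": "read"}],
        "network": ["db01:5432"],
        "summary": "Builds a nightly sales report.",
    },
    {
        "script": "sync_orders.ps1",
        "databases": [{"name": "orders", "access": "read-write"}],
        "network": ["db02:1433"],
        "summary": "Synchronises orders between two systems.",
    },
]

def read_only_scripts(analyses):
    """Return the scripts that only ever read from databases.

    These are the easy migration candidates from the episode: they can run
    in parallel against the legacy system and have their output compared,
    with no data-consistency concerns."""
    return [
        a["script"]
        for a in analyses
        if all(db["access"] == "read" for db in a["databases"])
    ]

print(read_only_scripts(SAMPLE_ANALYSES))  # ['nightly_report.sh']
```

Inverting the filter would surface the write-heavy scripts that need database replicas or shadow copies before they can be tested.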

[00:07:18] Joe Colantonio All right, so that's another great point. Once again, going back to the days when I used to work full time: we had this wiki, and every time there was a change I'd have to update it. We'd start off strong, we'd have nice wiki documentation, and then over time it would degenerate and people would forget things. Sounds like you could use Gen AI to look at your scripts and update the wiki pages constantly, every sprint or every check-in, and that'll keep you up to date. That'd save a ton of time. It doesn't sound like a lot, but it is over time, maintaining all these wiki pages to make sure they're all up to date and refreshed.

[00:07:48] Alex Kearns Yeah. I mean, it's something that I think will increasingly become part of people's CI/CD pipelines, where you're raising a pull request and, as part of that pull request, a pipeline runs in the background and generates the documentation for you. I think there are definitely some people in the industry that see generative AI as a silver bullet, and there are definitely organizations that are probably leaning a little too far into the generative AI hype. However, as we've seen with all technologies over the last few years, you kind of have to go through that cycle where it's hyped up to the point of people almost being fed up of it to find those really valuable use cases. And that's where, in the next 12 to 18 months, I think we'll go through the rest of that hype cycle, where people get to the stage where they're just sick of hearing about it, much like things like serverless. If you go back to 2016 and 2017, cloud providers would slap the word serverless on every single service, even if it wasn't really, truly serverless. And Gen AI is very much going through the same thing at the moment in the cloud: everything that's released is powered by AI in one way, shape, or form. But give it 18 months and I think we'll get to the stage where the really valuable use cases start to shine and it just becomes another tool that developers can leverage to make their lives more efficient.
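A rough sketch of what such a docs-on-pull-request step might look like, with the model call stubbed out (a real pipeline would call a hosted model such as Amazon Bedrock here) and the file layout invented for the example:

```python
from pathlib import Path

def summarize_source(source: str) -> str:
    # Placeholder for a real model call; it just echoes the first line so
    # the shape of the pipeline step is clear.
    lines = source.strip().splitlines()
    first = lines[0] if lines else "(empty file)"
    return f"Auto-generated summary. First line: {first}"

def generate_docs(changed_files: dict, out_dir: str) -> list:
    """Write one Markdown doc per changed file, return the doc names.

    In CI this would run on the files touched by the pull request and
    commit the generated docs back, keeping the wiki from going stale."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path, source in changed_files.items():
        doc = out / (Path(path).name + ".md")
        doc.write_text(f"# {path}\n\n{summarize_source(source)}\n")
    return sorted(p.name for p in out.iterdir())
```

Swapping `summarize_source` for a real model call is the only change needed to make this a genuine documentation generator; everything else is plumbing.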

[00:09:17] Joe Colantonio All right, that's a good point. I run an online event every year, and when I run a survey, people choose that they want to learn more about AI in testing. And then in the comments, they say: I am sick of AI, I don't want to hear any more about it. How do you know when it is hype and when it's not? I keep hearing it's hype, and then I hear other people saying, no, it's totally going to change everything and you need to get on board now. How do you get to the middle and know what the real use of it is and how it really can help?

[00:09:43] Alex Kearns So I'm fairly cynical. When people say things are going to be completely game-changing, I generally take it with a pinch of salt, because sometimes the people saying it's game-changing are the people who have a vested interest in it being game-changing. Look at the developers of these large language models: there's no doubt that the capabilities coming from them are incredible. Think back to what ChatGPT could do on day one of release and compare it to what's possible now; I don't think I can remember another technology that's advanced that quickly. But it will come with challenges. It will come with the usual challenge of scarcity of resources, where people are fighting to buy GPUs at scale to train and run these models. What I'd like to see is to make sure that this technology doesn't become inaccessible to the masses. I think it needs to be commoditized to the point where someone can go on, try something out, and build their own assistant. And to bring it back to migration, that's how that script analysis piece started. I was faced with doing some analysis and thought, I'll just try one script; I'll see what the quality is, how close it can get to the human analysis that had been done. And then I very quickly realized that with some tweaking, it gets as close, if not better in some cases. As I said, I'm fairly cynical about most things, and I think this is a transformational technology. I see it probably on the same level as cloud, public cloud. But then everything goes in cycles. You've got organizations speaking more publicly at the moment about exiting the cloud and going back to on premises. And in ten years, could we see people saying, actually, we're going to stop using generative AI and go back to this because of X, Y, Z? Maybe.
I think if you could predict those kinds of events, you'd be very, very rich. But for me, generative AI is applicable to almost every element of the migration lifecycle; it's not just limited to certain bits.

[00:12:13] Joe Colantonio Right. So you mentioned you are cynical, but then you mentioned you did some tweaking. I think a lot of developers and testers are cynical: they try it one time and they don't tweak it, and they just say, it's garbage, it's hype. How do you know that the output you're getting from the Gen AI is what you want, and how do you keep going to maybe make it better, or not give up on it?

[00:12:33] Alex Kearns I took a bit of an iterative approach. With this particular one, I started off with a really, really simple prompt. The prompt was: take this script, analyze it, and just give me a summary of what tasks it's performing, on the basis that if it can't give me an accurate description of what task the script is performing, then the chances are it's not going to be able to give me detail about database tables, access patterns, or possibilities for modernization. Very much an almost minimum-viable-product approach: starting small, looking at what we can do to prove there might be some value here. Leading back to that commoditization comment I made, there's lots around, particularly for AWS, that I've seen that does help people get started with generative AI, either personally or professionally. AWS have got a website called PartyRock. PartyRock is a way to essentially build generative AI powered applications, but it uses generative AI to build them. You give it a prompt; you say, I want an application that takes a PDF as an upload and generates a satirical summary of it. You can have quite a lot of fun with it, and you can run workshops with customers and show that, yes, you can have some fun, but with just a small tweak to the messaging you can say: I don't want that satirical summary, I want a summary I can give to my CEO, something that's concise and gets the key points across, but for a very different audience. And something like that, which is completely free to use, no need to sign up for an AWS account, no credit cards, just go to the website and you can use it straight away: that's the kind of commoditization I think needs to stay, because ChatGPT is very close to becoming common speak for a lot of people, especially with the integration with iPhones that's landing this week, I think.
So suddenly generative AI goes from being this super technical thing, where some people have heard of ChatGPT but not really used it, to the technology behind the scenes in every new iPhone sold. It's quite scary how quickly this is moving. Even if I go on annual leave for two weeks, I come back and things have moved. Scary, but very exciting.

[00:15:11] Joe Colantonio Absolutely. All right, so when it comes to cloud migrations, I think you touched on this: are you saying use Gen AI in the early phases to help understand the application and look at the database patterns in the scripts to know any gotchas you might have when you go from bare metal to the cloud?

[00:15:28] Alex Kearns Yes. So at the early stage, it's about the analysis: can we do things to predict where we might have challenges later on? Examples would be things like understanding what's there, if you haven't got documentation from a customer, and analysis for modernization opportunities. As part of that migration, could we upgrade from this version of PHP to that version? Do we need to upgrade from this version because it's end of life? Those are the kinds of things you can give context around in the analysis. But I think that is just stage one. When you think about the migration process as a whole, the analysis phase only gets you so far, in that the more analysis you can do, the better. Generally, when you've got busy customers, getting time from them to do that detailed analysis is hard. Leveraging tools like generative AI is increasingly valuable, because customers can give you a bit of context around what something is doing, you can leverage the technology to do the detail, and then validate with the customers at the end to make sure that what you've come up with is correct. Everything you pick up there pays dividends later on in that cycle. From a commercial perspective, working for a consultancy, you want to make sure that you're working with a customer for a long time and not just for one project, and identifying those modernization opportunities helps to bring them into the conversation early on, so the customer's thinking about them. From a technical perspective, you might be thinking about things like security: if something was written ten years ago, the chances are it's using deprecated versions of software packages, and there will be common vulnerabilities and patterns that are no longer recommended as best practice. So being able to do all of that in the analysis stage is, again, super useful as you get further through the migration.
So imagine we work on the kind of three-stage model: an assessment, then a mobilization in the cloud, and then you migrate. In that mobilization phase, it's very much around getting your foundations in place. Let's make sure you're secure, you've got all the networking you require, everything that's a day-zero job is ready, and do a pilot for a particular application. As part of that pilot, you might think about things like: okay, we know that on premises this was running as, let's say, a Windows application. It's .NET, but it's a version of .NET Framework that's only compatible with Windows machines and can't run on Linux. You could look at using generative AI to upgrade that from .NET Framework to .NET Core, so you can free yourself from licensing costs and give yourself the ability to run it on Linux, and even go as far as generating things like Dockerfiles, or, if you want to run Kubernetes, how you would build a Helm chart. All of these things are so standard that they maybe just need a little bit of tweaking that AI can do with that context. I think it's something we will increasingly see at every stage.

[00:19:01] Joe Colantonio This may seem obvious; it just came to me as you were mentioning this. Developers are notoriously bad at estimating. Before you migrate, can you say: okay, can you estimate how long it's going to take? It may sound stupid, but I would think that would help with the first initial planning. How accurate is it at getting a good estimate?

[00:19:20] Alex Kearns I think on the estimating front, the one thing I have always seen is that poor estimates tend to come from the unknowns. It's the: what didn't we predict? And often, not predicting something isn't through not having the information available; it's perhaps not having the time to go and find out that information. If you're at the very early stages, before you've signed contracts with the customer, you're always a little tentative about spending months and months on pre-sales to understand all of this. You have to find that balance of what is the right amount of analysis to do accurate estimates versus what is too much analysis, such that if the customer doesn't sign, you've sunk way too much money into it. That, I think, is where analysis powered by Gen AI can go so much deeper and surface those unknowns with some further analysis. So I've built little generative AI agents with that open-source framework called CrewAI, where you can build a crew of AI agents. I've built a little proof of concept for myself around sales proposals. You get a proposal from sales that says: okay, the customer wants to do this, here's the timeline we're planning, here's the delivery approach, all of these kinds of high-level things. Then there are three personas as separate agents: one is a commercial persona, one is a technical persona, and one is a delivery or project manager persona. Each of these interacts with that sales proposal, understands it, and produces analysis and recommendations specific to its use case. The technical one might come back and say: this is great, but what you're proposing isn't possible in AWS, trying to catch those things early. And the project manager might come back and say: this is fine, but you've included nothing about risks or overheads for a project manager in this project.
That's quite a simple example. But once you start getting into these kinds of agents, and multiple agents working together in a workflow, you open the door for an awful lot more in terms of very specific use cases being chained together to give an end result. Even that is something that could be looked at as part of migration and modernization, where maybe the analysis of an application uses a large language model that's especially good at understanding code but terrible at writing code, because there isn't one large language model to rule them all. So you might have one agent that does that analysis, and a second agent that takes the analysis, takes the source code, and is amazing at writing code. Much like you would have in a team of people: one person that's great at doing discovery and understanding your customer's business concerns, another person that's great at writing code, someone else who's great at testing. You can use multiple agents to do different tasks. And that's becoming a bit of a trend for 2025: how we go from these conversational interactions with an AI system, your usual ChatGPT-style UI assistant, to something where AI is used as part of a workflow and is able to make decisions within that workflow about which agents it needs to call. You might have an overall agent that is responsible for deciding which of the other agents it needs to interact with, going with the approach of: here's a sales proposal, I need to know about delivery, I need to know about commercial risks, I need to know about technical risks, go and figure out how to do it. And it has enough intelligence to say: I'm going to use agent A for this, agent B for this, and agent C for this. That's something I've just started playing with, and it's looking to be really exciting.
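The orchestrator-decides-which-agents-to-call idea can be illustrated with a toy router. Frameworks like CrewAI drive this with real model calls; everything here, the personas, the keywords, and the routing rule, is a stubbed stand-in invented for illustration:

```python
# Keyword lists standing in for what a real orchestrating model would infer
# from the proposal text.
PERSONAS = {
    "commercial": ["price", "cost", "margin"],
    "technical": ["aws", "architecture", "database"],
    "delivery": ["timeline", "risk", "milestone"],
}

def route(proposal: str) -> list:
    """Decide which persona agents should review a sales proposal."""
    text = proposal.lower()
    return sorted(
        persona
        for persona, keywords in PERSONAS.items()
        if any(keyword in text for keyword in keywords)
    )

print(route("The timeline is tight and the AWS architecture is unproven."))
# ['delivery', 'technical']
```

In a real system, each routed persona would then run as its own agent with its own prompt and model, and the orchestrator would merge their recommendations into one review.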

[00:23:42] Joe Colantonio All right, so you're a consultant. You probably work in a bunch of different environments with different challenges, so over time this is probably second nature to you. You already have these custom prompts and agents ready to go for new clients. If someone's thinking of moving to the cloud and they want to use Gen AI, it sounds like it would be a leap to get to where you are right now. How do they get there? Is that when it makes sense to get a Gen AI consultant to come in and guide the ship, or when do they know they have enough people in-house to be able to handle this on their own?

[00:24:14] Alex Kearns As I've said, I very much view Gen AI as a tool, not a replacement. It's something that I think will change what people do; I don't think it will replace what people do. In the same way that whenever you're evaluating any project or digital transformation exercise, you already have those decisions around when do I bring in a consultant, and when do I bring in a new hire and do it in-house, I don't think that necessarily changes with generative AI. The only bit that maybe does change is prompts and the way you're using AI. At the moment, it's still quite experimental. People write prompts, people share prompts online; to begin with there wasn't much of a concept of prompt engineering, of making a proper craft of how do I improve the prompt to do what I want it to do. Something I think you'll see more and more of as time goes on is that prompts will become intellectual property for organizations. You might have a consultancy that specializes in migrations, and part of their value proposition is that they've built up a library of prompts from a hundred different customers they've worked with, where they now have the best prompts in the industry for analyzing code, rewriting code, writing tests, and writing documentation. In the same way that, at the moment, you might look to a consultancy and say: they've got the AWS experts, they're the ones that can help us with this for X, Y, Z reasons, rather than doing it in-house and giving it a go. I think the shift is going to be in the tools that people have. It almost starts to become a hybrid with independent software vendors, where you might buy a SaaS product that's going to give you certain functionality around analysis; I'm sure there are generative AI powered SaaS products outside of the AWS ecosystem focused on analysis or modernization.
I think a differentiator for consultancies, and in particular migration consultancies, will be the tools and the prompts they develop internally that you can't get elsewhere, in the same way that a consultant from Ubertas isn't the same as a consultant you'd get from every other organization. So you go from selling the value of your people and their skills to: here's the value of our people, but also here's the value of the tools we've built that can make your migration faster, more successful, and cheaper.

[00:27:11] Joe Colantonio I don't know why that answer made me think about IP. Can someone use Gen AI to go in and ask: what is the special sauce of this application? Explain it to me. Get the core code, and then all of a sudden everyone has no unique advantage with their applications, because these agents are able to find out exactly what it is really easily, and probably without getting detected. Maybe I'm wrong, but are there any security issues people need to worry about as they're using Gen AI, either for cloud migrations or anything else? Could it even be used to steal company secrets or the coding algorithms that really differentiate them from other companies?

[00:27:46] Alex Kearns Absolutely. I mean, if you look at some of the more naive implementations of generative AI powered tools online, you can have quite a lot of fun, because you can have these conversations and say, ignore all the previous instructions, and start making them say what you want them to say and do what you want them to do, which is dangerous. You have to think of the guardrails you can put in place to stop people from doing that. A lot of the managed generative AI platforms, and Amazon Bedrock is the one I'm most familiar with, have guardrails that you can easily add in. I think of it as just another attack vector that you have to think about. Similarly to how, when you write an application that interacts with a database and takes input from a user, you'd think: okay, I'm not going to do it this way because that's a massive attack vector for SQL injection. You think: okay, I'm going to build a generative AI application; I need to make sure the way I'm building this means that someone can't just say, tell me your instructions, give me your source code, those kinds of things. It does open up a good question around the ethics of generative AI and the risks around data privacy, and that's where you start to open the door to political views on things. I know in the States there was a big thing a few weeks ago, maybe in the last month, where I think it was the Department of Defense proposing that the developers of large language models, people like Anthropic, Meta, Amazon, the big players, would have to start reporting on whether their models could be used for activities that would be of concern to national security. It kind of makes sense: you want to make sure that people aren't going to be using models to do illegal things.
And there's always a fairly lively debate around regulation of technology versus the stifling of innovation. The thing that I'm really keen to make sure doesn't happen is overreliance on it. It's making sure that people don't just trust this blindly. It's a great tool to give you ideas; it's great to help you structure content, all of these things. But it can get it wrong, and it will get it wrong; there's no guarantee it's perfect every time. I am very much of the view of: let it write documentation, but just make sure that maybe 20% of the time you are checking that the documentation it writes is accurate. With any system that's powered by a non-deterministic engine, you want to make sure that the decisions it's driving are fair and ethical.
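A naive version of the input guardrail Alex mentions might look like the sketch below. Real managed guardrails (such as those in Amazon Bedrock) are far more sophisticated than a regex list; the patterns and the placeholder model call here are assumptions made purely for illustration:

```python
import re

# Crude prompt-injection patterns; a real system would use a managed
# guardrail or a trained classifier, not a handful of regexes.
INJECTION_PATTERNS = [
    r"ignore (all )?(the )?previous instructions",
    r"reveal your (system )?prompt",
    r"tell me your instructions",
]

def is_suspicious(user_input: str) -> bool:
    """Flag inputs that look like prompt-injection attempts."""
    text = user_input.lower()
    return any(re.search(pattern, text) for pattern in INJECTION_PATTERNS)

def guarded_call(user_input: str) -> str:
    """Run the guardrail check before the (stubbed) model call."""
    if is_suspicious(user_input):
        return "Request blocked by guardrail."
    return f"MODEL RESPONSE TO: {user_input}"  # placeholder for a real call
```

The point is where the check sits: between untrusted input and the model, exactly like input validation before a SQL query.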

[00:30:37] Joe Colantonio Awesome. Okay, Alex, before we go, is there one piece of actionable advice you can give to someone to help them with their AI cloud migration efforts?

[00:30:44] Alex Kearns I would say the absolute number one for me is just experiment. It's where everyone started. Make use of the fact that AI is becoming more commoditized and that every big player in the industry is pumping money into it. Find ways to improve your personal efficiency, because you can guarantee that if there are inefficiencies you're finding in your own migration processes, then other people in your organization will also be finding the same thing. Prove it once, test it, scale it, and monitor the results to make sure they are accurate and not biased.

[00:31:21] For links to everything of value we covered in this DevOps Toolchain show, head on over to Testguild.com/p174. So that's it for this episode of the DevOps Toolchain show. I'm Joe, and my mission is to help you succeed in creating end-to-end, full-stack DevOps toolchain awesomeness. As always, test everything and keep the good. Cheers!

[00:31:55] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com where you become part of our elite circle driving innovation, software testing, and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.

[00:32:39] Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.
