The Key to Mobile Testing (Hint: Analyze, Analyze, Analyze) with Simona Domazetoska

By Test Guild

About This Episode:

Welcome to the TestGuild Automation Podcast, where we dive deep into the world of mobile testing. In today's episode, we have a special guest, Simona Domazetoska from Tricentis, joining our host, Joe Colantonio, to discuss the future of mobile testing, AI, and accessibility.

Simona brings over five years of experience in the software quality assurance industry, specializing in leveraging mobile and AI test automation technologies for IT modernization. Simona and Joe explore strategies for scaling mobile application testing in enterprise environments.

Throughout the episode, they uncover various variables that can impact the performance of a mobile application, ranging from performance and UX-related issues to network and connectivity challenges. They delve into the complexity of testing mobile applications, considering different industries and specific KPIs or metrics for measuring performance. Simona also shares insights on the importance of visual and accessibility testing, as well as the role of machine learning in predicting and diagnosing performance issues.

Additionally, they touch upon the challenges faced in mobile testing, such as the need for comprehensive analytics and metrics. So, join us as we explore the key insights and trends shaping the future of mobile testing with Simona Domazetoska on the TestGuild Automation Podcast.

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step towards transforming your and our community's future. Check out our done-for-you awareness and lead generation packages, and let's explore the awesome possibilities together.

About Simona Domazetoska


Simona Domazetoska has 5 years’ experience working in the software quality assurance industry with a special focus on how organizations can leverage the latest mobile and AI test automation technologies to drive IT modernization initiatives at speed. Simona spends a lot of time crafting messaging, go-to-market strategies, and product research for the Tricentis platform.

Connect with Simona Domazetoska

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:25] Hey, it's Joe, and welcome to another episode of the Test Guild Automation Podcast. Today, we'll be talking with Simona all about the future of mobile testing, AI, accessibility, and beyond. If you don't know, Simona has over five years of experience working in the software quality assurance industry, with a special focus on how organizations can leverage the latest mobile and AI automation technologies to help drive IT modernization initiatives at speed, which is really critical, especially if you work in an enterprise. How do you scale, especially when you have all types of mobile applications you need to test? We've got you covered in this episode. Simona spends a lot of time crafting messaging, go-to-market strategies, and product research for the Tricentis platform. She has a lot of knowledge from working with a lot of companies doing a lot of things, especially around mobile. So if you want to learn more about the future of mobile testing, you don't want to miss this episode. Check it out.

[00:01:16] This episode of the TestGuild Automation Podcast is sponsored by the Test Guild. Test Guild offers amazing partnership plans that cater to your brand awareness, lead generation, and thought leadership goals to get your products and services in front of your ideal target audience. Our satisfied clients rave about the results they've seen from partnering with us, from boosted event attendance to impressive ROI. Visit our website and let's talk about how Test Guild can take your brand to the next level. Head on over to TestGuild.info and let's talk.

[00:01:51] Joe Colantonio Hey, Simona, welcome to the Guild.

[00:01:56] Simona Domazetoska Hey, Joe. It's a pleasure to be here. Huge fan of your show. Thanks for having me.

[00:01:59] Joe Colantonio I appreciate it. Really excited to have you, especially to talk about mobile testing. But before we get into it, is there anything in your bio that I missed that you want the Guild to know more about?

[00:02:08] Simona Domazetoska I think you covered it quite well. Thank you so much for that lovely introduction. I've been working at Tricentis for about five years, time flies, and I've been in the quality space for quite a while. So thanks again for the intro.

[00:02:21] Joe Colantonio Absolutely. So you seem to have a specialty around mobile testing. Why mobile testing? Is that something you're focusing on exclusively, or is it just one of your passion areas?

[00:02:31] Simona Domazetoska It's something we've been focusing on a lot at Tricentis, and mobile testing is a really critical initiative. Part of what I do at Tricentis looks at how businesses are accelerating their digital business initiatives, and mobile, particularly for those adopting mobile-first strategies, is critical. So that's certainly been fascinating to me, and I've learned a lot through the projects we've been working on at Tricentis. I think mobile is not just about software, it's about devices, and that goes into many different areas that I guess we'll be talking about. But that's basically why I'm into this topic.

[00:03:12] Joe Colantonio Love it. Love it. So you mentioned one key difference. It's not just about the software, it's also about devices. In general, how would you say mobile test automation differs from maybe software testing that most people may be more familiar with?

[00:03:24] Simona Domazetoska Yeah. When you're testing mobile applications, you're not just testing the software, but how the software interacts with the hardware, with the actual device. And there are lots of different things that can be happening within the device. If you think about it, you could be switching from a 3G to a 4G network, you could be logging into a public Wi-Fi, you could be receiving SMS notifications, the device might have low battery, you could be switching phone carriers. And not just that, but of course users are in many different locations. They could be at an airport, at an underground metro station, at a large shopping center, a football stadium, or even a remote oil refinery. I think that's precisely the challenge: how do you ensure that your mobile applications are working well, at high speed, in all these very complex, dynamic scenarios that any number of users could be in?

[00:04:22] Joe Colantonio Love it. So I know you speak to a lot of customers, like you mentioned, and you've noticed a lot of companies going through some sort of business acceleration. How many of them are aware of the challenges of mobile, or that they actually need a mobile testing strategy? Is that common now, or is it not as common as you would think?

[00:04:38] Simona Domazetoska Yeah, it's a really good question, Joe. It really depends on the organization, the size of the company, and the industry they're in. You've got to break it up a bit and ask, for instance, how mission-critical is mobile to the face of their business? Typically, the more consumer-focused the company and the more paramount the role that digital experience plays for the organization, the higher the likelihood that they already have quite a mature mobile testing strategy in place. If you think of retail, telecommunications, banking, the public sector, or even energy and utilities, any industry that's working on capturing customers and ensuring that customers have a friction-free experience will already have some type of mobile testing in place. Having said that, some companies are less mature in their practices, so they might be testing manually. We have seen companies that set up their own device labs, so they purchase a lot of phones and keep them in-house. While that's great and maybe saves some resources short term, what ends up happening is that teams spend more time maintaining those devices. We actually did a study: it can cost up to $200K to $400K a year just to maintain devices, because they overheat, you need to ensure security standards are in place, and employees can leave. So there are really a lot of pitfalls with that approach. If you really want to scale, you have to leverage a real device cloud. Having said that, there are more mature organizations that have adopted a real device cloud, but then there's another challenge, right? And that challenge lies in understanding what exactly went wrong in the mobile application. There's a lack of tools that provide analytics and very detailed metrics to understand what exactly went wrong. We think it's all about test, test, test, but actually it's all about analyze, analyze, analyze, fix quickly, and ship quickly. There are so many things that could be going wrong, and that's one challenge that we see with mobile testing today.

[00:06:47] Joe Colantonio What part of the analyzing piece do most people miss out on there? Are there any key metrics, like the battery life of a device, those types of key things people probably don't have insight into currently?

[00:06:59] Simona Domazetoska Yeah, there is an explosion of different variables, and we like to group them into different types. You could have performance issues, for instance slow page loads, animation issues, things like low frame rate. Then you have UX-related issues like blockiness, blurriness, colorfulness: how does the application actually look, what does the background look like? Then you have network and connectivity issues, which look at different Wi-Fi connections and geolocation coverage. And finally, device-related issues, which is what you mentioned, like battery drainage, CPU usage, or memory usage, and how the mobile application may be impacted by any one of those factors. Of course there are many more; I can't list them all. And not just that, but there are sometimes even custom KPIs that are specific to certain industries. If you look at online shopping, from browsing to shopping cart to purchasing, what does that whole end-to-end process look like? Or making a flight reservation online, from flight search to payment. Different industries need different types of metrics, and I guess that's sometimes the complexity: not only so many different mobile phones, but different types of variables that we need to look into.
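To make those device-related metrics concrete, here is a minimal sketch, assuming the Appium Python client against an Android device; the app package, activity, and Appium server URL are placeholders for illustration, not anything discussed in the episode:

```python
# Illustrative sketch only: pulling device-level counters (CPU, memory, battery,
# network) for an Android app with the Appium Python client. Package name,
# activity, and server URL are hypothetical placeholders.
from appium import webdriver
from appium.options.android import UiAutomator2Options

APP_PACKAGE = "com.example.shop"  # hypothetical app under test

options = UiAutomator2Options()
options.platform_name = "Android"
options.automation_name = "UiAutomator2"
options.app_package = APP_PACKAGE
options.app_activity = ".MainActivity"

driver = webdriver.Remote("http://localhost:4723", options=options)
try:
    # Each call returns rows of raw counters that an analytics layer can aggregate.
    for data_type in ("cpuinfo", "memoryinfo", "batteryinfo", "networkinfo"):
        rows = driver.get_performance_data(APP_PACKAGE, data_type, 10)
        print(data_type, rows)
finally:
    driver.quit()
```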

[00:08:19] Joe Colantonio So if someone's running on a device cloud and a test fails, it sounds like it could be more than just a locator that couldn't be found; it could be one of these numerous other things you mentioned, like performance, I guess, or a screen refresh that isn't working. So people are missing out on that insight. And even if they're using a device cloud and a test fails, they still have to do research, and it may take a long time because they don't have these types of analytics available.

[00:08:45] Simona Domazetoska Yes, that's true. It depends on the real device provider they're with and the breadth and comprehensiveness of the metrics that provider offers. But I guess that's precisely the challenge: not all real device cloud providers have all those complex metrics available, and sometimes it comes down to the developer spending more time on debugging. In fact, I think there's one study that says developers spend about 65% of their time debugging, and that's not really the best use of their time. This is where we can leverage AI and machine learning. What AI and machine learning can do is aggregate the issues that are found and surface them so that developers can go into the tool and see where exactly it went wrong. How does that work under the hood? Very often you have some type of AI framework or machine learning model that has studied the different types of issues that can arise with mobile applications. They study user flows, aggregate that data, and provide it within the analytics platform so that developers can really tap into it and understand where the crash happened, what the performance was over time, and whether there's any trend, like how the application differs between builds, networks, or geographies. That's precisely what some, but not all, real device clouds offer currently.
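As a rough illustration of the aggregation idea Simona describes (not any vendor's actual implementation), a few lines of scikit-learn can group similar crash messages so a developer reviews a handful of issue groups instead of thousands of raw logs; the crash strings below are made up:

```python
# Illustrative sketch only: cluster similar crash messages into issue groups.
# Real platforms use far richer signals; this just shows the grouping idea.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

crash_logs = [  # stand-in data; real input would come from device-cloud test runs
    "java.lang.NullPointerException at CheckoutActivity.onResume",
    "java.lang.NullPointerException at CheckoutActivity.onStart",
    "OutOfMemoryError while decoding product image bitmap",
    "OutOfMemoryError while decoding banner bitmap",
    "SocketTimeoutException calling /api/payment on 3G profile",
]

vectors = TfidfVectorizer().fit_transform(crash_logs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster, log in sorted(zip(labels, crash_logs)):
    print(cluster, log)
```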

[00:10:15] Joe Colantonio Absolutely. So it also sounds like this is built to help developers learn what they may have done and how they could have optimized their code, so that they can, not ignore, but avoid these issues in the future.

[00:10:26] Simona Domazetoska Yes, exactly. And that's what it's all about. It's about improving, right? And I think there are some cases where, obviously, when developers are writing code, and I'm pulling out these studies, Joe, as they come to me, there's a study that says that per 1,000 lines of code, a developer may produce around 70 bugs. I'm not trying to incriminate developers here; it's just that errors occur. Understanding how and why that happened is not always easy, because then you have to go back and fix that defect, and it takes about 30 times longer to analyze the code yourself than to have some type of analytics platform that can help you understand it intuitively.

[00:11:12] Joe Colantonio Absolutely. Also, I would think a lot of these things are hard to test for, like how do you test 3G versus 4G, or different performance scenarios based on what network you're on? So I guess a device cloud should have these types of features as well. Does it make it easy? Does it let you know what type of tests you need to run? Does it run them automatically for you? I guess the question is: how does a tester know what to test with mobile in a device cloud?

[00:11:36] Simona Domazetoska Yeah, I mean, it really depends on how you define your user stories and requirements and what's important to the business. I think that's a very separate conversation. But going back to your network question, how would you test the network? There are typically a few different ways. You can leverage a real device and test the actual mobile app on that device, which is physically sitting in that geography, using a certain mobile network provider and a certain mobile network speed. That's usually the more expensive option, but definitely the most accurate. You can also leverage a real device cloud to simulate the network. Typically, what happens there is that you have the real device connected to some type of Wi-Fi, which is connected to a router, and then you have some type of network shaping software attached to the router. Another option, and probably the most cost-effective and fastest way, is to leverage a virtual device. It's not a real device; it's a virtualized device, a copycat of the mobile device, on which you can simulate the network. But that's less ideal because you might not get realistic or accurate results. So those are some ways you can approach it.
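For the network piece, here is a small sketch of how a test might vary connectivity, assuming an existing Appium session on an Android device or emulator (see the earlier sketch); the emulator console commands apply only to emulators, and this is an illustration rather than a recommendation of any particular setup:

```python
# Illustrative sketch only: toggling network conditions for an Android test run.
# Assumes an already-created Appium `driver` session.
import subprocess

from appium.webdriver.connectiontype import ConnectionType


def use_data_only(driver):
    """Drop Wi-Fi and keep mobile data, roughly approximating a cellular user."""
    driver.set_network_connection(ConnectionType.DATA_ONLY)


def throttle_emulator_to_edge():
    """On an Android emulator, shape bandwidth and latency via the emulator console."""
    subprocess.run(["adb", "emu", "network", "speed", "edge"], check=True)
    subprocess.run(["adb", "emu", "network", "delay", "edge"], check=True)
```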

[00:12:56] Joe Colantonio Yes. I guess you get into that debate then: should I use a simulator or an emulator versus a real device? Are there any key differences in when you should use one over the other? I know you gave a few just now, but are there any others for someone who's trying to decide, okay, should I use an emulator, a simulator, or a real device? Or does it even matter nowadays?

[00:13:13] Simona Domazetoska Yeah, it's a really good question. And again, I would say it really depends on the organization, the team, and their budget. How mission-critical is the mobile app to the face of the business? If you're talking to an online fashion retailer that's trying to capture a large audience, the mobile app has to be friction-free. It has to be seamless and work top-notch, right? So I would say it's really important to combine both. You need to combine real device clouds and virtualized devices, otherwise called simulators and emulators. As for when to use which one for what scenario, a general rule of thumb is: if you're looking for a cost-effective, highly scalable solution, something that's really quick and easy to set up, and you just want to check the functionality of the app in the early stages of development, go for a virtual device cloud. But you don't see customers with virtualized devices in their pockets, right? You see them with actual devices. So if you want to test all those real user scenarios that we spoke about earlier, you need to leverage a real device cloud. Of course it's more costly, but in terms of the different scenarios you want to cover, you need to leverage it.

[00:14:30] Joe Colantonio Absolutely. So you mentioned one of the buzzwords of the year, AI, earlier, but we didn't really dive into it, so I want to pull it out now. I hear AI applied to a lot of things in automation. I'm just curious how you see trends in mobile testing as they apply to AI. Is there anything specific you see as a trend that could really help with mobile? You mentioned analyzing your data. Are there any other features or things it could help with?

[00:14:54] Simona Domazetoska Yeah, there are so many different things, and I'm sure you've heard some of these, right? You're in the testing space a lot yourself. Visual testing is really critical. That looks at finding cosmetic bugs between different versions of the application, and not just cosmetic bugs, but how the application looks. Is it easy to access, is it easy to navigate, is it accessible? Which leads me to the next topic, accessibility testing. Obviously, organizations want to cater to different audiences. This is a topic that's actually quite close to my heart because I used to work in the humanitarian sector. We tend to think, okay, this only applies to people who may be blind or have hearing difficulties, but actually anybody could experience some form of visual difficulty, right? You could be a bit older or have to wear glasses, or whatever. There are so many different complexities, so testing for accessibility is critical. But going back to AI, I think performance analysis is really critical. You can leverage machine learning to predict and diagnose performance issues across applications, that's one. You can do things like crash analysis, where machine learning is employed to look at vast amounts of data and identify patterns across different crashes. You can do trend analysis, as we mentioned earlier, where you're looking at different runs across different devices and network conditions. And machine learning is also used to detect glitches and anomalies in video and audio quality. So you would use machine learning systems to identify those issues and then feed that data back to the developer to identify what went wrong.
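As a toy version of the trend analysis mentioned above (real analytics platforms use far more sophisticated models), a simple z-score against historical builds can flag a launch-time regression; the builds and numbers below are invented:

```python
# Illustrative sketch only: flag a launch-time regression across builds with a
# plain z-score -- a stand-in for the ML-backed trend analysis discussed above.
import statistics

launch_times_ms = {  # median cold-start time per build (hypothetical data)
    "1.4.0": 820, "1.4.1": 845, "1.4.2": 810, "1.4.3": 830, "1.5.0": 1210,
}

history = list(launch_times_ms.values())[:-1]
latest_build, latest = list(launch_times_ms.items())[-1]

mean, stdev = statistics.mean(history), statistics.stdev(history)
z = (latest - mean) / stdev

if z > 3:
    print(f"Build {latest_build}: launch time {latest} ms looks like a regression (z={z:.1f})")
```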

[00:16:40] Joe Colantonio That's the question we get all the time: how do I test video and audio? So it sounds like with a mobile analytics tool that has embedded AI, you'd be able to bubble up those types of potential issues.

[00:16:51] Simona Domazetoska Yeah, correct. It is definitely possible to do that, right?

[00:16:55] Joe Colantonio Very cool. So you mentioned accessibility testing once again. Is there anything you see or know of that people can do to make their mobile apps more accessible before they actually go to production and they realize, oh boy, it's not accessible at all?

[00:17:10] Simona Domazetoska That's a really fantastic question. Typically, I find people are becoming a lot more aware of accessibility. What that means is that instead of waiting until the final version of the application is built and then being done with it, developers are taking those accessibility standards and putting them early into their software development process. That could mean things like embedding ARIA labels within the code. ARIA labels, as we know, are used by assistive technologies like screen readers to identify and call out the different objects on a page, like this is a table, scroll, enter, or whatever. So having that coding within the application is certainly critical. Not just that, but you can also do manual testing for accessibility, like looking at the application visually: do the colors make sense? What about people with color deficiencies? There are certain colors that are critical for the application that we may need to consider changing, if it's red, for instance, or green, and so forth. So you can implement all those user experience standards early on in the application. By the way, don't just take my word for it. The WCAG guidelines are available online, and there's a whole community of accessibility experts, standards, and guidelines that people wanting to learn more can adopt.
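On the automation side, a basic accessibility pass can be scripted as well. The sketch below, assuming an existing Appium session against an Android app, flags clickable elements that expose neither text nor a content description, the mobile counterpart of the ARIA labels mentioned above; it's a starting point, not a substitute for a full WCAG review:

```python
# Illustrative sketch only: find interactive Android elements that a screen
# reader would announce as blank. Assumes an existing Appium `driver` session.
from appium.webdriver.common.appiumby import AppiumBy


def find_unlabeled_controls(driver):
    """Return clickable elements with no content description and no visible text."""
    clickable = driver.find_elements(
        AppiumBy.ANDROID_UIAUTOMATOR, "new UiSelector().clickable(true)"
    )
    unlabeled = []
    for element in clickable:
        desc = element.get_attribute("content-desc") or ""
        text = element.get_attribute("text") or ""
        if not desc.strip() and not text.strip():
            unlabeled.append(element)
    return unlabeled
```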

[00:18:41] Joe Colantonio There's also usually a debate of: should I use native frameworks to test my app, or should I use a higher-level, all-purpose framework, something like Appium, or something else? Where do you stand on that, where do you see customers landing, or what would your approach typically be?

[00:18:57] Simona Domazetoska Yeah, it's a really philosophical question. It really depends on who is testing, again, and how it's being leveraged by the business. Say you're using a codeless test automation tool. Obviously, the advantage there is that different personas within a larger organization, lines-of-business personas that may not have scripting knowledge, can contribute towards testing. And it's not just about scripting knowledge, it's also about maintaining tests. As you know, the software updates. How do you maintain those tests in an easy fashion long term? Once you build a regression portfolio over time and you have thousands of lines of scripts, it can quickly become difficult to understand what went wrong. And not just that, but again, it comes down to the breadth of testing. Are you doing end-to-end testing, unit testing, or component testing? If you're doing a really complex end-to-end process that touches multiple apps, I don't think scripted approaches are best. But then again, if you want to do unit testing and check code, I certainly think that scripted tools are better suited for that type of testing.

[00:20:13] Joe Colantonio Nice. One of the things I always say is that one of the benefits of working for a vendor is you get to see a bunch of different scenarios and, I'm sure, a bunch of different customers. Have you noticed what the trend has been? Is there any trend in who does the mobile testing nowadays? Is it the developers? Is it the testers? Does it depend?

[00:20:30] Simona Domazetoska Yeah, it's a really interesting question as well. The pendulum swings between both, I would say. And again, it goes back to how testing is set up in the organization. Typically, what we find is that in larger organizations that have a centralized testing center of excellence or strategy in place, testing will be assigned to the testers, who will implement it. But in more decentralized organizations, where testing lies within the agile teams or within the lines of business, it could well be the responsibility of the developer working side by side with the tester. I do think that the analytics components we were speaking about earlier could be useful for both developers and testers. You want developers to be focusing on building the application, but they should absolutely be aware of its quality. That's where they can leverage those analytics platforms and collaborate with testers in real time to improve the way they code, and not just that, but really have insight into the application from a business standpoint.

[00:21:38] Joe Colantonio Great. So I also have a news show podcast, and I've noticed that Tricentis has been acquiring a lot of cool tech lately. A few years ago now, they acquired Testim, and I think recently they acquired a solution called Waldo. I'm just curious, how does that all work? How does that help you expand? I think you already have Tricentis Device Cloud. We've talked about all these pain points; maybe we could talk a little bit about a solution, like Tricentis Device Cloud, and how all these acquisitions are coming together to help testers with a lot of these really tough things that people are struggling with.

[00:22:12] Simona Domazetoska Okay. I can tell you that the strategy at Tricentis is to provide a really comprehensive mobile testing platform, and that looks at all aspects of mobile testing: not just testing on real devices, but also on virtual devices; not just low-code scripted approaches, but also no-code approaches. We're aiming to cater to different sizes and segments of companies. Obviously, we've been quite enterprise-driven, but we're also slowly focusing on the SMB space. So we're trying to build a platform that's very comprehensive. Why is that important? It goes back again to serving our customers, because really it's all about our customers. We find that customers tend to combine both real device testing and virtual testing for different use cases. If they want a more cost-effective, easy-to-scale solution, as we mentioned, they're more likely to use virtual testing. And if they want to test more deeply and get more accurate results, they'll combine that with real device testing. That's what we're aiming for: to really have a holistic strategy in place. And that's part of what we're doing with Waldo. So I hope that covers it.

[00:23:33] Joe Colantonio Yeah. You also mentioned virtual testing and the SMB space. What does that look like?

[00:23:37] Simona Domazetoska We find that small to midsize companies tend to be more budget-focused, so they might not have the resources in place to test across real device clouds, which tend to be a bit more costly. Having said that, they are also able to test across real device clouds, because there are different types of models in place. You could have access to dedicated devices through private clouds, otherwise called the single-tenant option. That tends to be a bit costly, but you can also leverage a shared device pool. Most real device providers out there offer a shared pool of devices that different organizations can access, and that tends to be the more cost-effective approach. So typically, for smaller and mid-sized companies, we do strongly recommend combining testing across virtual devices with real devices, particularly leveraging the shared pool.

[00:24:37] Joe Colantonio Nice. I just assume when we talk about mobile testing, a lot of people think about apps, but I know IoT has been out there for a while. Do you see people struggling with, okay, I have an IoT device, how do I test this thing? How do you handle IoT devices and that type of testing as well?

[00:24:52] Simona Domazetoska IoT is a very hot space, and it sometimes goes hand in hand with 5G technology. 5G networks are rolled out, and obviously we need to leverage the high-speed, low-latency advantages of 5G, and that goes hand in hand with IoT. If you look at, for instance, the automotive industry, sometimes you have smart vehicles or this kind of in-car experience, and you might need to test how mobile interacts with these devices and other hardware systems to tap into IoT. In manufacturing, you have smart factories, right? 5G supports IoT, looking at machinery and tracking how devices are communicating in real time. You could even have agricultural use cases, precision farming, where you have drones and machinery all communicating together to understand fields and crops in real time. Personally, I find this a really fascinating space. There's just an endless number of use cases, right? Gaming, retail; if you look at retail, nowadays you don't even have to go to a store. You can try on clothes online using augmented reality applications, which leverage 5G networks. The use cases are enormous, and I think some of the top, more innovative companies are certainly adopting IoT and testing across different devices and networks.

[00:26:21] Joe Colantonio AR/VR, you just mentioned it with trying on clothes. Is there anything yet for running AR/VR in the cloud on real devices, or is that still something that may be a gap?

[00:26:34] Simona Domazetoska Yeah, that's a really good question. I think maybe there's something cooking in the kitchen.

[00:26:40] Joe Colantonio Nice. Cool, cool, cool.

[00:26:42] Simona Domazetoska But yeah, I would be happy to hear what you think about this, Joe.

[00:26:47] Joe Colantonio I think it'd be great. I actually talk to a lot of developers. They have kids and they have their kids test for them. So having a more scalable solution I think would be very, very helpful.

[00:26:58] Simona Domazetoska Definitely. It's a very exciting field.

[00:27:00] Joe Colantonio Absolutely.

[00:27:01] Simona Domazetoska So something that we're looking into.

[00:27:03] Joe Colantonio Awesome. Okay, Simona, before we go, is there one piece of actionable advice you can give to someone to help them with their mobile automation testing or device cloud testing efforts? And what's the best way to find or contact you, or learn more about the Tricentis solutions that actually help in the mobile testing space?

[00:27:20] Simona Domazetoska Sure. I would say put the customer at the center of what you do and really understand what the customer is doing and what they're trying to accomplish. And then don't just test; analyze, analyze, analyze, and really understand what's going wrong. That also requires communication. We shouldn't forget the human component of testing, and not just testing, but developing and beyond. And you know yourself, right, we have AI and machine learning, it's such a buzzword, you have to talk about it, but we shouldn't forget the human component of what we do. I think that's really critical. In terms of how you can reach out to us: you can head over to our website, where you can actually try Tricentis Device Cloud, we have a free trial available, and obviously you can contact us directly through the website. If you want to learn more, we can put you in contact with our product experts to provide demos.

[00:28:15] Thanks again for your automation awesomeness. Links to everything we covered in this episode are at testguild.com/a463. And if the show has helped you in any way, why not rate and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:28:51] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over to TestGuild.info and let's make it happen.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
Nicola Lindgren Vernon Richards TestGuild Automation Feature

The Software Tester’s Journey with Nicola Lindgren and Vernon Richards

Posted on 12/22/2024

About This Episode: Today, we dive deep into how to advance your career ...

Alex Kearns TestGuild DevOps Toolchain

Leveraging GenAI to Accelerate Cloud Migration with Alex Kearns

Posted on 12/18/2024

About this DevOps Toolchain Episode: Today, we're diving deep into how you can ...

Three people are pictured on a graphic titled "AI Secrets You Should Know." Set against a striking red background, the image features the ZAPTALK logo in the top left corner, highlighting discussions on AI and automation.

The Secret to Embracing AI and Automation (ZAPTALK EP 02)

Posted on 12/17/2024

About Episode Join Alex (ZAP) Chernyak, Joe Colantonio, and David Moses in episode ...