About this DevOps Toolchain Episode:
Today, a special guest, Danny Lagomarsino from SmartBear, is joining us for an insightful discussion all about Breaking Down DevOps Testing Silos.
Streamline DevOps w/SmartBear's TestHub https://testguild.me/testhub
In this episode, Danny explores the transformative power of AI in software testing, mainly through SmartBear's innovative solutions like TestComplete. He shares fascinating success stories about using AI for object identification in complex enterprise software and the benefits of integrating various tools like Zephyr Enterprise, Bitbar, and LoadNinja into a unified testing hub.
We'll also explore how breaking down siloed teams and fostering collaboration can enhance productivity and alignment. Danny emphasizes the value of solid team foundations, continuous learning, and open communication in achieving organizational goals.
Plus, we'll touch on SmartBear's exciting AI roadmap and new tools in the pipeline.
Whether you're a seasoned DevOps professional or just starting out, this episode is a goldmine of practical insights and strategies. These are not just theoretical concepts, but actionable steps that can help streamline your testing efforts and enhance your DevOps toolchain.
Tune in and join the conversation as we aim to bridge gaps and drive efficiency in the ever-evolving world of DevOps and testing.
TestGuild DevOps Toolchain Exclusive Sponsor
If you've ever felt the frustration of siloed testing teams and the chaos of misaligned goals, then you know how crucial it is to have a centralized hub for all your testing needs. That's where SmartBear's Test Hub comes in.
Test Hub isn't just another tool; it’s a comprehensive solution designed to bring your testing processes together. Imagine having all your test management, execution, and reporting functionalities integrated seamlessly. It’s not about adding another tool to your belt; it's about simplifying your workflow, improving communication, and driving your projects forward with greater efficiency.
What’s more, Test Hub's intuitive interface and robust features make it easier for teams to collaborate and stay aligned, no matter where they are. So whether you're just starting out with automation or looking to streamline your existing processes, Test Hub has got you covered.
Curious to learn more? Support the show and head over to https://testguild.me/testhub to see how you can transform your testing practices today.
About Danny Lagomarsino
Dan is an experienced solutions engineer who specializes in software test management and automation solutions. He has been helping SmartBear customers adopt and accelerate their testing practices for nearly 5 years. Dan's current focus is on leading solutions efforts for SmartBear's Test Hub and creating stronger customer relationships.
Connect with Danny Lagomarsino
- Company: SmartBear
- LinkedIn: www.linkedin.com/in/daniel-lagomarsino-8b1443192
Rate and Review TestGuild DevOps Toolchain Podcast
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:17] Joe Colantonio Get ready to discover some of the most actionable DevOps techniques and tooling, including performance and reliability for some of the world's smartest engineers. Hey, I'm Joe Colantonio, host of the DevOps Toolchain Podcast and my goal is to help you create DevOps toolchain awesomeness.
[00:00:17] Hey, are communication barriers and misalignment wreaking havoc on your DevOps testing teams? In this episode, we're going to uncover the secrets to achieving greater organizational alignment with Dan, a seasoned solutions engineer from SmartBear. From overcoming obstacles to leveraging AI in testing, discover how clear communication, shared objectives, and innovative tools can transform your DevOps practice. Stay tuned for insights that could turn your team into champions for change. Check it out.
[00:00:49] But before we get into it, have you ever felt the frustration of siloed testing teams and the chaos of misaligned goals? Then you know how crucial it is to have a centralized hub for all your DevOps testing needs. And that's where SmartBear's Test Hub comes in. Test Hub isn't just another tool. It's a comprehensive solution designed to bring your testing processes together. Imagine having all your test management, execution, and reporting functionalities integrated seamlessly. It's not about adding another tool to your belt. It's about simplifying your workflow, improving communication, and driving your projects forward with greater efficiency. What's more, Test Hub's intuitive interface and robust features make it easier for teams to collaborate and stay aligned, no matter where they are. So whether you're just starting out with automation or looking to streamline your existing processes, Test Hub has got you covered. Curious to learn more? Support the show and head on over to testguild.me/testhub to see how you can transform your DevOps testing practice today. Check it out!
[00:01:54] Hey, Dan, welcome to The Guild.
[00:01:56] Daniel Lagomarsino Hey, Joe. Good to see you. Thanks for having me.
[00:02:01] Joe Colantonio Absolutely. Really great to have you on. One thing I hear about a lot from people on my podcast is dealing with silos, still, even though a lot of teams are agile with communication and things like that. So I thought this would be a great topic to go over. One of the things I have in my notes, and one of the biggest challenges you've seen, and I think I've seen as well, is teams trying to achieve greater alignment across their organizations but finding it hard to do. How does one overcome that, do you think?
[00:02:27] Daniel Lagomarsino I'm going to start by talking about breaking down communication barriers. I think that's something that matters not just for aligning larger teams, but also for small teams. To be honest, I think breaking down communication barriers is going to be a theme of this one.
[00:02:42] Joe Colantonio All right. Nice.
[00:02:44] Daniel Lagomarsino But yeah, when we talk about greater alignment, without a shared vision and shared objectives, teams aren't going to prioritize their tasks properly. If everyone can get on the same page, then anything we need to do, for example, allocating resources or getting management support involved, really stems from breaking down communication barriers: talking to not only your team, but also the other teams out there, and making sure that even inside of your own team, you're not siloed. Because if you have silos within silos, then everyone's just testing with blinders on. And that's not a great thing.
[00:03:22] Joe Colantonio Absolutely. Yeah, I've seen organizations where it's almost like a mini waterfall. They're agile, yet they're mini waterfall, and they don't talk to other teams, and I don't know why that is. It's interesting. I used to work in a company where one team would sit across from another team, and they wouldn't communicate with one another. So, curious to know, maybe it's a culture thing, how do you get the culture to embrace a shared vision rather than it just being a buzzword?
[00:03:47] Daniel Lagomarsino Yeah, it certainly can be a culture thing, and it can be difficult. If you're sharing that information, maybe set up some clear communication channels, maybe a centralized knowledge base or a database where anyone can come in, look for that information, and ask questions. Encourage people to ask questions. One of the things we do here at SmartBear a lot is we say, ask questions. There is no such thing as a bad question. That's what I do all day as part of my job: I answer your questions, so people are not afraid to ask. Part of that is talking to people, making sure that everyone understands what's going on, and also making sure that the processes are the same. You can look at other teams and ask them what they're doing, and then maybe try to align with what they're doing. How does that fit in with what my team is doing? Or what can I do to make sure that alignment between teams is a little bit better?
[00:04:39] Joe Colantonio Once again, going to my personal experience: we were creating a piece of software that relied on another team creating third-party software that we would then consume. And then we came into these discussions like, what do we test? It's already been tested. Are we just testing the integration? Are we testing the functionality of the product that's already been tested? Is that part of the process of saying, okay, what are the guardrails we have, or what are our responsibilities for consuming this third-party in-house app that was supposedly already tested?
[00:05:07] Daniel Lagomarsino Yeah. And I think even if you have a couple of teams that are testing maybe on different apps for different projects, you still want to be able to communicate with them to make sure that if there is an integration between the two applications, that integration is being tested properly, because without that, some bugs could get through. Obviously, bugs cost teams money, and you're losing the idea that you're going to be able to release this really great software. For me, when I think about releasing software, if my teams are not aligned and I don't know what's going on with those other teams, I get a little bit worried when that release comes around, because I may know what's going to happen on my side, but I may not know what's happening on another team. Again, back to communication, back to processes.
[00:05:52] Joe Colantonio I've also seen that it's hard for executives to know what the heck's going on because each team really isn't in sync with one another. When the release is ready to go, they don't necessarily know the more holistic picture of what's going on. Do you see that as an issue as well?
[00:06:07] Daniel Lagomarsino Yeah, I do. Some of the things that I've seen are maybe executive leadership having one way that they want things to go, and those processes or those goals maybe aren't shared entirely across teams. It's really important for not only the executive team but management and individual contributors to all be on the same page here. It helps standardize those processes too, which, as an individual contributor, you can sort of establish yourself and bring to your management. You can bring that to other teams. You don't have to always wait on someone at the top level making those decisions. You can make those suggestions and see if everything aligns.
[00:06:48] Joe Colantonio I love that point. A lot of times I've seen the greatest impact on an organization coming from the bottom up rather than the top down. So that's a good, good point.
[00:06:56] Daniel Lagomarsino Yeah. Because when you think about it, they're the ones that are doing the testing. They're the ones that are developing the software. They're the ones that get to see everything. When you look at a management team or executive team, they may be looking at reports, and the reports are maybe just numbers. It doesn't tell the whole story.
[00:07:10] Joe Colantonio Right, right. I guess this is tough: how do you get testers to care about what other teams are testing? I'm getting graded, so to speak, when it comes up to promotions and everything, on how I did on my application. Why would I care about how the team down the line is testing?
[00:07:28] Daniel Lagomarsino Yeah. Buy-in is really important when you're trying to align. There are a few ways you could do it. I think what I talk about a lot with my team here is continuous learning, not only about your toolset, but about the tools and the other teams that you're working with or that you know are available out there. Because if you can learn, let's say, about a certain toolset within your team, and that toolset is something that, as you're scaling, you can scale into other teams, it's a really good thing to bring to the table. Finding that common ground, right? Finding that thing that all of these teams can share and then build off. Because the last thing you want to do is say, my team does this better than your team, and then you're creating a competitive environment. That very rarely goes well.
[00:08:11] Joe Colantonio That's true. But I've seen cases where you learn what the other team's doing and you're like, oh, I'm actually doing that, and you have like five different teams trying to solve the same problem, and one testing team probably already has a really good solution for testing something. If you're in a siloed testing team, is one of the drawbacks of not communicating that you may be reinventing a wheel another testing team has already solved for what you're trying to test right now?
[00:08:35] Daniel Lagomarsino Yeah, yeah, I can definitely see that. You always want to say, what I'm doing is the right way to do it. But again, it really comes down to building that strong foundation between not only your team but the other teams and trying to get buy-in. It's not easy. It's probably one of the larger challenges when you're trying to align everyone: just getting everyone on the same page and making sure everyone understands that maybe what you're doing is great, but maybe we can improve it. Maybe there's a way to be more efficient. And that is through using the same tools, using the same processes, or even having this knowledge base that everyone can access.
[00:09:12] Joe Colantonio Nice. You work for a company that speaks with a lot of its clients, across a lot of different industries and verticals, so you've probably seen it all. Do you have any examples of maybe how a project suffered due to siloed testing teams, or how improving the alignment of the teams helped the product, or anything like that?
[00:09:31] Daniel Lagomarsino Yeah, I think so, and without naming names. At SmartBear, we work with, as you mentioned, a number of different teams, a number of different verticals, and we work with large companies and smaller companies. With some of the larger companies that we deal with, I will work with maybe 1 or 2 teams that are focused on automation and testing, for example, and show them the tools that we have at SmartBear. They would then take those tools and implement them. But the issue they would have is that maybe they have other testing teams out there that aren't using the same products, aren't using the same tools. Nothing's standardized. And that's really what I hear a lot: we don't have any standardization inside of our organization, we have X amount of teams, and they're all kind of doing their own thing. But if we can get one team on board with one of your tools and it's successful, we can expand it. With larger organizations, it's always: we'd like to start small with one team and then expand out. I think that's a really good idea for larger teams, and it's something I talk about with these teams quite a bit. I think it's the way to do it, because let's say you do have a large number of teams: it goes from being able to handle teaching one team these processes and these tools, to standing in larger rooms where you're lecturing everyone, and that never really works out as well as you think.
[00:10:55] Joe Colantonio How do you get buy-in then for people to adopt more standards? Do you do a proof of concept? Say, hey, we did this for this team, and they're now able to do this quicker or faster. What is the benefit there, I guess?
[00:11:07] Daniel Lagomarsino Yeah. During POCs, for example, we will talk to teams, and depending on the industry, I can bring in certain examples of other organizations that are using our tools to speed up testing. Typically what I'll talk to teams about, especially if they're manual testers going into automation, is I will ask them: how long does it take for you to manually test? How many tests are you running? And then I sort of transition that and show them how automation and standardization can speed up testing. The idea is that I can simply say, if your testing takes three days manually, what if you use an automation tool and I can show you that it's going to take six hours? Those are statistics. Those are things that people want to see, and those are the aha moments when talking to teams like that.
[00:11:54] Joe Colantonio Awesome. This is the DevOps Toolchain Podcast, and I'm just curious to know: eventually people need to run these in a pipeline, check in and check out. DevOps is probably yet another siloed team. Do you see DevOps as more of a culture thing? Or is it actually a separate team that all these testing teams need to work with to get their testing integrated into the pipeline in a way that's not going to break it for all the other teams using that same pipeline?
[00:12:21] Daniel Lagomarsino I've seen a little of both. I've seen teams that just create their tests and hand them off to the dev teams to run in the pipeline, and I've seen teams where DevOps is involved in testing. But for the most part, I would say DevOps is definitely separate from testing teams. It's just the way the culture is going, or the way it's been. But now, with the tools available for teams becoming easier to use and easier to implement, more DevOps teams are getting involved in the testing side, which is really great. Because think about it: you're running your pipelines, you're getting your results. Where do those results go? They go back to the DevOps team. And what do they do with them? If they don't communicate with the testing team and say, hey, there are some errors here, how do you know? And that's really where standardization, aligning on the same tools and resources, would help.
[00:13:13] Joe Colantonio Do you see the DevOps organization or team being an accelerant to say, hey, here's what we've seen work for other teams. Maybe you should try this approach as well?
[00:13:21] Daniel Lagomarsino Yeah, they certainly could be sort of a champion in those situations, especially when you're dealing with teams that maybe are reluctant to change, reluctant to try new things.
[00:13:30] Joe Colantonio That's a good point. I see champions as a way, if you feel siloed in a large team, to get more of the teams organized: having a champion saying, hey, we should do this based on what I've seen here.
[00:13:43] Daniel Lagomarsino Yeah. And the great thing is, as I mentioned before, it doesn't have to be management. It doesn't have to be leadership. It can just be someone on the team who has a really great idea, has maybe used this tool in the past, and wants to bring it into what they're doing now.
[00:13:55] Joe Colantonio Love it, love it. I guess once someone has some sort of process in place, a lot of times people only work for the sprint, just to get it out the door. How do you help teams organize to plan not only for now, but also for the future?
[00:14:12] Daniel Lagomarsino That's a really good question. Some of it's going to be clarifying, when those sprints are happening, the roles and responsibilities of everyone. There is this idea that some teams like to have everyone on the team specialize in something, but with cross-training, you can break down those barriers and make sure that, as you're running those sprints, everyone is on the same page because everyone understands what everyone else is doing. It's sort of a cross-testing or cross-training situation. We're not expecting QA to go out and actually fix the bugs in the application, but we are expecting QA to be involved in finding the bugs and relaying that to the developers, and vice versa: we want the developers to understand that they need to let the QA team know what's going on with those bugs. Maybe without the proper tools and responsibilities, that may not happen. Making sure everyone knows what their roles are during the sprint, or even after, is going to make things a lot easier for everyone. And then, if you're integrating or making some changes, you can do it from sprint to sprint, incrementally, so teams aren't overwhelmed when they're being asked to change processes, change roles, or even align with other organizations.
[00:15:30] Joe Colantonio I love setting up responsibilities. I've worked with a company that had 89 sprint teams, and if something failed a test, I didn't know who was responsible for triaging it. It's almost like, if you have that in place to begin with, it would go a long way toward helping break down these silos, I would think.
[00:15:47] Daniel Lagomarsino Yeah, absolutely. And then, obviously, with siloed teams, you could be dealing with some resource strain too. Let's say you're trying to buy software and you need to know how many teams need the software, but maybe you don't know what these other teams are up to. You go ahead and buy the software with X amount of licenses, and then all of a sudden you realize, oh, we don't have enough licenses, but our budget's blown. Now what do we do?
[00:16:11] Joe Colantonio How do you overcome that? I could see that happening a lot. Someone buys a product, tries it out, says, hey, this is great, and all of a sudden, oh, we need more licenses, and it starts adding up. How do you get past that point, though, if they don't have a budget?
[00:16:25] Daniel Lagomarsino I feel like that's a sales question.
[00:16:26] Joe Colantonio Yeah. Yeah, for sure.
[00:16:28] Daniel Lagomarsino That's a really good question. Especially with today's economic headwinds, I think it's a little bit harder to pull more money out of budgets and ask for more money. But it's always worth the ask. It's always worth putting in that request. And that's the nice thing about being a champion: if you're showing success, you're saying, yeah, we have these licenses, everything's going great, we can expand to these other teams and show efficiency, show improvement, but it's going to cost us a little more money. And the ROI is there in automation. The more you automate, the more return on your investment.
[00:17:00] Joe Colantonio Here's an older debate; I don't know how you feel about it. I love vendor tools. I hate reinventing the wheel. But I've seen organizations say, oh, we just do open source, and there's no cost with open source, that's kind of what they say, and obviously there's a cost to everything. Do you see one or the other, or do you see vendors working with open source? How do people view vendor-based solutions now? And do they actually see open source as having a hidden cost associated with it?
[00:17:28] Daniel Lagomarsino Sure. Yeah. Open source is great. It does have its benefits. But I think one of the drawbacks of open source is training. Let's say, for example, you're using Selenium, and not everyone knows how to use Selenium. But an automation tool, low-code or no-code, is easy enough that anyone can pick it up. And here at SmartBear, we are open source champions. We do have several open source tools. But I think for the most part, a lot of teams that I'm talking to, let's say they're coming into automation, may have tried the open source route, and for whatever reason it didn't work. It could be they don't have the technical personas to handle writing that open source code, for example, or whoever was creating those tests left, and they don't have anyone else who can come in and edit those test cases.
[00:18:13] Joe Colantonio Once again, great point. I guess this maybe applies more to enterprises, where they have a lot of enterprise software that's hard to automate, and they may say, we'll use open source, and not realize this other tool may have been a better fit, because you still need to test desktop apps or SAP apps or even mainframes, like when I worked in insurance. That's another point to take into consideration.
[00:18:37] Daniel Lagomarsino Absolutely.
[00:18:39] Joe Colantonio I guess, is team siloing normal then? Obviously, to some degree, you're going to be siloed. So how do you make sure that even if a team is siloed, in the sense that they're working on their own thing, they are still able to be part of the bigger, larger vision?
[00:18:54] Daniel Lagomarsino It's a really good question. Obviously, you need to build those strong foundations on those teams and promote that knowledge sharing. But at the same time, you almost want to preserve team identity, and team identity is a lot different than being siloed. Each team has its own culture, and everyone works differently. For example, my team has been together for, I think, at least 3 to 4 years. Our culture, our identity, is very strong. We work very well together, and because of that, we're able to go out to other teams and tell them how we're doing, how we get together, and how we work well together. Really, it comes down to regular feedback. When you're dealing with alignment and trying to balance that specialization, you want to promote cross-team collaboration as often as possible. It could be something as simple as saying, you know what, we're going to have a meeting with these other teams and just talk. It doesn't necessarily have to be about the business. It can be anything and everything, and building those relationships is really going to help you expand alignment and make sure everyone's on the same page. Because honestly, if everyone likes each other, everyone's going to work better together.
[00:20:05] Joe Colantonio Yeah. I was going to ask how you actually do that, because you have a model that works. How do you get that message out? Do you just set up meetings where the teams say, this is what's working for us, we want to see what's working for you? Is it an exchange, or do you have something more formal, like after a big release you have all the teams get together and see what worked and what didn't work?
[00:20:21] Daniel Lagomarsino Yeah. Typically, those start with my team asking a lot of questions. We want to know what you're up to, how many people you have testing, for example, what tools you're using. And really the big question I ask a lot of teams is: how is everything working out for you? And if you were able to do something better, what would that be? I do a lot of software demos, and when I talk to teams, I like to say, tell me about your present state. And then when they tell me about it, I ask them, what would your future state look like? I would caveat that with: in a perfect world, what's your perfect future state, knowing that it's probably not attainable? Just tell me where you'd like to be. And they would talk to me about that and say, we'd like to be fully automated. Some would say 90%, though 80% is really where you want to be. But it's nice to just listen to them, and when they have these ideas, or these guidelines they need to follow, we can sit down and tell them, yeah, there's success there or there's not, or: have you tried this, have you tried that? It's always evolving. It's very dynamic, and it really depends on the team we're talking to, the industry, and what they're looking to get out of, for example, automation.
[00:21:33] Joe Colantonio This may be a bad practice now, but when I started, a lot of teams had centers of excellence.
[00:21:39] Daniel Lagomarsino Yes.
[00:21:40] Joe Colantonio A COE team that sat on top and was able to do all this and spread it, and have champions going to each of the sprint teams. I don't think it's in favor anymore now. Maybe I'm wrong; maybe it's back. Do you see that as a bad practice now? Maybe it's just bias from the teams I work with, but I don't see COEs as much as we used to, and I thought there were a lot of benefits to having a good COE in place.
[00:22:04] Daniel Lagomarsino Yeah, it's actually funny you mention that, because I was thinking about that this morning: I haven't actually talked to anyone in, I would say, 12 to 18 months about their centers of excellence. It just doesn't seem to come up anymore. I don't know if they don't exist anymore or if they're lying low, but like I said, 12 to 18 months ago, it was something that we would focus on. We'd ask a team: Joe, tell me about your center of excellence. And for the most part, a lot of teams would say, it's small, we're just starting out, or we don't have one. I think for the teams that I've been talking to, it may be something new to them. For example, when trying to align all these teams, there's so much that's already established that introducing new ideas may take a little bit of time.
[00:22:49] Joe Colantonio I guess another point of view I might have is, I speak to a lot of testers who are experts, so I assume every company has their stuff together and has automation in place. But since you speak with a lot of different companies and organizations, is that true? Is automation now the standard, or is it still something you need to teach before you can get to a center of excellence, because sometimes you need to start at square one?
[00:23:12] Daniel Lagomarsino Yeah, really good question. I don't know if I have the right answer for that. I do think that a lot of teams I talk to are new to automation, especially with the tools that I support. But obviously, I've talked to teams that are fully into automation and maybe using other tools; they want to see what else is out there on the market. I'm not really sure. Again, as I mentioned, it may be the size of the team, the size of the organization, and what resources they have available to build out a COE.
[00:23:39] Joe Colantonio All right. Is AI going to fix everything? I have a news show, and I saw SmartBear released something called HaloAI. Could that help break down silos? Does it improve testing? How does that all work?
[00:23:51] Daniel Lagomarsino The thing about AI, we talk about this a lot at SmartBear, and what we talk about is our HaloAI. It's really our underlying AI capability. It's not meant to replace testers; it's meant to, and here's the buzzword, empower teams. That's really what we're doing with AI. We're injecting it into our products to help developers and testers build out tests that are going to be more resilient, less flaky, and a lot easier to not only execute but also, for example, identify objects. We do have what we call our Test Hub right now, which is a group of our automated testing tools into which we've injected HaloAI. So we can do things like simpler object identification and detection. One of the more difficult things I talk to teams about is object identification in automation: how is the tool identifying objects, and what if the object fails? What if we can't find it? HaloAI can help with that, because it can make it a little bit easier to identify objects. We also have advanced visual regression testing that mimics the human eye to make sure we're eliminating false positives. And we have a few other things coming too. It's not something where we just made up a word and said, yep, everything's AI, we're done. We're continuing to build out AI capabilities within our tools, and we're honestly really excited about it.
[00:25:19] Joe Colantonio Yeah, very cool. Do you have any success stories of how AI-augmented capabilities helped a testing team? Once again, without naming names or companies or anything like that.
[00:25:28] Daniel Lagomarsino Yeah, I can think of one or two. Dealing with larger corporations that are using enterprise software like SAP or Oracle, for example, we found some challenges identifying objects within those applications. Some of them are flaky, some of the objects are dynamic, and they just change a lot, and it's a little bit difficult. Some of our AI capabilities within, say, TestComplete have been able to help identify those objects. If they change, the tool is able to notify the tester that the object under test has changed its identifiable properties and ask whether they want the tool to fix those changes. What we've seen there is that when teams are testing, they're not running into these roadblocks of objects changing when the ERP vendor updates the software and changes everything. The really nice thing is that we're using AI to help us not only identify the objects, but also let us know if something is changing, or if something is taking a little bit longer to load or identify as well.
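The self-healing behavior Danny describes can be sketched as property scoring: compare the recorded properties of an object against what's currently on screen and, when the exact match is gone, propose the closest candidate as a suggested fix. The property names, weights, and threshold below are illustrative assumptions, not TestComplete's actual algorithm:

```python
# Hypothetical weights: a stable id counts for more than a class or label.
PROPERTY_WEIGHTS = {"id": 3.0, "name": 2.0, "class": 1.0, "text": 1.0}

def score(recorded, candidate):
    """Weighted fraction of recorded properties the candidate still matches."""
    total = sum(PROPERTY_WEIGHTS.get(k, 0.5) for k in recorded)
    matched = sum(
        PROPERTY_WEIGHTS.get(k, 0.5)
        for k, v in recorded.items()
        if candidate.get(k) == v
    )
    return matched / total if total else 0.0

def find_object(recorded, on_screen, heal_threshold=0.5):
    """Return (object, healed) or (None, False) if nothing is close enough."""
    best = max(on_screen, key=lambda c: score(recorded, c), default=None)
    if best is None:
        return None, False
    s = score(recorded, best)
    if s == 1.0:
        return best, False   # exact match, nothing changed
    if s >= heal_threshold:
        return best, True    # properties drifted: notify the tester
    return None, False       # too different to trust
```

When `healed` is True, a real tool would surface a prompt like "the object's properties changed, accept the fix?", which is the notification step Danny mentions.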
[00:26:25] Joe Colantonio I must sound like a grumpy old man once again. When I started off with vendor tools, everything was tightly coupled. I knew if I wrote this test, it mapped to this requirement, and in the end I had a nice report with everything in one place. It was beautiful. Now with open source, it's like copying and pasting and trying to get things to talk to one another, and I think a lot of companies miss out on having that kind of centralized hub. Do you have anything similar to that? Once again, I've been following SmartBear forever, and I saw something about Test Hub. Is that a tool that can help manage, automate, and execute all your tests in one place, to make it easier for teams to actually have insight into what's going on?
[00:27:05] Daniel Lagomarsino Yeah. Test Hub is actually a handful of our SmartBear products. We have an automated functional testing tool called TestComplete, and we have a group of tools that integrate with it. For example, Zephyr Enterprise, which is for test management; BitBar, which is our cloud-based device farm; LoadNinja, which is our UI performance testing tool; and finally VisualTest, which is our automated visual testing tool. What we're doing here at SmartBear is incorporating all of these integrations, building them out, and also adding the AI side, to make sure everyone is able to use the same set of tools. Typically with vendor solutions there's an automated testing tool, but if you want a test management tool, that's separate and you need to talk to someone else about it. Now we're able to present these tools as one unified solution. That way, if you do need to manage, automate, and execute all your tests in one place, there's a really easy way to do that. With TestComplete you're creating your test cases, say against a mobile device, and running them through BitBar. With TestComplete you're making sure your application works; with BitBar, you're making sure it works across all devices. With VisualTest you're making sure your application looks good. And of course, using a tool like LoadNinja, you're making sure your application is performing properly. All of these things can now be done from one hub, centered on TestComplete, which is really great. And really what that's doing is getting rid of blind spots. We're making sure we're covering all the types of testing we need, and we're able to pinpoint problems easily, which is going to let us scale more easily. And then, of course, it comes back to confidence in your releases.
You don't want to be scared, when you go to release the software, that it's not going to work or that there are bugs. You want to make sure your testing is where it needs to be and the application is the best it can possibly be. And then we're fortifying it with AI to make sure we're streamlining those processes and eliminating the error-prone steps that would take up time if we were doing them manually. We've got a lot going on here at SmartBear.
[00:29:12] Joe Colantonio Another great point: maybe a team is siloed and they're doing functional testing, and now they have a new story that has a performance criterion, and it's like, now what? But you mentioned Test Hub has tooling like LoadNinja to help with that. So would that help siloed teams break down, instead of having a separate team that just does performance testing, and open it up to more testers across the organization?
[00:29:36] Daniel Lagomarsino Yeah, I think it could, especially when you think about managing test cases. Take a tool like Zephyr, which is going to store the test case results. You can have multiple teams using Zephyr so everyone can see what everyone else is doing in their releases, which is really nice. Let's say those teams are working on separate applications, but maybe there's an integration between them, and if you're on one team, you're not really sure what's going on with the other team. If you're too afraid to talk to them, maybe embarrassed, you can always use the test management tool to go in there and get the information you need. It's really nice, even with siloed teams, to get them on the same toolset. They don't necessarily need to use all of the tools, but if one or two people on each team use some of them and communicate back and forth, that's going to create more alignment across those teams.
[00:30:26] Joe Colantonio You mentioned before that you're doing a lot with AI now, and you said there's even more, but you didn't give us anything. Could you give us a little red meat, a little teaser of what's on the roadmap, what you see coming?
[00:30:36] Daniel Lagomarsino Yeah.
[00:30:37] Joe Colantonio Possibly.
[00:30:37] Daniel Lagomarsino Sure.
[00:30:38] Joe Colantonio No promises, but.
[00:30:40] Daniel Lagomarsino Yeah, absolutely. Let's take a tool like TestComplete, for example. What we're about to launch in beta for TestComplete is AI test data generation. If you have a requirement to drive data through your application and maybe you don't want to use a database table or an Excel spreadsheet, you can use the AI capabilities inside of TestComplete to simply say, fill out these forms with realistic values, and TestComplete will read that and fill out those forms as needed. That's just one little thing. We also have some AI-powered API contract testing coming, which is going to help users deliver more reliable APIs. And then there's one of our other tools called Reflect, a true codeless test automation tool. With Reflect, we can leverage AI not only to identify objects but also, because it integrates with OpenAI, to use AI prompts to essentially ask Reflect to do certain things based on free-form English.
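The test data generation idea Danny describes, "fill out these forms with realistic values", can be sketched as a data-driven loop over generated records. In the beta feature an AI model would produce the values; here a simple deterministic stand-in plays that role so the sketch is runnable. The field names and generator logic are hypothetical, not a real TestComplete API:

```python
import random

def generate_value(field, rng):
    """Produce a plausible value for a named form field (AI stand-in)."""
    first_names = ["Ava", "Liam", "Noah", "Mia"]
    last_names = ["Garcia", "Chen", "Patel", "Smith"]
    if field == "first_name":
        return rng.choice(first_names)
    if field == "last_name":
        return rng.choice(last_names)
    if field == "email":
        return f"user{rng.randint(1000, 9999)}@example.com"
    if field == "zip":
        return f"{rng.randint(0, 99999):05d}"
    return "N/A"  # unknown fields get a placeholder

def generate_rows(fields, count, seed=0):
    """Generate `count` realistic-looking records to drive a form-fill test."""
    rng = random.Random(seed)  # seeded so runs are reproducible
    return [{f: generate_value(f, rng) for f in fields} for _ in range(count)]
```

Each generated row would then be fed into the form-fill step of an automated test, the same place a database table or spreadsheet row would normally go.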
[00:31:48] Joe Colantonio Very cool. Okay, Dan, before we go, is there one piece of actionable advice you can give to someone to help them with their siloed testing DevOps efforts? And what's the best way to find or contact you?
[00:31:58] Daniel Lagomarsino The best advice I can give them goes back to the beginning: communicate, talk. You hear this not only in siloed testing; you hear this everywhere. Communication is the key to any successful relationship, whether we're managing our own relationships, customer relationships, or relationships within the organization. Talk, understand what everyone else is doing, and be empathetic. We talk about being empathetic in software testing quite a bit, especially here at SmartBear. Understand what the other teams are doing and what your own team is doing, and find a way to bridge that gap. Figure out what works and what doesn't, and if you can, try to come up with a plan. As far as reaching out to me, I can be found on LinkedIn or through SmartBear. Those are probably the only two; LinkedIn is one of the only social media tools I use these days.
[00:32:49] And for links to everything of value we covered in this DevOps Toolchain show, head on over to TestGuild.com/p157. That's it for this episode of the DevOps Toolchain show. I'm Joe, and my mission is to help you succeed in creating end-to-end, full-stack DevOps toolchain awesomeness. As always, test everything and keep the good. Cheers.
[00:33:11] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve skills or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.