Redefining Test Automation with Dave Piacente

By Test Guild

About This Episode:

In this episode, Dave Piacente, a senior community manager in developer relations and community expert at Applitools, joins us to talk about redefining test automation.

There is a common set of techniques that seasoned test automation practitioners know to be the pillars of any successful automated testing practice, techniques that can fit into most (if not all) team contexts.

But some things have either been left out or are described in ways that are disadvantageous for the industry, simply because we haven't talked about them from the perspective of the fundamentals used to craft them.

By reasoning from first principles, we can unearth a more impactful definition of test automation that can act as a compass and help supercharge everyone's test automation practice – while also understanding how to navigate the uncharted waters of new technologies that challenge existing paradigms.

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step towards transforming your and our community's future. Check out our done-for-you awareness and lead-generation packages, and let's explore the awesome possibilities together.

About Joe Colantonio


Hi. I’m Joe Colantonio, founder of TestGuild – a dedicated independent resource of actionable & real-world technical advice (blog, video tutorials, podcasts, and online conferences) to help improve your DevOps automation, performance, and security testing efforts.


Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:00] Joe Colantonio It's been a crazy few months of me running two major online events, Automation Guild 2024 and RoboCon. I've been heads down all year, and I finally have some time now to focus on other things. One of the things I missed from being too busy is that this podcast, the TestGuild Automation Podcast, previously known as Test Talks, officially turned 10 years old on February 18th. So I really want to give a big thank you to everyone who's been listening and supporting the show over the years. I really do appreciate every one of you. And I think this is not only the original podcast dedicated 100% to automation testing, but now also the longest-running one. And I couldn't do it without you. And today, I want to share with you a session we had at this year's Automation Guild. It's from Applitools' Dave Piacente, all about redefining test automation. Dave was actually the second guest ever to be interviewed for this podcast, way back in March of 2014. So I thought it'd be fitting to highlight his session from this year's Automation Guild to help celebrate 10 years of automation awesomeness. Before we get to his session, did you know you can still get instant access to all the recordings that took place at the February event? And you can also get in on the monthly training sessions we're going to have up until November if you join now. And for being a loyal listener, and to celebrate 10 years of automation awesomeness, use the code CELEBRATE10YEARS and get 33% off your Instant Access pass. Grab that ticket and you also get access to our private community as well. Hope to see you in there!

[00:01:32] Dave Piacente Hello and welcome to my talk on redefining test automation. I'm Dave Piacente. So let's get started. I think before I get too far, it's worth telling my story just real briefly, for those of you that may not know me. I've been in test automation since around 2009, either attending or speaking at conferences and meetups. And I did that until about five years ago, when I took a job as a software engineer working at Applitools on the research and development team. And after a couple of years, I realized that I wanted to focus on just being heads down as an engineer. So I took a break from the public spaces and the public speaking, while still working within test automation, helping build products at Applitools. So I was still well-versed in what was going on in the industry, but I was not really paying attention to the talks that were happening. And then more recently, I moved out of R&D and engineering specifically and into this new role where I focus on developer advocacy and community-focused activities. And so I started paying attention again and getting up to speed on what people are talking about in the industry. I figured, well, it's been some time since I've been doing this, I'll step back in and it'll be like jumping ahead light years, right? Five years is a long time. And what I realized is that we're still largely, as an industry, talking about the same things, and I was a bit surprised by that. And so I thought, well, maybe this could be a good opportunity to talk about where my thinking was. And in that vein of the word opportunity, it's worth mentioning that I think there is actually a big opportunity, and that's for us as an industry and as a large community of practitioners to collectively, collaboratively, and clearly define what we think test automation is, because where I'm coming from is this: a lot of problems have already been solved.
And I think that there are still people discussing nuances of the solutions or new facets of the solutions. And I think that's great. That's fine. And I don't mean to disparage anyone's prior work. What I do think, though, is there is an opportunity for us to level set across the industry in almost a standards-based way: hey, as practitioners, this is the fundamental baseline that we agree people who are wildly successful in test automation do. And it doesn't have to be specific to context or vendor. I think there's a way to do it in a vendor-agnostic, largely context-free way, where we can actually distill all of that, crystallize it, and put it into something. And so that's the main thrust or motivation behind this talk from my perspective. And I think that there is some prior work we can look to, to help put some guardrails up and make the path a little more clear, that we can pull from to show us a way forward and help spell out a bit of what I'm thinking. So some examples. These are a bit disparate, and maybe don't make sense yet, but they will, I think, in time. There's this person, Coach John Wooden. I'm going to explain these after I mention what they are. There's also this book called A Pattern Language. There's something most of us have likely heard of called the Agile Manifesto. And then there's an offshoot, which is less known by some, I'm assuming, called the Software Craftsmanship Manifesto.

[00:05:40] Dave Piacente Let's step through these. Coach John Wooden. So there's this link to a New York Times article, but basically, there's a lot that's been written and said about Coach John Wooden. He's a really well-known basketball coach, at least in America, at least for people who care about college sports. Over his time coaching at UCLA, he won 10 championships in 12 years with that team. And he has a whole canon of content and beliefs around this idea of building a pyramid of success, and how you have to start at the bottom to build the foundation, and then you can go up from there. And the most often cited piece of that whole concept, the thing that's talked about and written about, even anecdotally, is the concept of tying your shoes. Really starting from the very, very, very basics: okay, he's working with a new player on the team. They're putting on their shoes. What's the first thing they should do? They should tie up their laces. And in this article they actually quote his words about the different tightness of the laces, saying: a little tight? Okay, then maybe you won't get so many blisters, so you won't miss a couple of games. You'll be able to have a little more control and traction on the court. Make it even tighter? Well, maybe now you won't sprain an ankle, and then you won't be out of the game for a couple of weeks. And so just thinking about that in the context of test automation, as we're getting into this conversation of, let's articulate what a baseline is, I think starting from fundamentals is really interesting. And I just want to plant that seed: fundamentals.

[00:07:26] Dave Piacente Now, the next example, A Pattern Language. To borrow from Wikipedia, here's a quick blurb. It says the book creates a new language, what the authors call a pattern language, which is derived from timeless entities called patterns. They describe a problem and then offer a solution, and in doing so, the authors intend to give ordinary people, not only professionals, a way to work with their neighbors to improve a town and neighborhood, design a house for themselves, work with colleagues to design an office, workshop, or public building such as a school, and so on. And I think this is a fascinating concept: creating an approachable language to describe the building blocks that go into something very technical, something that typically requires a very specialized person, and then enabling collaboration with others around you who are not in that same profession to design a common space or a common outcome. I think that's also interesting as we talk about trying to articulate and define what we want as practitioners. Now, let's talk about something we probably all know, the Agile Manifesto. We flash back 23 years, to a cold February Sunday that kicked off a few days in which 17 software engineering practitioners met at a ski lodge in the mountains of Utah to find common ground about what mattered to them when it comes to approaching their work. And the byproduct is the set of values in the Agile Manifesto, which has largely fueled a huge chunk of our industry, if not nearly all of it, especially those that focus on iterative software development. So let's just step through these real briefly here. The values being: individuals and interactions over processes and tools. Working software over comprehensive documentation. Customer collaboration over contract negotiation. And responding to change over following a plan. That is, while there is value in the items on the right, we value the items on the left more.
I think it's fascinating, and a great way to frame it. I don't know if this is exactly what I have in mind, but I think there are pieces of it that are fascinating and helpful to take with us as we talk about creating a better definition. Now, if we look at an offshoot of the Agile Manifesto, we have something that came a handful of years later, around 2009, I think: the Manifesto for Software Craftsmanship. And the motivation behind this is to help create a movement around quality and professionalism for software developers. It's based on four principles, which focus on a mixture of software quality and people, both developers and customers. And so it says: not only working software, but also well-crafted software. Not only responding to change, but also steadily adding value. Not only individuals and interactions, but also a community of professionals. Not only customer collaboration, but also productive partnerships. So, it's a similar structure to the Agile Manifesto. There's just a handful of statements, and it's a focus on one side versus the other. That is, in pursuit of the items on the left, we have found the items on the right to be indispensable. So they took the same format but flipped it.

[00:10:50] Dave Piacente Now, I think that both of these are great, and a great starting point. Focusing on value, I think, is super important. But what I'm also hopefully articulating here, and coming to a conclusion and hopefully inspiring you all toward, is to maybe do something like this, but with some additional things. So we're going to "yes, and" this. And with that in mind, let's help create a perspective. Hopefully, all of these elements combined, if you hold them in your mind and treat them as inspiration and a lens to look through, can help inform a path forward for us. I like the idea of a manifesto, but I do want something more concrete and actionable. So let's think about how this can be applied to test automation. Really, the outcome that I'm looking for, starting with the end in mind, is to start from the ground up in terms of the things we care about. I want to find a common language that we can all agree on to describe the foundational elements that lead to success in test automation. So not just values, but also the more general attributes and practices that wildly successful teams all have. And I want to write them down in a living document. And so, some examples, at least on the value front, to get started. I think we should focus on business value over test coverage. The idea of just focusing on 100% test coverage, or 100% browser and device coverage, loses sight of why you're doing it; it's really about delivering business value. And I think: being the conscience of the team over sticking with the status quo just because things have been done a certain way. We should always look for ways to work smarter and reinvent ourselves, and not just work hard to get something done because this is the way it's always been done, or because somebody says it needs to be done this way.
Understanding the underlying motivation and trying to be as effective and efficient as possible at the same time, I think, is a bit of a superpower we have if we can focus on being the conscience of the team. And then: checking that things work for customers instead of "works on my machine." Sure, you can run tests headlessly, and that's great, but customers don't run things headlessly. That's just one example, but it's the conversation where you want to get as close as you can to parity with the experience that customers have, to make sure you are actually checking things in a way that gets you as close as possible while balancing efficiency and cost. And then there's no comparison point here, just: fast and reliable feedback. Full stop. At the end of the day, that's ultimately what we're trying to do. We're trying to balance being technical practitioners while also being like business analysts, offering an efficient blend of those two perspectives and creating feedback loops so that we can inform all the relevant stakeholders and make sure they're aware of when something does work and when something doesn't. We're trying to build high trust, but build it by creating a system or set of systems that keep people informed about the health of what's actually being built, both in terms of code in isolation and applications in flight, all interacting and working together. And so, zooming in a bit, some examples to get the conversation started around more general attributes. I think this is where the conversation starts to get more interesting. Some examples: team and culture, by which I mean skills, commitment to quality and testability, and picking the right tools for the job. A robust test framework with simple authoring, aligned with team skills, that is maintainable and reusable.
Stable and scalable test infrastructure: fast execution at scale, across browsers and devices, with stability baked in. Reporting and analytics: easily identify flaky tests, see results in real time, with efficient selection and prioritization of tests. And test data and isolation: the ability to mock services and systems, generate test data, and create deterministic outcomes with minimal dependencies between tests and systems. Now, as you hear this list, hopefully you can think: oh, are these all the tenets of what I do on my team? Or think of teams and other practitioners that do things really well. I'm trying to think in terms of the fundamental pillars that prop up the success of a team, and in a way that is, again, vendor-agnostic and largely context-free. In my experience with the teams I've consulted for, the colleagues I've talked with, the customers I've interviewed for Applitools, the list goes on, I think these are fundamental, table-stakes things that everybody ultimately practices and has at their disposal. Which is not to say that they're easy or even straightforward, but I think that they are true. Team and culture is probably the hardest one to get right: getting the right people, the right motivation, the right mindset, buy-in from leadership, funding, and support, whether for education and focusing on open source, or for paying for licenses for commercial off-the-shelf tools, or some combination of the two. And that takes time to grow and nurture. But I think the teams that have done the most wildly successful things with test automation have it, and that's why I put that one first. The other stuff is also still relevant and necessary.
Some kind of robust test framework: you can't do test automation if you don't have a means for people to author tests simply, something that actually meets them where they are, creating building blocks for tooling that are ideally in the language they're already working in, or, if it's not something they're familiar with, providing good on-ramps for them. And in a way that's also maintainable and reusable, so it's easy for people to keep working, and a problem gets solved once instead of everyone else having to solve it as well. And then test infrastructure being stable and scalable is wildly important: the ability to scale horizontally and vertically. Horizontally being that you want tests to run as fast as possible, so you scale out across as many browsers and devices as possible in parallel; vertically being across different platforms, operating systems, browsers, and devices. And then test flakiness, being like the bane of our existence, with the ability, as we're seeing with new platforms emerging, to bake in some reliability; self-healing locators being one fantastic example of that. And then reporting and analytics: being able to easily identify flaky tests and potentially even quarantine them out to triage them, so that they don't introduce doubt and erode trust in your solution. Amazing. Seeing results in real time, with the ability to efficiently select and prioritize tests and understand what's going on at a glance, is also super helpful. And then, of course, when something goes wrong, being able to dive in and understand what's happening. And then test data and isolation largely deserves its own discussion because it's such a massive topic: the ability to do what you need to do to simulate a situation.
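The flaky-test identification described above can be made concrete. Here is a minimal sketch, not anything specific to Applitools or any vendor, of detecting flakiness from run history: a test whose outcome varies across runs of unchanged code is a quarantine candidate. All names here are illustrative.

```python
# Sketch: flag tests with inconsistent outcomes across runs of unchanged code.
from collections import defaultdict

def find_flaky(run_history):
    """run_history: list of dicts mapping test name -> bool (True = passed)."""
    outcomes = defaultdict(set)
    for run in run_history:
        for name, passed in run.items():
            outcomes[name].add(passed)
    # Flaky = observed both passing and failing with no code change in between.
    return sorted(name for name, seen in outcomes.items() if len(seen) > 1)

history = [
    {"test_login": True, "test_search": True},
    {"test_login": True, "test_search": False},
    {"test_login": True, "test_search": True},
]
print(find_flaky(history))  # ['test_search']
```

Tests flagged this way could then be tagged (for example, with a custom pytest marker) and excluded from the main suite while they're triaged, so they stop eroding trust in the signal.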
If you catch a bug and you understand that it involves some orchestration between different services or some specific data, then you can change the shape of the data as needed and really introspect into what you need in order to test the different scenarios that a customer runs through, by generating test data. And being able to test in isolation means you can actually hit that other mark, the scalable test infrastructure, because test isolation is really important if you're going to scale out test execution. And so, this is just my list, but I really want to know what's on yours. I would love it if we could start the conversation, at least in the simplest format, which is to email me. So send me an email with the subject Redefining Test Automation to Dave.Piacente, and let me know what's on your list. That'd be great.
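The isolation idea above, stubbing a downstream service so a test controls its own data and gets a deterministic outcome, can be sketched with Python's standard `unittest.mock`. The service, function, and data names are invented for illustration; nothing here comes from the talk itself.

```python
# Sketch: isolate a test from a downstream service by stubbing it,
# so the test shapes its own data and the outcome is deterministic.
from unittest.mock import Mock

def order_total(inventory_client, items):
    # Unit under test: prices normally come from a live inventory service.
    return sum(inventory_client.price(sku) * qty for sku, qty in items)

def test_order_total_with_stubbed_service():
    inventory = Mock()
    # Shape the test data: price() answers from a fixed table, no network.
    inventory.price.side_effect = {"apple": 2, "pear": 3}.get
    assert order_total(inventory, [("apple", 2), ("pear", 1)]) == 7

test_order_total_with_stubbed_service()
```

Because the test owns its data and has no dependency on a shared backend, many such tests can run in parallel, which is exactly what makes the scalable-infrastructure pillar reachable.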

[00:19:58] Dave Piacente I think it's worth talking about clarity. When we think about test automation in a very specific way, about what we value and what attributes we've seen matter for teams that are successful, and we all distill that down into a common definition that we largely agree on, I think that gives us clarity around the foundation of what our practice is. It then becomes a lot easier for us to assess the value and the efficacy of a new technology. And I want to speak about that briefly in the context of generative AI, or anything new that comes after it. I think it's worth asking some questions about any new technology, to try to keep the conversation grounded and realistic. First, the simplest one: does it solve a real, meaningful problem? Or is it just something everyone's talking about, with a lot of hype and really high expectations, where, when you actually get to seeing and using the technology, reality has a mismatch with the hype? It's very easy to get wound up and excited about the hype, or about anything new even if there's no hype. So just understand: what are the core tenets? What are your problems, and does it solve them in a meaningful way? Does it change the way that you work? Is it a new technology that fundamentally changes the paradigm of how you work? Are there certain things on that list we talked about a couple of slides back that maybe you can do a lot faster, or maybe don't even have to do anymore? Well, then that's super interesting. But also think about it from the perspective of the side effects and ripple effects. Does it create new problems? That's worth considering, even if it makes things immensely faster.
Maybe it creates a lot more work product that requires a lot more insight to quickly discern. Maybe it makes one thing faster and another thing much slower. It's hard to know. Does it work securely and reliably? If it's a public LLM, you're not going to want to dump your secure, domain-specific business secrets into it. And does it work reliably? We've seen hallucinations with ChatGPT, for instance. These things get better over time; maybe we'll see private LLMs become easier to stand up, more commoditized, easier to train, and far more reliable. These things can happen. Again, these are just questions to ask to understand the context that you're in and keep the conversation grounded. And also: when things break, is it clear what steps need to be taken? Do you have the right people? Do you have the resources you need? Is it something you can reason about? Because if you come to blindly trust the technology, once it stops working, what will you do? For instance, a quick side-quest example. Say you're trying to go somewhere you've never gone before, in a part of town you've never been to. All of a sudden you have no internet, you can no longer use Google Maps, and you don't have a dedicated GPS in your car. What do you do? I think we have become so reliant on some technologies that we blindly trust them, to our own detriment when they stop working. And so I think we should take that example and that mindset into assessing these new technologies: even if something works and solves problems really well, we need to make sure we understand the risks associated with it.
And one more thing I want to point out: there's a thing called Conway's Law that I think is very relevant and salient for this point in the conversation. Just because we have new technology, and, let's wave a magic wand and say it works, say it solves all the problems, makes our jobs fundamentally different, and companies adopt it, that doesn't solve the people problems. And the fascinating thing about Conway's Law, as Wikipedia puts it, is that it describes the link between the communication structure of organizations and the systems they design. Specifically, organizations that design systems are constrained to produce designs which are copies of the communication structures of those organizations. To put it a different way: the technology that people in organizations come to build and use will mirror the behaviors and communication structures of those organizations. So just because you magically say, here is this tool that makes us fundamentally, like a thousand times, more efficient at test automation, and removes a lot of the things we have to pay attention to, if it doesn't solve the dynamic where developers and testers throw things over the wall to each other, there's still going to be huge friction. There are these huge people problems lying underneath all of the technology. Technology is not going to magically solve everything. There will still be problems, and a lot of them will largely be people problems. So it's worth putting that into the conversation as well, just as a side note. Now, I'd like to open it up for questions. I'd love to hear from you. Thanks for listening.

[00:25:48] Joe Colantonio Thanks again, Dave, for your automation awesomeness. For links to everything of value we covered in this episode, head on over to Test. And while you're there, make sure to register for Automation Guild to get instant access to all the awesomeness that went on at this year's event, and get 33% off by using the code CELEBRATE10YEARS. So that's it for this episode of the TestGuild Automation Podcast. As always, test everything and keep the good. Cheers.
