Awesome Software Testing Strategies with Matthew Heusser and Michael Larsen

By Test Guild

About This Episode:

I'm thrilled to have two titans in software testing, Matt Heusser and Michael Larsen, with us today. These veterans, with their wealth of experience and knowledge, are here to discuss their latest contribution to the testing community, their new book, “Software Testing Strategies.”

In today's episode, we will unpack the inspiration behind “Software Testing Strategies,” exploring the trio of testing essentials: skills, strategy, and the nuances of day-to-day operations, including the politics that intertwine with the testing process. The authors will discuss their approach to addressing the complexities of software testing, finding the most effective tests among endless possibilities, and how their book aims to guide you through these challenges.

Matt and Michael will also share critical insights into organizational dynamics, the value of adaptability in the testing realm, and the ethical considerations professionals face in their careers. Plus, we'll touch on the difficult journey of updating outdated systems, navigating the minefield of communication, and why terms like “QA” may need a rethink.

Listeners, you're in for a treat, with real-world stories, practical advice, and invaluable expertise that's just a discount code away – so stay tuned as we dive into the world of “Software Testing Strategies” on the TestGuild Automation Podcast.

About Matthew Heusser


Matthew Heusser's career spans over three decades, during which he has been at the forefront of software testing leadership. In 2014, he was honored with the Most Influential Agile Testing Professional Person Award at Agile Testing Days in Potsdam, Germany, and the following year, he received the award for the most popular online contributor to Agile at the Agile Awards in London. His company, Excelon Development, made headlines in 2023 as an INC 5000 award recipient, marking it as one of the fastest-growing privately held companies in the United States. Matthew's dedication is evident through all the contributions he has made over the years as a speaker, conference organizer, consultant, and writer.

Connect with Matthew Heusser

About Michael Larsen


Michael Larsen has been working in software testing, one way or another, since 1991. He's worked with various companies and products over the years. He lives in the San Francisco Bay Area and has advocated for quality since launching his TESTHEAD blog in 2010 (https://mkltesthead.com/). He has served as a member of the Board of Directors and President of the Association for Software Testing (AST), and as the Marketing Chair for the Pacific Northwest Software Quality Conference (PNSQC). And he is an awesome singer.

Connect with Michael Larsen

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:25] Hey, it's Joe, and welcome to another episode of the Test Guild Automation Podcast, and you're in for a super special treat today. Today, we have Matthew Heusser and Michael Larsen sharing all about their new book, Software Testing Strategies. Matt's been on the show way back in 2015. And Michael, I've been trying to get him on the show for a while and haven't been able to until now, so I'm really excited to have him. As you can tell, these books are ready to fly off the shelves. Definitely after the podcast you're going to want to pick up a copy, and you'll want to stay all the way to the end, because we're going to have a special discount code for you. If you don't know, Matt's career spans over three decades, during which he's been at the forefront of software testing leadership. In 2014, he was honored with the Most Influential Agile Testing Professional Person Award at Agile Testing Days, and the following year he received the award for Most Popular Online Contributor to Agile at the Agile Awards in London, and his dedication is evident through all the contributions he's made over the years. If you go on YouTube, he has sessions all over the place; he's been a speaker, conference organizer, consultant, and writer. And Michael Larsen, same thing. You've probably heard of him. He's been around since at least, I think, his blog, TESTHEAD, launched in 2010. He's worked with various companies and products over the years. He lives in the Bay Area, and he's always been really into software testing, quality, and also leadership. He has also served as a member of the Board of Directors and President of the Association for Software Testing, and as Marketing Chair for the Pacific Northwest Software Quality Conference. And if you don't know, he's an awesome singer. Maybe we'll have some links to that as well. You don't want to miss this episode. Check it out.

[00:02:09] Hey, guys, welcome to The Guild.

[00:02:13] Michael Larsen Thank you very much.

[00:02:15] Matthew Heusser It's great to be here. Thanks, Joe.

[00:02:17] Joe Colantonio Oh, it's great to have you both. Like I said, Matt, last we spoke was actually in 2015, so it's great to have you back. And, Michael, we just keep missing each other. I think I've been trying to get you on the show for 10 years, so it's awesome that Matt finally roped you in. So thank you for that.

[00:02:31] Michael Larsen Oh, good. Glad to finally be here. I know it's been back and forth. I think the last time that you and I directly interacted was when we went all virtual.

[00:02:39] Joe Colantonio But it's great to have you on officially, because, like I said, I've been waiting for this for a while, so thank you. I'm really excited about this episode because, as you can tell, I love books, and so I was excited when I found out you have a new book out. So today I just want to focus basically on your new book, which is on software testing strategies, and obviously we'll go wherever the conversation takes us. Before we get into it, I always ask authors this. If you could pull up that book, let me get a good look at it so people can check it out. The first question I always ask authors is, why write a book? I know you both are super, uber busy. So let's start with you, Matt. Why write a book? I know you've written a few before and edited a few, but why this one?

[00:03:19] Matthew Heusser The short answer is somebody called me and said, we want you to write a book. And that's very different than kind of pushing out, like, I've got a book proposal, would you please pay attention to me? Michael and I have been working together for at least 13 years, and we've got a lot of experiences, and those experiences have got to come out. You end up either doing a conference talk, or you walk on the side of the road waving your arms and talking to yourself, or maybe you do a blog post, but there's no single place to go for all of the material. The other interesting thing that started to happen was, we would go to conferences and say, I'm the doctor, let me solve your problems. It's very satisfying. But where can people go to learn all this stuff? So someone challenged us to make the Lean Software Testing course. We made the course, then coronavirus happened, and some people don't have a travel budget, so how can we spread it even further? And that's the book. So I think it's about three reasons why. The timing was finally right, and after being around for a while, I think we actually had something new to share.

[00:04:40] Michael Larsen Through 14 years of podcasting, we had come across a whole lot of conversations. We had talked to a bunch of people, and we realized that there were variations on themes. When we got right down to it, there were three big areas that, if we were going to talk about testing, we would probably want to cover. The first area, of course, was your testing skills, and automation of course fits into that. But there's a lot more to testing than just automation, as we well know. In fact, in some cases I will be blasphemous and say that automation might be one of the easier skills to pick up. And I don't mean that automation is easy. What I mean is that it's fairly cut and dried when you get right down to it: you can program certain things, you can have certain expected responses. Once you have a really good feel for what's happening with something and you know where you're going with it, then you can get into automation. And also, my view of automation is that there's a lot more to it than just firing up your IDE, pulling up something like Playwright or Selenium, and writing assert statements. Anything can be automation. When you're setting up the bulk of your stuff for your environment, all those little niggly bits you've got to deal with just to get the app to run in the first place, you realize, oh yeah, I've got to do this, I've got to do that, got to make sure this loads, got to make sure I've got this running in the background, all this stuff. And once you realize what those are, you want to make sure they're running. You throw those in a script file, you throw them in a startup file so that when you first fire up your computer, everything's in place. Fantastic. And you realize that over time, those are the things you're always going to have to do. We're going to talk about this a bit when we come to test strategy, and the phrase that we refer to is recipes. You use a lot of recipes when you first want to cook, but of course, once you get a lot of that under your belt, you start to play with flavor profiles rather than following a recipe line by line. So that's part of it. There are skills that you learn; that's the first section. The second section is the strategy that you want to utilize, and being able to plug those pieces together as you need to and communicate what you're doing. And then the third part was just the actual day-to-day details of what we do as testers. We realized through the podcast that a lot of that was what we referred to as the politics of testing. By the time we had looked at all this, we said, there are some interesting areas here, and I don't think we've really seen a lot of talk about the politics or the political side of testing. I don't mean it the same way that we look at it from a government standpoint, but the way that we live in and interact with people as software testers, as software engineering professionals, however you want to fill it in, you're going to get differing reactions from people. Sometimes you're going to have great experiences, sometimes you're going to have dreadful experiences, and knowing how to navigate those is really important. So there you go. That's a big, long spiel for what got us into doing it.
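As a concrete, entirely hypothetical illustration of the "throw it in a script file" recipe Michael describes, a setup script might look something like the sketch below. The service names, ports, seed-data script, and app module are all invented for illustration, not taken from the book; the idea is simply to check that the background dependencies are up, load the data the app needs, then start the app under test.

```python
"""A hypothetical environment-setup 'recipe'; every name here is made up."""
import socket
import subprocess
import sys


def port_open(host: str, port: int) -> bool:
    # Treat a dependency as "running" if something is listening on its port.
    with socket.socket() as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0


def main() -> int:
    # Step 1: make sure the background services are up before anything else.
    for name, port in [("postgres", 5432), ("redis", 6379)]:
        if not port_open("localhost", port):
            print(f"{name} is not listening on port {port}; start it first.")
            return 1
    # Step 2: load the seed data the app needs just to come up (hypothetical script).
    subprocess.run([sys.executable, "load_seed_data.py"], check=True)
    # Step 3: launch the app under test (hypothetical module and flag).
    subprocess.run([sys.executable, "-m", "myapp", "--env", "local"], check=True)
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```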

[00:08:09] Joe Colantonio Awesome. So once again, 16 chapters, broken up into three parts. The first one is the practice of software testing, the second part is testing and software delivery, and the third is practice and politics. I'd like to dive into each one, just to give people a little taste, a little flavor, so they know, ooh, that's tasty. Like you mentioned, very flavorful. It sounds like you have to grab the book to actually know how to make the recipe and then make your own flavor profiles. I guess with all that background, you mentioned automation is kind of the easier thing, but I assume (I'm not saying we're old, but we are older) we probably already had testing skills or QA skills before we actually got into the tooling. It seems nowadays people start off with tooling, but I don't know if I'm being a grumpy old man. Is that why you started off with this particular chapter on testing and debugging, or what's this all about?

[00:08:56] Matthew Heusser One of the things that we can do, which I don't think is particularly difficult to learn, is go into a project that has quality problems and just start testing and finding defects on day one. We've got reference customers for that. We come in, and they say it takes six weeks to get up to speed, blah, blah, blah. Okay, well, here are some defects. Quick attacks was a term for one of the ways to do that. It's actually a rich form of testing, but a lot of people don't know it. We're going to start doing this right now, and you might find some value. And then from there we kind of work backwards into, here's how to analyze the combinatorics of the system to figure out some more test ideas. It's really interesting, because there's an infinite number of possible tests. In fact, with things like memory leaks and race conditions, you could run the same test five seconds later and it might actually have a defect it didn't have before. There's an infinite number. And that's even before you get into five of these and four of these and seven of those, nine of those, eleven of those, an infinite number of combinations. What the techniques actually do is reduce your number of things to do in order to generate some information about the system. But people tend to think of test techniques as a way to come up with things to do. Either way, we want to come up with very, very powerful things to do to learn information about the system that we're working on. If you don't know what you're going to do, and you certainly don't have a way to find out the most powerful things to do, and you don't have a way to take the information that comes out of that and turn it into knowledge about the system, then what the heck is your automation even doing? Automating what? All right, we're going to click this and we'll get an answer of seven, okay. What does that tell us? Why did we pick those things to do? Because they were in the acceptance criteria. Why are they in the acceptance criteria? Once we've run them, what do we know? If you can't answer those questions, you need chapter one. Now, what I would say before we go on: if you're a more senior type and you're getting this book, and you say, I've been around, Matt, I've read a book before. Great, fantastic, we love you. We genuinely believe this pushes the needle forward on software testing, but you can probably go through the table of contents and find the things that are either most interesting to you or new to you, and it'll be a gold mine of value. If you want to read it from cover to cover, as Tim Weston, a really, really senior tester, said, no, I'm going to read the whole thing, cover to cover, fine. But it's going to be more like a coal mine: you're going to get nuggets that are of some value all the way through. Either way is great.
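To make the combinatorics point concrete, here is a small, hypothetical Python sketch; the parameter names and values are invented, and greedy pairwise selection is just one common reduction technique, not necessarily the book's. Four parameters with three values each give 81 exhaustive combinations, while a subset that still covers every pair of values typically needs only around a dozen tests.

```python
from itertools import combinations, product

# Hypothetical test parameters; every name and value here is invented.
params = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "os": ["Windows", "macOS", "Linux"],
    "account": ["free", "pro", "admin"],
    "locale": ["en", "de", "ja"],
}

names = list(params)
all_tests = list(product(*params.values()))
print("exhaustive combinations:", len(all_tests))  # 3 * 3 * 3 * 3 = 81


def pairs(test):
    # All (parameter, value) pairs exercised together by one test.
    return {((names[i], test[i]), (names[j], test[j]))
            for i, j in combinations(range(len(names)), 2)}


# Greedy pairwise reduction: keep adding the test that covers the most
# not-yet-covered pairs until every pair has appeared at least once.
uncovered = set().union(*(pairs(t) for t in all_tests))
suite = []
while uncovered:
    best = max(all_tests, key=lambda t: len(pairs(t) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print("pairwise-covering subset:", len(suite))  # typically around 9-12 tests
```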

[00:11:41] Joe Colantonio You mentioned, from a customer's point of view, things they may do and may not do based on what the developers think. And in the book, you have a few chapters that go over programmer-facing testing, then customer-facing testing, and then specialized testing. Why break those out as different chapters rather than have it all under the umbrella of testing?

[00:12:05] Matthew Heusser We've got to break it out somehow. You want to have chapters in your book, right? But more importantly, I think that the ideas that cause you to be successful in one of those arenas will cause you to fail in another. And if you look at some really wonderful, smart, articulate leaders in the community that seem to have no ability to talk to each other, it's because they have assumed one of those perspectives and then assumed the whole world is like that. Hold on a second, I don't think we've talked about reliability, modeling reliability. If you're thinking about system reliability: if your system is a combination of little pieces and each piece has 80% reliability, and a walk through the system is going to touch 12 pieces (presupposing you could reliably put a percentage on reliability; it's a model), your overall reliability is going to be 0.8 to the 12th, which means it's not going to work. So you need to gradually improve that, and one way to do that is with developer-facing unit tests. You get each piece to 90, 95, 99%. At 99% reliability, your whole system is going to be pretty good. The classic testing way to do it was to put everything together, which didn't work, and then try to test quality at the system level. That was suboptimal, I guess I would say. But take the unit-test way of thinking: identify the inputs, identify the outputs, identify the transformations, write the code to do it, develop seams, test at the seam level. Fantastic, brilliant, wonderful. Try that for customer-facing testing, particularly if you believe in just sort of walking the user journey for your tests: your tests are going to be slow, they're going to be brittle, you're not going to learn much, and they're going to break a lot when you make changes. You need a different kind of thinking when it comes to those other kinds of testing. And frankly, Michael knows a lot more than I do when it comes to things like accessibility and usability, an entirely different skill set. It's only one chapter, but we could at least put enough in there to really know what's different and know where to go for more.
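Matt's 80%-per-piece arithmetic is easy to verify; the lines below are just the calculation he describes, nothing taken from the book itself.

```python
# 12 components in one walk through the system, each 80% reliable.
components = 12
print(f"{0.80 ** components:.3f}")  # about 0.069, so the path almost never survives

# Push each piece to 99% (e.g., with developer-facing unit tests) and it changes.
print(f"{0.99 ** components:.3f}")  # about 0.886
```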

[00:14:35] Michael Larsen Specialized testing. The reason I wanted to get into that is because when you've tested for a while, you can develop the hubris to say, I can test anything, right? Because I'm a tester, put me into anything, I can figure it out. And on a superficial level, that's true. But there is domain knowledge. There are certain things where, if you don't understand the underlying framework, you can get yourself into trouble, or if not necessarily into trouble, you can struggle with being effective. And I had that experience. I went into a business thinking, oh yeah, the software side of things and all that, I could certainly do that. But really, this was a company that specialized in touch screens and touch devices. And what were they testing? They were testing the laws of physics. They were testing capacitance. They were testing resistance. They were testing environmental factors that I struggled with. And it got to the point where one of the people there was basically saying, you know, Michael, I've been looking at your resume. Now, when your manager says, I've been looking at your resume, that's a terrifying conversation to have. And I'm sharing this with full openness; this was a very real time in my life. They were saying, you tested routers because you worked at Cisco Systems, you did a lot of networking-type stuff, you can do that. I don't see that you've worked with this type of scientific physics stuff before. And he was right, I hadn't. And he said, look, I'm not trying to say that I want you out or that this is going to be a bad thing. I just want to counsel you that, honestly, with the number of things that we're going to be looking at and the number of projects coming up here, if you don't have a background in theoretical physics, some of this stuff is going to be very difficult, and you might really find that this job is going to be painful.

[00:16:54] Matthew Heusser I think that's where our two perspectives kind of merge, and hopefully the reader gets a better thing out of it. A lot of times in testing, over the years, I've seen a lot of testers say, we don't do that, we don't do that, we don't do that, with scope management. Like, no, no, no, we just do this little thing here. Well, when you do that, your value is lower. It's really, really easy to do the three things that you do. But look out, because someone's going to come around and make promises to management to get rid of you. We've got fruity DevOps Lucky Charms, and you don't need that team because we're doing A/B split testing in the cloud. And all of a sudden you ain't got no job. And by the way, I'm speaking this way on purpose; I think it's funny, but I'm trying to make a point. What you can say instead is, yeah, I know how to do that, or I can figure out how to do that. You become the person in the office who, when the question is, how are we going to do that? Guess we're going to hire a consultant, those guys charge a whole lot of money. I don't know, but I can figure out how to do that. If we can put enough in the book that you can say, I know what it is, I could figure it out, go get references, I've got contacts, we can at least equip you to tell whether you're in over your head or not. If the thing falls apart when four simultaneous users hit it, we've got a real problem. And if it doesn't, maybe we can take more time in doing this research. I think it's really valuable to have that approach: I don't know it, but I could probably figure it out; I know enough about it to be dangerous. I think that's really valuable. And I think we did that. I think we keep that promise in the book.

[00:18:42] Joe Colantonio I love that about the book, Matthew. So, you mentioned something that makes a good point. When I had a real job, five years ago (I've basically been unemployed for five years), I never waited for people to tell me what to do and say, this is all you do. I would get a book like this, I would learn a technique, I would try it on my own, and then show my team. Is that the approach you recommend as well, using a resource like this? Like you said, it would make you more employable as well, rather than just being pigeonholed by your team. Find ways to add value within your team, because they don't necessarily know what they don't know. And by reading a book like this, you might get the knowledge that a technique is going to work. Yes, we don't do it. Yes, it's not what we do. But if I apply it, I know I'm going to add more value to my company.

[00:19:26] Matthew Heusser I will tell you another story. Part of it is in the book, but I'm going to elaborate on it for our audience. I worked at an insurance company, and it was a relatively large software organization. It wasn't Microsoft, but we had enough teams that we had specialized roles. So there'd be a problem, and the DBAs would say, that's the web devs' problem. Then the web devs would say the production support team needs to work on it. The production app support team would say it's an application development issue, application development would say it's a data warehouse issue, and the data warehouse guys would say it's the DBAs'. And this was not an uncommon problem, to the point that they created a technical project management team just to take responsibility for things and figure them out, because the business-side project managers couldn't; it was like speaking Greek. The technical project managers at least had computer science degrees. It's like, wait a minute, no, no, it's a problem with the index on the database, that's you. In theory. But in practice, we still struggled with that. And one of the things that happened, it's a little embarrassing, a little awkward, but I'll share it. I remember my last day, when I left the company, I was having the goodbye-party kind of thing. My manager, who was a lead, so he had projects he was running, was also kind of celebrating one of the projects he was on. We all went out to eat and he mentioned one particular project, Project Foo. And he said, yeah, I wish I could take responsibility for it, the technical project management team we were inventing was supposed to take responsibility, but I really can't; no one could have foreseen the things that went wrong. And my jaw dropped, because I had emailed him a list of exactly what was going to happen four months before any of it happened. These are the problems. These are the people who are not delivering their stuff. These are the things in the Gantt chart that are blocked. And I do not believe these people will be successful if they continue with this approach. It was detailed, specific, organized. Everything went wrong. What's interesting about that guy, and this is awkward, is he kept getting promoted. He would leave organizations under difficult circumstances, I know in one of them security escorted him out, and he would land somewhere else as a mid-to-senior-level executive, with stuff on his resume that didn't make sense. Stuff on his resume that was, word for word, cut and pasted from someone else I knew at the organization, and he couldn't explain what it meant. I'm not sure how much detail I should go into here, but things like reduced business process defects by 65%. I knew what the other person meant by that; they had a justification. It was kind of smarmy, but I knew what they meant, and okay, there was a 65% reduction in a certain kind of thing that you could count. This guy just cut and pasted it. He wasn't even in the room. But that behavior was rewarded. He got promoted. So the question that I would ask is, are you willing to sacrifice your integrity and morality at the altar of your promotion? Because if you are, don't buy our book; buy the book called Corporate Confidential: 50 Secrets to Succeed in Business Today. Michael and I chose a different path. And I'm not saying we're perfect.
The one thing about the ethics chapter is it sounds kind of preachy. We make mistakes all the time, but we at least give you an ethical framework so you can make decisions and sleep at night, and I think that's worth something. Oh, and the reason I told that story, to zoom way back out, is that there was an organization where you could succeed by saying, we don't do that, that's another team, it's not my responsibility. And his team was created to make it their responsibility. But another way to do it, and the people I know who took this path have stayed there a long time, many of them, is to say, I can figure it out. I'll do it. Yeah, I'll do it. No one else is stepping up. I don't know anything about legacy Oracle 9.7 drivers working with this system, and we've got to upgrade it because that's just ancient, the wheels are going to fall off, and it's not supported in the cloud. But I'll try to upgrade the box and see what happens. And then we'll try to run it, and we'll try to put it in AWS once it's upgraded. I'll do it. The person that takes on that responsibility, even though it isn't a test responsibility long term, we think has better outcomes.

[00:24:46] Joe Colantonio You've also got a chapter on words and language about work.

[00:24:50] Matthew Heusser It oftentimes comes down to this: if you're not saying the same thing, or if you have two different understandings of what something is about, you could be on the wrong path. You could be focusing on the main thing, or what you think the main thing is, and you could have totally different reasons. And mind you, there's a little bit of, if you will, CYA, cover your butt. In some cases, people will be very guarded in their speech, or they'll say things in a way where there's a lot of equivocation. In other words, I want you to do X, but don't dig too deep into X, because if it makes us slip our schedule, or if we have to somehow do something different, that's going to be a big problem for us. So you run into that. We refer to it as weasel words, or, oh, I didn't actually say that, I didn't mean for that to happen. And we give an example of some corporate situations where somebody did do that, where we needed to stay on schedule, we needed to make sure that something happened, and by virtue of that, a very big issue was uncovered, and it was a big scandal. I'm pretty sure it was VW we were talking about in that regard, in regards to their emissions numbers, where they had to roll back and say, we didn't do this right. And, well, I never meant for you to do that, that was never the case. Well, it can't necessarily be proven, but I'm willing to bet the answer is, yeah, that probably was the reason: you were under pressure, you had to make sure that your test passed, and so you did things to make sure the test passed. And that's dangerous. And also, oftentimes when we as the testers are in that situation, if something goes wrong with the quality of something, they're going to look at the test team first. It's the old phrase of the tester getting called on the carpet: why did you let that bug go through? In my younger years, I used to just say, oh, I'm so sorry, I can't believe I missed that, I'm such an idiot. Later on I said, I don't know, why did your programmer put it in? No, no, no, we're not playing this game anymore. I'm not going to let you point at software test and say we are the reason this bug went out. We're part of it, but so are your programmers, so are the people who work here. So, for whatever reasons it comes to this, it's important to have those conversations. Yeah, that is part of the politics of testing.

[00:27:43] Michael Larsen One thing that I think is worth talking about is this, and maybe you disagree, Joe. I think it's really important, and we kind of picked a fight with this in the book. We like the Hegelian synthesis: the idea and the counter-idea, then working them together to find a third that's even better. We don't use the term QA; it has all kinds of problems. And yet I'm totally fine with you using the term QA. In fact, you might disagree, but when people post on LinkedIn, it's impossible to do quality assurance, how dare you use the term QA, I say, I do QA. I do it all day long. My company does it. We'll do it for you. We're just very expensive. I joke; we're actually pretty darn competitive on pricing in the U.S. for testing services. But we can do QA, because people haven't really looked at the dictionary definition of assurance. They don't really look at it. When I was married, if my wife said, what time are you going to be home? I could say 5:00. And she'd say, no, there's a roast in the oven, the neighbors are coming over, I really want to know what time you're going to be home. You need to provide me assurance. None of this five, ten minutes late stuff. What time? I have the food coming out of the oven five minutes after you get home; I need to know when to put it in. I need your assurance on when you'll come home. Now, if I said 6:00, I'll be home by 6:00, I assure you, I have assured her I'll be home by 6:00. That's the English definition of the word assure. If my car has a flat tire and I have to call Triple A, and it takes longer than we thought, and I get home by seven, is anybody going to point their finger in my face and say, you aren't really providing assurance, you can't claim to provide assurance because you can't guarantee that you'll be home on time? Well, that's not what assurance means. I assured her. She's going to say, I'm so sorry, that's so terrible, I hope it wasn't too cold for you out in the snow while you were waiting for the tow truck. Everything's going to be fine. Everyone is going to understand, right? When we say testing is complete, or we've allocated this much time for risk mitigation, these are the things we found, and if we had more time, these were the things we'd look into. Do you really care about these risks? Given that these are the things we think are the highest risk and these are the things we found, and you want to send that software out anyway, does it really matter that we keep testing? That's a very mature conversation. We'll teach you how to have it. And you can say, I'm fine with you saying I provide quality assurance. Now, there may be times when you say, okay, the decision is to put this out, I think it's hot garbage, I can't assure you that it's quality because I don't think it is. You get to do that too. But if you want to use the term QA, I don't have a problem with that. We provide a theoretical justification for it in the book. I'm happy to have that conversation with anyone, anytime. I think it'd be a good debate at a conference.

[00:31:03] Joe Colantonio Absolutely.

[00:31:03] Michael Larsen I changed my mind by actually, I don't know, looking up the definition of the word assure and really doing some work.

[00:31:11] Joe Colantonio I love this book. It's full of models, full of patterns, real-world examples, case studies. As people can tell by hearing this, you definitely need to pick it up. All right, guys, before we go, though, any parting words of wisdom? Where can people pick up Software Testing Strategies?

[00:31:26] Matthew Heusser The book is on Amazon. If you go to the Packt website, look for it, and use the code STS25, which runs through the middle of March, it might be the end of March, we'll get it extended, you can get 25% off. The e-book from Packt is cheaper than the Kindle e-book, which is cheaper than the physical book, so if you get it from Packt instead of Kindle and use the coupon, and you really don't want to spend money, you can get it cheap. There are some trade-offs involved; if you get the PDF, some of the links don't work when you click on things. But it's a sweet deal. Let me criticize the testing community for a moment: we love being super precise as a testing community. We say things like, oh, I didn't prove that it works, I'm not quality assurance, I just demonstrated that it can work under precise conditions, one time, so don't come to me when it doesn't work so well. Okay, that's a very weak claim. My middle daughter, when she was in sixth grade and she was sort of exasperated with me, could say similar things, and she's a lot cheaper than most software testers are. We have an infinite number of combinations; we need to select the most powerful tests. I've got a degree in math. In a hypothetically infinite series of possibilities, selecting a finite set tells you nothing, because there's an infinite number of infinities. You can multiply infinity by infinity and make your finite set, no matter how big it is, meaninglessly small. It's impossible. You know nothing. And yet we are tasked with figuring it out anyway. And we can, and we do, and this book will tell you how. I think that is a pretty strong claim, and I'm proud to make it publicly. And if you disagree, let's talk.

[00:33:33] Michael Larsen And I will also add to this: you can get the book pretty much anywhere that physical and digital books are sold, especially online. But I also want to add that Matt and I have set up communities. We've got a Facebook group, we've got a LinkedIn group, and heck, if there's another place you want us to put it together, we'll do it there. We want this to not just be, hey, here's a book, go read it, and that's it. We're here. We're active participants in the community. That's something we've prided ourselves on. It's part of the reason we've been involved in organizations like AST, and myself currently with PNSQC: I want to continue to be part of the community regardless of what work I do in the future, whether or not I do more writing, whether or not I do more testing for an organization, or if I just become a full-time testing consultant in some capacity. Or who knows, maybe I'll put more of my attention into music production and writing or doing other stuff. My point is that this is the work I've spent thirty-plus years doing, and I'm pretty passionate about it, and I'll be passionate about it regardless of whether, like you, Joe, I still have a day job five years from now. I will still want to be involved in this, because this is the stuff that matters to me. We want to have those conversations with you, and we have ways to do that. If you want to join the group on LinkedIn, if you want to join the group on Facebook, reach out to us. Matt and I are both there, and you can find us on LinkedIn pretty easily. And basically the only thing we ask is: show us you've got the book and you're in.

[00:35:09] Matthew Heusser You know what my great disappointment with the Rational Unified Process, RUP, is? From what I can tell, they weren't really interested in transference, right? Here's a bunch of ideas, I wrote a big book. Did it work for you? How did it work for you? What did you have to change? A year later, was it still working for you? What did you learn? Where are the case studies? Where are the adoption studies? I just don't think it's controversial for me to say the people behind the Rational Unified Process weren't particularly interested in that part of the story, and we are particularly interested in that. The next step is going to be: do people pick up these ideas and use them? Otherwise, it's going to be a dusty old book you can get in a really big library if you're lucky, and we want to do better than that.

[00:35:56] Joe Colantonio Absolutely. So definitely check it out, folks. We'll have the link down below along with a discount code. So let's spread these ideas. Obviously, as you can hear from these experts, they're definitely worth spreading and worth using to help others along the way as well. Awesome. Thank you, guys. Appreciate your time.

[00:36:12] Matthew Heusser Thank you very much.

[00:36:12] Michael Larsen Yeah, thanks for having us.

[00:36:16] Thanks again for your automation awesomeness. For links to everything we covered in this episode, head on over to testguild.com/a455. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:36:52] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the FAM at Testguild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider, and you want to offer real-world value that can improve skills or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
Dan Belcher Testguild Automation Feature Guest

Mobile Mastery: Blending AI with App Testing with Dan Belcher

Posted on 04/28/2024

About This Episode: Today, Dan Belcher, co-founder of Mabl and a former product ...

Promotional graphic for a TestGuild podcast episode titled "The Future of DevOps: AI-Driven Testing" featuring Fitz Nowlan and Todd McNeal, supported by SmartBear.

The Future of DevOps: AI-Driven Testing with Fitz Nowlan and Todd McNeal

Posted on 04/24/2024

About this DevOps Toolchain Episode: In this DevOps Toolchain episode, we explore the ...

A podcast banner featuring a host for the "testguild devops news show" discussing weekly topics on devops, automation, performance, security, and testing.

Copilot for Testers, GPT-4 Security Testing and More TGNS117

Posted on 04/22/2024

About This Episode: Have you seen the new tool being called the  coPilot ...