About This Episode:
How well are your engineering teams doing with your Agile practices? In this episode Rudolf Groetz, a test lead and Agile coach, along with David Heitzinger, Head of Agile Engineering, share their thoughts on how to enable your teams to accelerate their Agile transformation. Discover the Agile Engineering Maturity model and apply it to your groups to coach them to achieve better results. Listen in to learn how to coach your teams to Agile excellence.
Exclusive Sponsor
The Test Guild Automation Podcast is sponsored by the fantastic folks at Sauce Labs. Try it for free today!
About Rudolf Groetz
Rudolf has been working in IT for 30 years and is a passionate software tester. He works as an Agile engineering coach in the Test & Test Automation department at Raiffeisen Bank International in Vienna and lives by the motto “Test automation is not an act, test automation is a habit!” In addition to writing professional articles for various magazines, he gives conference presentations and organizes the Vienna Agile Test Automation Meetup, which has more than 1,000 members.
Connect with Rudolf Groetz
- LinkedIn: rudolf-groetz-6955b9160
- Twitter: RudolfGroetz
- YouTube: UCwLC7HIZG-sEM9iXwqpKpqg
- GitHub: groetz
About David Heitzinger
David Heitzinger is Head of Agile Engineering Support at Raiffeisen Bank International (RBI), where he and his Agile Engineering Coaches drive the Agile (engineering) transformation in RBI toward an adaptive organization. Agile engineering methodology is among the key success factors for Agile delivery tribes, and bringing this mindset to the top of the list for product owners and other stakeholders is currently the main challenge at RBI. David's extensive experience as a software developer and architect, together with his long engagement in Agile topics, are the ingredients for tackling this transformation.
Connect with David Heitzinger
- LinkedIn: david-heitzinger-2b0960175
- Website: David_Heitzinger2
Full Transcript: Rudolf Groetz and David Heitzinger
Joe [00:01:58] Hey, guys! Welcome to the Guild.
Rudolf [00:02:01] Hi Joe! Thanks for having us.
David [00:02:03] Hi! Hello, good evening.
Joe [00:02:05] Hey David! Hey Rudolf! Great to have you both. I guess before we get into it, is there anything I missed in your bios that you want the Guild to know more about? David, let's start with you.
David [00:02:13] Yeah, I think you have already mentioned the most important stuff. My name is David, and I'm responsible for engineering in our company. At the moment we are in the middle of an engineering transformation, and we have a lot to do in terms of test automation, testing, and test strategy. It's a huge move that we are planning and carrying out, from a really classical software development lifecycle and classic testing to really coming into this DevOps setting.
Joe [00:02:46] Awesome. And Rudolf.
Rudolf [00:02:46] Yes, so as you already mentioned, I'm an Agile engineering coach in David's department. In addition to that, I am also the Guild lead for our Test and Test Automation Guild. So now maybe you'll think, “Ah, they are working according to the Spotify model.” No, we do not call it the Spotify model. We call it the adaptive organization.
Joe [00:03:11] So why don't you use the Spotify model then? What have you done that makes it different?
Rudolf [00:03:16] It is more or less like the Spotify model, but the difference is… the Spotify model only works for Spotify and not for another company. That is why we say it is better to call it the adaptive organization, which also means that we permanently try to improve our product teams and so on.
David [00:03:42] And if you call it adaptive, you have a loophole so that you can also do some classic stuff on the side.
Joe [00:03:52] So David, you mentioned your organization is going through a transformation, and it sounds like you are really in the middle of it now. I guess before we actually dive in there, I've mentioned the Agile engineering maturity model a few times. Is this your model that you all came up with, or, at a high level, what is this model?
David [00:04:08] Maybe I'll start with a few sentences about the why. As you said, we are already quite far along with our Agile transformation, and we realized immediately that we also need to invest on the engineering side, because you cannot deliver fast and frequently if you don't have your engineering right. But when we realized it, we also said, “Okay, we need to know where to start.” That was the starting point: we said we need a tool where we can assess where the hotspots are, and also try to quantify the measures. We have to quantify the investment. We need a roadmap, and this was the start.
Rudolf [00:04:49] What is the background for this model? Yes, it is our own model, but we started by taking a look around, to grab something from the TPI model, the TMMi, and so on, because it started more from the Test & Test Automation perspective. Then we also had to cover the DevOps part and bring that in. And we had seen other models; one is from Adidas, I think, yeah.
David [00:05:23] Yes.
Rudolf [00:05:23] And one is from another bank. And then we took different approaches from each of these models and created our own questions. In the meanwhile, we have the second edition in place, and currently we are working on the third edition, because we are adaptive.
Joe [00:05:48] Very cool. So when you say adaptive, is it based on a sprint or like every sprint you get feedback, and based on the feedback you build it back into your model?
Rudolf [00:05:56] No, no. It is not tied to a sprint. Every time we come back from a maturity model interview with one of our teams, we sit together with the other Agile engineering coaches. We discuss the outcome of this maturity assessment, and most of the time one coach then says, “Ah, wait. There is one question which we should also bring into our model. Let's discuss this in our retrospective.”
Joe [00:06:31] Nice. So for someone trying to get their head around what this model is, what are the pieces of the model for someone wanting to implement it? Then we could dive into maybe each area.
David [00:06:41] Actually, it's quite simple. It's an Excel sheet with questions, but a lot of them, around 150 questions. And these questions come from different areas: we have clustered them into CI/CD, into testing in general, into test automation, and test (unintelligible). And these questions always come in three categories. The first category is the crawl, or first level. This is where we say, “Okay, this is a must-have; a product must do it this way.” The second level is the walk, where we say, “This is the should-be; the product should do it like this.” And the third level is the run level, when you are really top-notch and you really do the nice things. And we decided to go for an interview, a self-assessment with an interview you could say. We sit together with the team, go through the questions, and the team answers yes or no, yes or no. And that's it. We have experimented with self-assessment using some web form, and with the idea of automatically deriving it out of the systems. But in the end, this situation of the dialog, of sitting together, of discussing the questions, proved to be a really, really good element of this tool, because this is the way we get feedback and this is the way we can transport our ambition. And this proved to be a very beneficial side effect. Through this discussion, we can tell the people what is behind it. Why do we want, I don't know, exploratory testing? And this communication is really something that proved really worthwhile and helped us in spreading our goals and our knowledge. I would value it even more highly than a really detailed outcome in terms of figures and measurability and knowing exactly where we are. Really, this communication was great for us to experience.
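To make the crawl/walk/run structure concrete, the sketch below shows one way such a question catalog and its yes/no interview answers could be represented and aggregated. It is a minimal illustration under assumed names and areas; RBI's actual tool is the Excel sheet and interview David describes, not this code.

```python
# Hypothetical sketch of a crawl/walk/run question catalog and its scoring.
# Areas, question texts, and IDs are illustrative assumptions, not RBI's real sheet.
from collections import defaultdict

LEVELS = ("crawl", "walk", "run")  # must-have, should-have, top-notch

QUESTIONS = [
    {"id": "Q1", "area": "CI/CD", "level": "crawl",
     "text": "Does every commit trigger an automated build?"},
    {"id": "Q2", "area": "CI/CD", "level": "walk",
     "text": "Are deployments to test environments automated?"},
    {"id": "Q3", "area": "Test automation", "level": "crawl",
     "text": "Does a smoke test suite run after every deployment?"},
    {"id": "Q4", "area": "Test automation", "level": "run",
     "text": "Does exploratory testing complement the automated suites?"},
]

def maturity_by_area(answers):
    """Aggregate yes/no interview answers into per-area, per-level scores."""
    totals = defaultdict(lambda: {lvl: [0, 0] for lvl in LEVELS})  # [yes, asked]
    for q in QUESTIONS:
        bucket = totals[q["area"]][q["level"]]
        bucket[0] += 1 if answers.get(q["id"]) else 0
        bucket[1] += 1
    return {area: {lvl: f"{yes}/{asked}" for lvl, (yes, asked) in levels.items() if asked}
            for area, levels in totals.items()}

if __name__ == "__main__":
    # Answers as captured during the interview (True = yes, False = no).
    interview = {"Q1": True, "Q2": False, "Q3": True, "Q4": False}
    print(maturity_by_area(interview))
    # {'CI/CD': {'crawl': '1/1', 'walk': '0/1'}, 'Test automation': {'crawl': '1/1', 'run': '0/1'}}
```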
Rudolf [00:08:51] But there is another important thing. It's not only that it fosters the communication between the product team or the feature team and the Agile coaches or the interviewer. What is also important, what we have seen, is that it fosters collaboration within the team. So it could be that one of the developers says, “Ah, we are doing this well,” and another developer says, “Wait, wait, wait. That is not what I think we are doing, and maybe we should discuss this.”
Joe [00:09:23] Nice. So I guess what I like about this model is that you work for a large organization, a bank. So I assume what happens is you send out these questions to all the different teams, and the different teams are at different levels. And then based on their input, your coaches know, “We need to focus on this for this team, or on this other thing for that other team.” Is that how it works?
David [00:09:40] Yes, exactly. We visit all the teams, have a session, and share the outcome of this questionnaire with the team. We also come up with a bit of a summary where we propose measures and summarize our findings. In the second round, we discuss with the team, and sometimes also with the stakeholders like the product owner, which measures make sense. In the end, it's the responsibility of the team; finally, the product owner has to decide which measures make sense, but (unintelligible) the outcome is that the teams know the hot topics where they should invest. And naturally, we are also then here to help them, to give them advice, and to work together with them.
Joe [00:10:31] So Rudolf, how does this work in the real world? You send out the questionnaire, you get feedback, you understand the teams' needs. You send your coaches out to help the teams. They come up with measures that they'll use. Are those measures then what you use going forward to assess the maturity of each team?
Rudolf [00:10:48] So first, we do not just send it out and get it back. By now, it is not that we initiate this interview; by now, the product teams come to us and say, “Hey, can we do an assessment, because we want to know where we have to improve?” Then we discuss this among the Agile engineering coaches and try to find coaches who maybe have experience working with that team, so we can say, “Ah, this coach knows about this team because he worked with them.” Then we have a meeting, of course remotely now, and we go through the Excel sheet. One day later, we discuss the outcome of the interview with the team, and three or four days later we send it back to the team so that they can discuss it. Sometimes the teams will immediately jump in with, “We want to improve our test automation, so can you support us there?” And from the coaching perspective, we ask, “How can we support you, with teaching, with upskilling?” What we are not doing is acting as a resource pool, where someone can say, “I need a test automation engineer for 20 days.” This doesn't work. But what we are doing is this: it is also possible that one Agile engineering coach goes into the team and works together with their test automation engineer, and this always has an end date. That means the test automation engineer knows that after six to eight weeks the Agile engineering coach is stepping back, and then, “I am on my own.” During that time he gets all the information, support, and help from the coach. And in the meanwhile, we have, I think, two to three teams where we had the second review, where we then discussed all the improvements they made.
Joe [00:13:08] So, David, being the head of Agile Engineering, how do you get people to embrace this? It seems like you already have teams that are actually receptive to it. But how do you get teams, and your company really, to embrace this, to make sure that the survey is being filled out and the teams really are getting benefit from it?
David [00:13:24] I think this is the most difficult thing that you are addressing here, because this is a cultural change; what we are actually telling the people is that you have to do it differently. And then you have teams who have been together already for more than 10 years, 15 years even, and they are doing it quite well. They are working. They are delivering. And then it's really difficult to convince them to do it differently. How do we do this? I would say that we work on all the different levels. We start with the top management, and the top management gives strategic direction and also a bit of, you could say, pressure so that you go in the right direction. We invest a lot in the community. We have the Testing Guild, the community. We organize events and training. Upskilling is really, really important. And we try to connect people, because this creates a pull: if people in one product team believe they are good, and then they see another team and say, “Wow, they do it differently and they are much faster than we are,” that creates a little bit of competition between the teams, and fostering this exchange is really, really important. But I have to say it's not an easy thing. It's a long journey, because you often see a company start a DevOps transformation; they start, and half a year later they are finished. What have they done? They have introduced some new tools, but with a new tool you are not Agile. You are not DevOps. You just have another tool. And it's not that we started yesterday or half a year ago. This journey already started two, three years ago. And it's a long and gradual process to change the mindset.
Joe [00:15:19] Absolutely. And I guess banking is more traditional. I'm just assuming; I used to work in health care and insurance, and I assume it's kind of like banking. A lot of the pushback I used to get was, “We have these 30-year-old systems that don't work with Agile or DevOps.” Do you hear that, Rudolf, and if so, how do you help change that mindset if you do get that kind of pushback?
Rudolf [00:15:41] When we take a look at the Agile coaching competency model, you have different types: you have mentoring, you have teaching, you have technical mastery, and so on. And yes, of course, it is also the job of a coach to change the mindset. And we see teams which have been working on a product for more than fifteen years, which are going in the direction of Agile and say, “Yes, we need this.” Then, on the other hand, there are sometimes people who say, “Yeah, Agile. This new fancy thing where you do not need documentation and you stand together every day for fifteen minutes. We have been doing this for years, but we do not need this name.” And then you need the right coach who can convince them of what Agile is all about, that it is more than only having a fifteen-minute stand-up every day, and this is hard. And what we see in the new world, with teams where we just start with a new product, we have these three new teams which started with a product. And there it is easy, because most of these people are fresh in the company, people who were hired with this Agile mindset, and it is easy to discuss being Agile with them.
Joe [00:17:21] So it seems like you've been implementing the model for a year and a half or two years now since you initiated it. Any learnings? Do you have a before-and-after story of what a team you worked with looked like before and then after the maturity model, after your coaching, and maybe what the results were?
Rudolf [00:17:38] Maybe, David, you can explain the team where we brought the external vendor into being Agile. You know what I mean?
David [00:17:48] Yes, yes, yes.
Rudolf [00:17:50] I think this is a good approach.
David [00:17:51] Yes, this is a good example. It was one of the first reviews that we did, and it was a product where we had an external vendor delivering software that had to be integrated. It was also still being developed by the vendor. And the outcome was really a lot of red. It was really not good. And we immediately defined measures and said, “Okay, here we have to step in and work together to improve.” In the beginning, it was really a little bit of a lost cause: classic vendor, classic processes, their own product development, not able to react to customer demands except after half a year or after a year. And we defined several measures together with this vendor. First, we worked on the deployment, to automate the deployment steps so that we have a pipeline into our environments, beginning from when the vendor delivers its artifacts. Secondly, we created an acceptance test suite for both sides, where we developed test cases on the vendor side and on our side, and this is like the handshake in the pipeline. And we also invested in the collaboration. We started to work in sprints. It's not yet that we deliver every two weeks, we haven't decided that, but at least we are breaking down these half-year or yearly releases into two-month releases where we have a couple of sprints. We still have a bit of waterfallish elements in it, like the last two weeks being acceptance testing and the business doing manual tests, but we have moved. We came down from a release cycle of six months to two months. We work together much better. And what the business, the product owner, the people who pay for the product really liked was that the quality improved for the first time. Because one of the issues was that when the vendor delivered a big release, a lot of defects popped up: missing requirements, requirements not really understood, stuff that didn't work anymore. And it was the first time, when we came into this cycle, that there was really no major bug found in the end and they really could go into production. And this was one of the things that then convinced the business as well that we are on the right track and it really brings value.
Rudolf [00:20:30] Maybe one input from my side on what David says. Before, this software was installed in our systems, and then they immediately started with manual testing and did manual testing for weeks and so on. In the meanwhile, when they install the software, the installation no longer takes six weeks; it is faster. And then we immediately run our test automation and verify with an automated smoke test suite that the system is ready, so that we can go forward with manual tests. And the nice thing is that this test suite is developed together with the vendor. It is not only implemented by us; this is a real collaboration between us and the vendor, which is a great thing. They take care of the test automation and we take care of the test operation. And yeah, my heart is smiling.
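As a rough illustration of the flow David and Rudolf describe, the sketch below chains the automated deployment of the vendor's artifact, the jointly owned acceptance suite acting as the handshake, and the smoke suite that gates manual testing. Every command, path, and script name here is an assumption for illustration, not RBI's or the vendor's actual tooling.

```python
# Hypothetical sketch of the vendor-to-bank pipeline flow described above:
# deploy the vendor's artifact, run the jointly owned acceptance suite as the
# "handshake", then gate manual testing behind an automated smoke suite.
import subprocess
import sys

def run_suite(name: str, command: list[str]) -> bool:
    """Run a test suite as a subprocess and report pass/fail."""
    print(f"Running {name} ...")
    passed = subprocess.run(command).returncode == 0
    print(f"{name}: {'PASSED' if passed else 'FAILED'}")
    return passed

def pipeline(vendor_artifact: str) -> None:
    # 1. Automated deployment of the artifact the vendor delivered
    #    (deploy.sh is a placeholder for whatever deployment tooling exists).
    if subprocess.run(["./deploy.sh", vendor_artifact]).returncode != 0:
        sys.exit("Deployment failed; stop the pipeline.")

    # 2. Acceptance suite: test cases written on the vendor side and the bank
    #    side act as the handshake before the release moves on.
    if not run_suite("acceptance suite", ["pytest", "tests/acceptance"]):
        sys.exit("Handshake failed; hand the release back to the vendor.")

    # 3. Smoke suite: only when this passes is the system declared ready for
    #    the remaining manual/business acceptance testing.
    if not run_suite("smoke suite", ["pytest", "tests/smoke"]):
        sys.exit("Environment not ready; do not start manual testing.")

    print("System ready: manual acceptance testing can begin.")

if __name__ == "__main__":
    pipeline(sys.argv[1] if len(sys.argv) > 1 else "vendor-release.zip")
```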
Joe [00:21:32] That's awesome. So talking about your heart smiling and automation. You know, when I speak to people, I always assume that everyone has automation in place. How many of your teams had automation? Was it a struggle to get people to move some of their processes into an automated pipeline, or were there any learnings you had working with all these teams to get them up to speed with automation? Do you think people should have to get to maybe a minimal amount of automation as a kind of transformation starting point?
David [00:21:58] We started on a very low level of test automation.
Joe [00:22:01] Okay.
David [00:22:01] It was really not very common. It was used more in UI-oriented products, but everywhere else it was manual testing, and we are still discussing this whole idea even today. One of the big reasons for that is that we had this classic project approach and project management. And when you have a project that delivers at the end of the year, after a long project, there is no need for test automation. The project managers always said, “Why? It will be tested only once, and some parts maybe twice. I don't need automation for it.” And now we have switched to product development. That is a product that is continuously operated, maintained, and developed. You have a much higher release rate, and you go crazy with manual testing because you have to test continuously. And this costs a lot of money. And I think this switch from the project approach to really DevOps and product thinking gave us a huge boost in test automation.
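David's point about the economics can be made concrete with a rough break-even sketch. The figures below are invented purely for illustration; they are not numbers from the episode or from RBI.

```python
# Illustrative break-even sketch: when does automating a regression suite pay off?
# All effort figures are invented assumptions, not numbers from the episode.
manual_regression_cost = 10   # person-days to run the regression manually, once
automation_build_cost = 60    # person-days to automate that regression suite
automation_run_cost = 0.5     # person-days per release to run and maintain it

def cumulative_cost(releases: int, automated: bool) -> float:
    """Total testing effort after a given number of releases."""
    if automated:
        return automation_build_cost + releases * automation_run_cost
    return releases * manual_regression_cost

# A classic project with one release per year: manual testing looks cheaper.
print(cumulative_cost(1, automated=False), cumulative_cost(1, automated=True))    # 10 vs 60.5

# A product releasing every two weeks (26 releases a year): automation wins fast.
print(cumulative_cost(26, automated=False), cumulative_cost(26, automated=True))  # 260 vs 73.0
```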
Rudolf [00:23:08] So two years ago, before we started with the product teams, we had so-called capability bases: one capability base was software development and one of them was software test. And if a project needed testers, they went to the capability base for testers, but more or less not for test automation engineers, because most of the (unintelligible) was done in a manual way. And after the project was done and the manual tester had run all these test executions, they went back into their capability base and moved on to the next project. And now, with these product teams, the teams have to take care that the software works, and therefore they recognize that test automation is a major thing in their (unintelligible).
Joe [00:24:07] Okay Rudolf and David, before we go, is there one piece of actionable advice you can give to someone to help them with their transformations using the Agile engineering maturity model? And what's the best way to find or contact you all?
Rudolf [00:24:20] Take care that you bring this mindset to the people, and then take care of education, take care of upskilling. You cannot learn all these things in passing, by the way. So one important thing is upskilling, and test automation is a craft, and you have to learn it.
Joe [00:24:41] Absolutely.
David [00:24:41] From my side, I have already emphasized it a couple of times: if you're using maturity tools like this, you shouldn't have only measurability and figures and management and all this stuff in mind. Yes, especially in big companies it's necessary. You need it. You need some way of tracking the progress. But this is only one side, and the other side is this contact with the team, this communication, this continuous communication. You have to earn the trust of the teams. You have to have role models to foster exchange. And this is really the hard part: the mindset change and the cultural change. But without it, you will not be successful. You will get fake engineering and fake numbers, and you'll think that you are doing something, but in the end you are doing the same thing you did for the last ten, twenty years.
Rate and Review TestGuild
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.