
It’s Automation, Not Automagic [PODCAST]

By Test Guild

Welcome to Episode 95 of TestTalks. In this episode, we'll discuss all things automation-testing-related. We'll also cover how to avoid common testing potholes with Jim Hazen, a veteran of the software testing trenches and a highly sought-after testing consultant and speaker. Jim shares some deep automation insight in this episode, so you won't want to miss it.



I don't often have the opportunity to speak with someone who has been involved in test automation longer than I have. Jim has more than twenty-five years of experience testing applications on the PC and Web platforms. There's not much he hasn't seen, and in this episode he reveals the tips and tricks he has discovered while working on hundreds of successful test automation projects. I think Jim summed up this episode perfectly when he said, "I just want to help people avoid testing potholes and making the same mistakes I did."

Listen to the Audio

In this episode, you'll discover:

  • Some of the biggest testing and automation misconceptions that have been in existence since the very beginning
  • Why codeless automation solutions are not always “codeless”
  • Whether programming skills are needed for today's testing and automation needs
  • Tips to improve your test automation efforts
  • Why a really good testing or automation practice can usually be applied across any tool or language

"I just want to help people try and avoid the potholes and not make the same mistakes I did." ~Jim Hazen #AvoidAutomationPotholes

Join the Conversation

My favorite part of doing these podcasts is participating in the conversations they provoke. Each week, I pull out one question that I'd like to get your thoughts on.

This week, it is this:

Question: What are your thoughts on code-less automation solutions? Share your answer in the comments below.

Want to Test Talk?

If you have a question, comment, thought or concern, you can do so by clicking here. I'd love to hear from you.

How to Get Promoted on the Show and Increase Your Karma

Subscribe to the show in iTunes and give us a rating and review. Make sure you put your real name and website in the text of the review itself. We will definitely mention you on this show.

We are also on Stitcher.com, so if you prefer Stitcher, please subscribe there.

Read the Full Transcript

Joe: Hey, Jim. Welcome to Test Talks.

 

Jim: Hi, Joe.

 

Joe: Awesome to have you on the show. Today, I'd like to talk about all your experience with test automation. Before we do, could you just tell us a little bit more about yourself?

 

Jim: Sure. I started off in the industry back in 1987 as a programmer and was doing some work on the side for a roommate's company doing some testing, and nine months later I'm getting hired in to do testing full time … been doing it ever since. I was brought in to help kick-start their test group at that company. Another guy and I started it up, and then a couple years later I left there and went to another company and did the same thing and built up the testing function there. I've been doing this for quite a long time. I got lucky, and around 1991 I got a chance to work with some of the first-generation test tools. That helped me get back into doing some programming. I've been doing it since.

 

Joe: Awesome. You have a lot of experience, and I'm just curious to know, since 1991, is there anything that is still the same in automation that you see over and over again that you wish had changed, that you've been fighting … or not fighting with, but trying to help fix since then?

 

Jim: Yeah, and that's what I've been doing with my speaking tours, trying to fix some of the misconceptions that have been around since the beginning. A lot of times we have people that just jump in blindly and don't do some of the front end work or they're put up against misconceptions and false expectations that are difficult to overcome. There's still a lot of that going on. Everybody thinks grab the tool and just start scripting and you're all good. That's not true.

 

Joe: Absolutely. It's kind of funny … not funny, but one of your recent comments on LinkedIn was talking about code-less automation. I was just curious to get your view on that, because every time I hear code-less automation, it makes my skin crawl. I don't know if it's because I'm a little bit older and I think of solutions that didn't work in the past. Maybe there's some new magic technology that works, but what are your thoughts around code-less automation technology?

 

Jim: Good question. The thing is, with some of the people and vendors that are putting that out, I think they're coming out with some good technology. They want to do things to help users get into doing this type of work, make it easier to do, and help try and [inaudible 00:02:24], but the way that they're marketing and selling it, in my opinion, just based on my experiences over the years, is incorrect. They're helping to add to the misconceptions and false expectations. What I would like the vendors to realize is: guys, the way you're selling it isn't sending the right message. We need to send the right message first, so that we have a chance to succeed.

 

Joe: Do you think that message is that it's still code? Eventually you're going to have to customize it somehow, and there's no such thing as just code-less. Because testing is a thinking activity, I would think you would need to … I just don't believe in straight-up code-less. What kind of misconceptions … What do you think vendors oversell when they say code-less technology or code-less automation solutions?

 

Jim: Well, that you don't have to know how to program. As I've put in posts to some of the vendors, in some way, shape, or form you are still programming. You are still writing code, but it's in their tool, in their specific languages. As a part of that, you've got to understand how to construct it correctly, otherwise you're writing things that are not going to be reusable, are not going to be maintainable, and then you're right back to square one, to the problem that they're trying to solve, which is making things reusable, making it easier to maintain. You're right, at some point you're going to have to do some custom work in there to get it to behave and do exactly what you need. It's not just record, playback. It goes back to the record-playback problem that was created years ago. As you said, it kind of magically happens. That leads into my favorite saying, which is, "It's automation. Not automagic."

 

Joe: I definitely agree. Speaking about making tests maintainable and reliable, what do you recommend … Are there some pillars that you always recommend to everyone that's doing automation? Here are the five things, or four things, you could do to make your tests more reliable, more maintainable?

 

Jim: Yeah, what I try and do is always break the problem down. I do the front-end work to say, what's the technology I'm working with? How does the tool interact with it? Then I look at the application and say, "What do I have to interact with?" It could be at the GUI level. It could be at the services level. I try and map those things out first, so that I can see whether there are any common areas or common functions or processes I need to use, and then build things according to that, and those are my building blocks to work off of. I kind of bake in the reuse and ease of maintenance, because with the first-generation test tools … I mean, yeah, they were record, playback. To go in and maintain them and then try to break things down was a real nightmare. A couple of other vendors came along with more of a programmatic approach. Because of my development background when I first started out, I could start doing that again and say, okay, here's my main test.

 

Here are all these other things that are functions that I can reuse. I went forward with that. That piece of it is how you dissect the application and what you're trying to do with it, the system under test. Then look at the data. How can you externalize that out? You don't want to embed the data within the test, because then the test itself isn't reusable. You want to try and break things apart. When I was in college, one of my professors told me to think of programming as eating a loaf of bread. You can't eat it all in one shot. You have to slice it up and eat it slice by slice.
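
To make the idea of reusable building blocks plus externalized data a little more concrete, here is a minimal sketch, assuming Python and Selenium WebDriver rather than the commercial tools Jim mentions. The CSV file, URL, and element locators are hypothetical; the point is only that the steps live in reusable functions and the data lives outside the test.

```python
# Minimal sketch: reusable building blocks driven by external test data.
# Hypothetical file name, URL, and element IDs; Selenium WebDriver assumed.
import csv

from selenium import webdriver
from selenium.webdriver.common.by import By


def login(driver, username, password):
    """Reusable building block: drive the login screen."""
    driver.find_element(By.ID, "username").send_keys(username)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "submit").click()


def load_test_data(path):
    """Keep the data outside the test so the same script stays reusable."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def test_login_scenarios():
    driver = webdriver.Chrome()
    try:
        for row in load_test_data("login_data.csv"):      # hypothetical data file
            driver.get("https://example.test/login")       # hypothetical URL
            login(driver, row["username"], row["password"])
            assert row["expected_message"] in driver.page_source
    finally:
        driver.quit()
```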

 

Joe: You brought up a few times that you have a programming background, you were a developer previously, and one of the main questions I always get asked about automation is, how much programming does someone need to know?

 

Jim: The basic concepts of programming, which are logic structures, how to write functions, and, if you can, methods that are reusable. Again, taking the problem and breaking it down, so being able to build libraries of common code, either for interacting with the application or for specific things you need to do within the application … understanding data types and logic structures and how to put that stuff together so you have a functioning program. As you can see, we write testware to drive the testing of software. Just those basic concepts … You don't have to be a high-end Java or C# guy, but you need some good, solid skills. Then there's the type of applications you're working with; specifically, I deal with a lot of things at the higher level, such as the GUI side and the services side.

 

I want to understand what I'm working with, so if I've got an application that's a webpage, and it's using something like Google Web Toolkit, I've got to understand how the tool's going to interact with those objects, so I need to understand how I can get to that object, get at its methods and properties properly, and then work with them. You also need to be able to understand how to dissect an application and see not deep down into it, but below that top layer of the surface, so you can understand here's how I need to hook it and work with it. Now, a lot of the concepts we're talking about relate to any language, any test tool.
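
As a rough illustration of getting at an object's methods and properties, here is a hedged Selenium sketch; the URL, locator, and widget are hypothetical. GWT-style widgets often render as generic divs with generated classes or IDs, so the tool has to hook the rendered object first, and only then can it query its state or drive it.

```python
# Rough sketch: hook a rendered widget, then inspect and drive it.
# Hypothetical URL and locator; Selenium WebDriver assumed.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/app")                      # hypothetical app URL

# GWT-style widgets often render as plain divs with generated classes/IDs.
widget = driver.find_element(By.CSS_SELECTOR, "div.gwt-Button#saveBtn")

print(widget.get_attribute("class"))                        # read a property
print(widget.is_displayed(), widget.is_enabled())           # query its state
widget.click()                                              # drive its behavior

driver.quit()
```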

 

Joe: I’m just curious to know, do you have any particular favorite languages or test tools that you currently use?

 

Jim: Yeah, I have to admit I'm a little bit … How do I put it? I am biased from the standpoint that I've been working with commercial tools for a long, long time. I've worked with multiple different tools over the years. I tend to stay with that, and the programming languages are going to be things like VBScript and C-type languages, because that's my core background. Some of the other tools that are out there now in open source … I mean, they're good tools. They're really maturing nicely and coming along. I've just stuck within the niche of working with clients that are working with the commercial tools.

 

Joe: Awesome. Yeah, I also started with the commercial tools, but then I started kind of getting more into open source. I see the benefits of both. I guess it all depends on what your team is using and what your situation calls for, really.

 

Jim: Yeah, and I mean I know in the next year or so I'm going to be working on something with Selenium or something else that's open source like that, because I have tended to stick mainly with the Windows platform: desktop applications, client-server, things like that, and web. Now, with mobile and some of the other stuff coming up, I'll definitely expand my skill set and work with that, but the principles of implementing tools and getting projects going are transportable across tools. To me, these are just tools in different languages that you learn, and then it's all the other stuff around that that really drives how you do it.

 

Joe: That's also a great point. I just want to make sure that people are aware that any concept … any good practice for test automation, if it's really a good practice, can usually be applied across any tool or language. I definitely agree with that.

 

Jim: Yeah, because I believe in not reinventing the wheel. It's like if you have something that's pretty solid and works and it's transportable, use it. Don't go and continually keep reinventing the wheel, because it's a waste of time. I mean, in the line of work I do with the contracting and consulting, I'm brought in to bring these things up, get them going, train people, and then I'm back out the door. I've got to have something that's very transportable and quick to implement.

 

Joe: Once again, one of my pet peeves is, for some reason, some automation engineers are just into creating their own frameworks. There are so many frameworks, especially if you do open source testing, that are already out there that people could be leveraging, and they're not. It just drives me crazy. I agree with you there, absolutely.

 

Jim: Exactly. I've been working with the same framework for the last six or seven years. The architecture is the same. I've transported it between two or three different commercial tools. The constructs are the same … the code, too, because these tools support VBScript, which is what I work in. I just go in there, change some of the internal function calls that are specific to the tool, and I'm up and running.
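
A minimal sketch of that same-architecture, swap-the-tool-calls idea, assuming a thin adapter layer; this is Python rather than the VBScript Jim works in, and the class, method, and locator names are hypothetical. The framework-level steps call the adapter, and only the adapter knows which tool sits underneath.

```python
# Sketch: test steps talk to an adapter; only the adapter is tool-specific.
# Hypothetical class, method, and locator names.
from selenium import webdriver
from selenium.webdriver.common.by import By


class ToolAdapter:
    """Framework-facing interface; one subclass per tool."""
    def click(self, locator): ...
    def type_text(self, locator, text): ...
    def read_text(self, locator): ...


class SeleniumAdapter(ToolAdapter):
    """The only code that changes when the underlying tool changes."""
    def __init__(self, driver):
        self.driver = driver

    def click(self, locator):
        self.driver.find_element(*locator).click()

    def type_text(self, locator, text):
        self.driver.find_element(*locator).send_keys(text)

    def read_text(self, locator):
        return self.driver.find_element(*locator).text


def submit_order(ui, order_id):
    """Framework-level step: unchanged no matter which adapter is plugged in."""
    ui.type_text((By.ID, "orderId"), order_id)
    ui.click((By.ID, "submit"))
    return ui.read_text((By.ID, "status"))


if __name__ == "__main__":
    driver = webdriver.Chrome()
    try:
        print(submit_order(SeleniumAdapter(driver), "12345"))
    finally:
        driver.quit()
```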

 

Joe: You mentioned a few times that you mainly do consulting. I love talking to consultants because I think you see multiple organizations doing multiple things, and as someone who's been working at the same company for almost 10 years now, I kind of get blinders on. I'm just curious to know, how do you go into a new situation and evaluate what their current needs are and what kind of tools will meet those needs? Are there certain criteria you use to help educate management to make sure they're on the right track and don't pick up misconceptions? How do you deal with automation misconceptions at new organizations that you're consulting for, I guess?

 

Jim: Well, you just hit it right on the head. When you go into a project and to a new client, you need to sit down and talk to them. It's not just talking to the testers. It's talking to the developers also. If it's internal, you want to talk to the users [inaudible 00:11:44] and things, but also talk to the management, so you get their expectations and perceptions of what it is that they want the project to do. If you do see some things right off the bat that are misconceptions and false expectations, you need to start the process right then and there of correcting those. Before you even write a line of code, you're doing the front-end social work and getting things straightened up, so that when you do come along later on and say, hey look, here's how we're going to do this, they're more receptive to it. You're not saying, "Well, so-and-so said this in this article," or, "The tool vendor sales guy said this." Once you get into that situation, it's tough to get them to turn around and basically wake up and realize that's not the reality of the situation. It's going in and, like I said, doing the front-end social work to find out what you have in front of you, who you need to talk to, and what problems you need to fix right from the beginning.

 

Joe: Awesome. Now, I know you're going to be presenting at STPCon this year. I believe one of your sessions is called Demystifying the Test Automation Pyramid. I find that a lot of teams struggle with this test automation pyramid concept. I think a lot of people get it wrong. What is the session about at STPCon this year?

 

Jim: I'm going to try and take the concept of the test automation pyramid and start breaking it down and get back to the original roots of what Mike Cohn was trying to do with it, because when he came out with that, people jumped onto it. It kind of got a life of its own. Then later on Mike put out another blog post, after it was in the book, saying, no, this is kind of what I meant. This is what's going on. Again, there's some life-of-its-own type situation. The principle … I mean, admittedly, I met Mike Cohn years ago. He's here in the same area I am. I met him years ago during a local group meeting for software testing. We were talking about testing in relation to Agile. This was even before the [inaudible 00:13:42] came out, but he talked to me a little bit about automation and what I thought. I don't know if that influenced it or not, but he talked about it from his standpoint.

 

He wanted developers to start doing more of their own work and start doing unit testing, so that they could deliver more stable, more reliable code to the test group and test people further down the line, so that they wouldn't find what I like to call stupid bugs, you know, things in code that the developers should have found if they had just done some cursory testing before handing it off to the test guys and saying go for it, and the system breaks for some stupid reason. He also wanted to do it at a level where they could really prove out the code and make it more robust. By natural factors, your level of granularity, or as some other people would say [inaudible 00:14:28], the atomic level of the test is more focused. It's more directed, a 1-to-1 relationship. Naturally, as you go through the code, you're going to produce more tests. That's why you have that wide base at the unit test level on the bottom of the pyramid. Then you're building on top of that the services layer, and then at the top, the UI layer, or what some people call the acceptance test level.

 

By natural processes, yeah, it's going to fall out that way. There's that. Then the percentages, some other people threw those in and said, well, this is what we think the percentages are. The guy admitted it was something he just kind of threw out there. I want to go back and try and correct some of those things, and say, hey look, yeah, it's a building process. It's layers and things like that, but the way I look at it, it's not really a triangle in itself or a pyramid. It's more like concentric circles of growth around a core.

 

Joe: Can you expand on that concept a little bit more?

 

Jim: The concentric circles?

 

Joe: Yes.

 

Jim: Okay, I'm going to get myself in trouble on this one. Well, the reason I say that is, I guess a good analogy is to think of the earth itself. You have a core there. Everything is built on top of it. The core itself is a certain size, and then you have the next layer up, which is, again, a percentage of the size of the thing, and then on the outside you have this thin layer going around the whole thing. It's building out from the center. That's kind of what I'm trying to put on it. I'm not trying to create something new, but just give people a different way of thinking about it.

 

Joe: Right, so this is how I understand it. Maybe it's just, once again, my situation. It drives me nuts, but I definitely agree that there should be more unit tests, or at least faster tests. What I've seen over and over again is people always go for that quick eye candy of the end-to-end automation test. There are so many other things that could be addressed better below the layers, or at that core center. If they were building up the core center more, they wouldn't have the need for this higher, thinner layer, which is now really thick with GUI tests, because they'd have a nice core of lower-level tests that are fast, quick, reliable, and more appropriate, rather than just creating an end-to-end test because the developers didn't put effort into actually doing unit tests. I don't know, is that what you're getting at with this concept?

 

Jim: Yes and no.

 

Joe: Okay.

 

Jim: I can agree with where you're going with that. Like I said, the focus and granularity of the lower-level tests have a different purpose. Now, do we need to have all of our tests be those high-end functional business rule, end-to-end type tests? No, because what people have a bad tendency to do is to put everything and the kitchen sink into those types of tests. They're trying to do too much in them instead of making them more directed, with purpose. They're trying to pull the trigger on the shotgun and hit everything on the target. When you get to that level, you need to design your tests and focus them on what you're really trying to prove. With the GUI tests, if you've got unit tests and the services level in place, you just want to run tests to make sure that at that top level, what the user sees and where you'd expect interaction to happen, it happens correctly or not.

 

You're not trying to test a field to see whether it's bounded by a length of 64 characters and only handles alphanumeric characters, and things like that. That's unit test level. Unit tests can do that. Services tests can do that. If I enter data into one part of the application and it gets saved to the database, and I go to another part of it, it should come up and be presented the same. Yeah, that's a directed end-to-end type test. You're validating something that's going on there from that perspective. Now, if you go in and have that same test also test every field and every permutation and combination and things … No, that's a bunch of bloat. It makes the tests very fragile and unmaintainable.
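
A hedged sketch of that split, with a hypothetical field rule and a hypothetical GUI driver fixture: boundary and permutation checks live at the unit level where they are cheap and fast, and the GUI level keeps one directed round-trip check.

```python
# Sketch: exhaustive field rules at unit level, one directed round trip at GUI level.
# validate_comment and the `ui` driver fixture are hypothetical.
def validate_comment(text: str) -> bool:
    """Hypothetical application rule: alphanumeric, max 64 characters."""
    return len(text) <= 64 and text.isalnum()


def test_comment_field_rules_unit_level():
    # Boundaries and permutations belong here, where they run fast.
    assert validate_comment("a" * 64)
    assert not validate_comment("a" * 65)
    assert not validate_comment("hello world!")   # space and punctuation rejected


def test_saved_comment_round_trip_gui_level(ui):
    # One directed end-to-end check: enter data on one screen and confirm it
    # comes back the same on another. `ui` is a hypothetical driver fixture.
    ui.open("/comments/new")
    ui.type_text("comment", "hello123")
    ui.click("save")
    ui.open("/comments/latest")
    assert ui.read_text("comment") == "hello123"
```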

 

Joe: Is there anything you see becoming more and more important in the coming years? Do you think there are any technologies or techniques that are really up and coming, that are going to become more critical for testers to know, and that people should get behind now?

 

Jim: Yeah, I would say definitely mobile is taking off, because now we're seeing a real increase in the speed of everything going to a mobile platform. The applications are becoming more sophisticated. The interfaces are more sophisticated. It's not like it was 10 years ago with your cellphone, where you had a text-based interface. Now you have full GUI-type interfaces on that LCD screen on your cell phone. Just like I experienced years ago going from DOS-based applications over to the Windows and OS/2 world, it's a big jump. There are a lot of other factors you need to take into account in how you work with it.

 

As a part of that, I was just talking to somebody the other day about how the problems we had years ago with DOS and early versions of Windows, as far as version compatibility, configuration, things like that, are happening all over again. What's old is new again on mobile and platforms like that … it's just that the speed and demand are so much greater. I thought it was crazy back then. Nowadays, it's just totally insane. Understanding how to do configuration and compatibility testing and then learning how to work with those different platforms, that's really what's coming up. Some people say desktop computers are going to be dead in the next couple years, laptops a couple years after that, and then within the next seven to ten years everything will be on a mobile-type device, which wouldn't surprise me.

 

Joe: Absolutely. I’m looking forward to the virtual reality revolution.

 

Jim: If Microsoft has its way with their virtual-reality stuff, yeah, we're all going to be walking around with goggles on, and while we're trying to avoid somebody on the escalator, we'll be working on a spreadsheet.

 

Joe: That brings up an interesting concept. How would you test something like that? That's not a traditional user interface. Do you think it's more of an API level test? Have you ever thought about that?

 

Jim: Something like that, yeah, would need to be, because up until recently, and you've talked about this because you've talked to the guys at [inaudible 00:20:50] Tools, a lot of the test tools aren't designed to handle that type of interaction with a system. A lot of test tools are blind. What I mean by that is they can't see differences the way the human eye can in how a screen is presented, or the color differences and resolution, things like that. A lot of the first-generation tools tried to do that, and they couldn't. They failed miserably because there was just too much variation there, and the logic in the tools wasn't robust enough to handle it. We started working with things at the object layer. Now we're coming back around to more visual-type things. These interfaces nowadays, especially on mobile, are a lot more visual. They still have that underlying layer there that you can hook into. It's a combination of having to do things more from the API and services layer, and then also using some of the tools that can do that visual work. Hopefully they'll improve a lot over the next couple years. That's definitely something that's coming up.

 

Joe: Performance testing is something that I think people are still almost old school about, where we have more agile practices going on, trying to test more often, quicker. I don’t see performance testing moving as fast as that. Have you seen anything in performance testing that you recommend people do nowadays that may have been different than what we have done maybe 10, 15 years ago?

 

Jim: In relation to working within an Agile environment, you need to be involved early on. You need to sit down and talk to people and say, okay, what are you trying to get this application to do? What are the service level agreements you need to meet? Is this something as far as transactional load? Is this something as far as the UI must respond within a specific period of time? It's a lot of the same things, but again, you have to shift to the standpoint of how we can get in earlier on. Instead of trying to do the big, huge performance test, you break it down into smaller pieces, work with it, and then build it up.
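
As a small illustration of breaking it down and building it up, here is a minimal sketch that times one transaction against a service-level target before any full load test; the URL, sample count, and two-second threshold are hypothetical.

```python
# Sketch: check one transaction against an SLA target before scaling up.
# Hypothetical URL and threshold; standard library only.
import time
import urllib.request


def timed_request(url):
    """Return the wall-clock seconds for a single request/response."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start


def test_search_meets_sla():
    samples = [timed_request("https://example.test/api/search?q=test") for _ in range(10)]
    average = sum(samples) / len(samples)
    assert average < 2.0, f"average response {average:.2f}s exceeds the 2s target"
```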

 

Joe: Awesome, definitely agree. I think more companies need to start doing this, because it's too late once you have a finished product. I like how you said you should start as early as possible, even doing little proofs of concept with different architectures, maybe before they even decide on what architecture to use. That's what I've seen a lot of companies be really successful with … performance testing that early in the development lifecycle.

 

Jim: Correct. That's what I'm experiencing now with a current project. They were talking about performance testing, and I kind of stuck my nose in and said, well hey guys, we need to start thinking about this now. As a part of that, we've done some performance tests specifically to see if the hardware and the architecture are going to do what we want and scale properly. I recently wrapped up a part of that, and because we did that work, we were able to go to senior management and say, hey, here's what's going on. This is what we're seeing. This is how we're going to improve upon it. They were just like, that's great. Instead of waiting until the last minute and finding out the thing's going to fall over on its face, we're being a lot more proactive, and that's definitely making the money people a lot happier.

 

Joe: Jim, are there any books or resources you would recommend to someone to help them learn software testing, automation, or performance testing that you really think are go-to resources?

 

Jim: Yeah, I mean, I have to go back to the books I started with. Definitely Cem Kaner's book … I think it's … Now I'm forgetting the title … Testing Computer Software. Then the book by James Bach and Cem Kaner where they're doing the comparison. Again, I apologize. I'm spacing out on the name. The automation books that I used and started to read and learn from years ago were the books by Dorothy Graham and the other one by Elfriede Dustin. One of Dorothy Graham's more recent books, with Mark Fewster, is Experiences of Test Automation. That's a really solid book because it does a lot of comparison and gives people ideas.

 

Just in general, Google is your friend. When I started out, like I said, back in the late '80s and early '90s with all that stuff, trying to find information was next to impossible. I used to have to go to the local technical bookstores and sit in there for a couple hours looking through things and saying, hey, this looks like good information. Then as the internet started to happen, doing searches and talking to people, and then in the last ten years, I mean, just the explosion of information … It's really been great. The converse of that is there's so much information, it's how you sort through it, because a lot of people are repeating the same things. It becomes a bit of detective work where you've got to go find those little hidden diamonds in the rough.

 

Joe: Awesome. I definitely agree. It seems so simple, but just Google it. I'm shocked by the amount of questions I get asked sometimes. I get asked some really technical questions about technology I have no clue about. I'll Google it. I'll find the answer and I'll give it to them. They could have done the same thing. All right, Jim, as I mentioned earlier, you are going to be presenting at STPCon. I believe it's the first week of April.

 

Jim: Yeah.

 

Joe: What are the sessions? I believe you have two sessions. One is Demystifying The Test Automation Pyramid. What's the other session about?

 

Jim: I'll be doing a half-day workshop called Practical Implementation of Test Automation. I know that's a mouthful, but some of the things we've been talking about as part of this discussion I'll cover in that, so it's the front-end work, going and finding the right people to talk to, fixing misconceptions, and getting people up to speed on things. It also has some hands-on work where, working with a tool, I'll show them the beginnings of the hybrid framework approach, so they'll be able to take that base code and expand upon it for the projects they want, or use it as a model.

 

Joe: Awesome. I believe the syllabus is online. I'll have links to all this in the show notes, so if anyone wants to learn more about exactly what you're going to be presenting there, I think it's really worthwhile to go check it out, catch you in person, and pick your brain. I think that's a great opportunity.

 

Jim: Yeah, and that's why I do these conferences. Years ago, when I started out, I went to some conferences and was looking to talk to people who had experience. I'm trying to pay back some of that now. As it says in my LinkedIn profile, I'm a veteran of the software testing trenches, but I also tell people, yeah, I have the scars on my back to prove it.

 

Joe: I’m getting there. I’m not too far behind you, so I definitely understand that point of view for sure.

 

Jim: Yeah, because I just want to help people try and avoid the potholes and not make the same mistakes I did. As I said earlier, there's a lot of pressure to get things done. Don't reinvent the wheel. If you find something that works and you can manipulate it to do what you want, go for it. Keep moving forward. There's a lot of stuff out there. You just have to learn to find what you need, put it together, and get it to do what you need.

 

Joe: Before we go, is there one piece of actionable advice you could give someone to help them improve their test automation efforts, and can you let us know the best way to find or contact you?

 

Jim: The one word of advice I can give to people to help them with their test automation is to use the computer between their ears first. Use your brain. As far as contacting me, I do have a LinkedIn profile. People can contact me through that. I usually respond pretty quickly because that gets routed to one of my personal e-mails.

 
