How to Improve Your Testing Efficiency

By Test Guild

As testers, we may be working very hard (and really late!) on our testing efforts. But are those efforts as efficient as they can be?

Are there ways for you to improve your testing coverage and save time? According to Melissa Tondi, the answer is yes.

Areas to Improve Your Testing Efficiency

I’m sure every development team has areas it can focus on to improve the quality of its product.

For example, Melissa has had the opportunity to build several teams in the last four or five years, and one of the tenets that really stuck with her was efficiency.

She did some deep diving into her QA teams’ practices and thought long and hard before deciding (and then seeing it proven) that some areas were redundant. She also realized that, because of non-existent or strained communication with some of her other project counterparts, they were repeating some actions, which of course was an inefficient way to test and, in turn, deliver software.

What she’s been doing over the last several years is focusing on the day-to-day actions an individual tester takes in approaching both their strategy and eventual test execution. She’s been fine-tuning things that were either redundant or duplicated somewhere to the left or right, and devising a plan and model to quickly assess those inefficiencies. Those adjustments have enabled her team to quickly execute a plan to remove (and hopefully permanently eliminate) those inefficiencies from their overall testing approach.

Signs That Your Team is NOT Efficient

So how can you tell if your team is being inefficient in its testing? A couple of areas Melissa feels are often neglected by testers are understanding unit tests, and understanding what an individual developer might be doing on his or her end to test their work before letting it loose on the testing team.

Some of the first questions Melissa thinks testers should ask their teams, especially if they’re new to the team, are:

• What is the team’s unit testing approach?
• Where are the unit test results?
• Are the unit test results easy to access by anyone on the team?
• How readable are the unit tests?

Once testers are able to look at this information, they can see what has already been tested from a developer's standpoint. They will then be better able to see if there's anything they can do to make their testing more efficient from a tester's standpoint.

Depending on the structure of how those unit tests are written, her team may or may not have the technical aptitude to understand what they're testing; she has found, however, that a collaborative dev and testing relationship is formed by simply asking, “Are you doing this, and can I see the results?”
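
To give a sense of what readable can look like in practice, here is a minimal, hypothetical sketch of a JUnit 5 unit test (JUnit comes up later in the interview); the ShoppingCart class and its methods are made up for illustration, not taken from Melissa's teams. A test like this names the behavior it checks and follows a simple arrange/act/assert shape, so a tester can skim it even without deep coding expertise:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;

class ShoppingCartTest {

    @Test
    @DisplayName("Adding an item increases the cart total by that item's price")
    void addingItemIncreasesTotal() {
        // Arrange: start with an empty cart (ShoppingCart is a hypothetical class under test)
        ShoppingCart cart = new ShoppingCart();

        // Act: add a single item priced at 9.99
        cart.addItem("coffee-mug", 9.99);

        // Assert: the total reflects the added item
        assertEquals(9.99, cart.total(), 0.001);
    }
}
```

When the developers' tests read like this, a tester can tell at a glance which behaviors are already covered and where the testing team's time is better spent.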

That blends into another “red flag” area, where the team itself is either not collocated or not collaborative – although the team’s members don't necessarily have to be sitting right next to each other.

When it comes to distributed teams, there are lots of other tools out there with which members can be more collaborative.

If there isn't a lot of collaboration going on between, say, dev/test and test/product or some of the other pillars of an Agile team, it will ultimately cause some inefficiencies.

“Testers should always be early adopters of tools and technology.” ~ @melissatondi

Has Agile Made Us More or Less Efficient?

Lots of folks out there may be in the process of moving from Waterfall to Agile, or may be new to Agile software testing. I was curious to find out from Melissa whether she feels that Agile has made teams in general more or less efficient.

She responded that she believes it should make testers more efficient, but the reality is that we often get hung up with some of the Agile infrastructure — missing the forest for the trees, or vice versa.

Melissa believes the outcome of Agile should make everyone who is working within that Agile team more efficient.
Ultimately, if you’re using Agile correctly, and really embracing its intent, it will make not only your software testers more efficient, but everyone on the team more effective.

One Piece of Actionable Advice to Make Tests More Efficient

Testers should always strive to be early adopters of tools and technology. We should be continuously upgrading our efforts. If we’re improving ourselves on a personal basis, it will naturally allow us to be a model for others and hopefully, improve our overall SDLC.

If you find yourself repeating a task during test planning, whether it's adding data manually or testing the same type of scenario over and over again, take a few seconds to ask yourself, “Is this something that can be automated?”

Don't let that be the end-all; decide whether or not it can be automated, and where your time will be best spent, assuming it will save time in the long run.
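
As one illustration of that question in practice, a scenario that gets re-run by hand with different inputs can often be turned into a data-driven test. Below is a minimal sketch using JUnit 5 parameterized tests; the DiscountRule class and its threshold values are hypothetical stand-ins for whatever you find yourself re-checking manually:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class DiscountRuleTest {

    // Each CSV row replaces one manual pass through the same scenario with new data.
    @ParameterizedTest(name = "order of {0} gets a {1}% discount")
    @CsvSource({
            "50.00, 0",    // below the threshold: no discount
            "100.00, 5",   // at the first threshold
            "500.00, 10"   // large order: top discount tier
    })
    void discountMatchesOrderTotal(double orderTotal, int expectedPercent) {
        // DiscountRule is a hypothetical class standing in for the code under test.
        DiscountRule rule = new DiscountRule();

        assertEquals(expectedPercent, rule.percentFor(orderTotal));
    }
}
```

Once a repetitive check like this is automated, the time that used to go into re-keying values can go toward exploratory sessions or the areas that never quite get enough attention.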

More Testing Efficiency Awesomeness

Listen to my full Test Talks interview with Melissa Tondi for more testing efficiency awesomeness:

Joe: Hey, Melissa, welcome to TestTalks.

Melissa: Thank you. I'm excited to be chatting with you today.

Joe: Awesome. It's great to have you on the show. So, I'd like to talk about your extensive knowledge and experience with quality engineering, but before we get into it, could you just tell us a little bit more about yourself?

Melissa: Yeah. I have spent the high, high majority of my career, professional career anyway, in what I consider software testing, quality assurance and more recently, quality engineering. I've worked with small, large companies, startups. Recently, in the last seven, eight years or so, I've been extensively speaking and writing on topics that talk about efficient testing practices as well as efficient and productive ways that we can adopt agile methodologies to not only improve our testing and our testing delivery but the overall SDLC. That's where I have been in the last few years of my career and I'm always excited to talk with people who are interested in doing the same things.

Joe: Awesome. I guess the first question would be, what is efficient testing, what do you mean by that?

Melissa: A while ago I'd thought about this when I was building teams, and I've had the great fortune and opportunity to build several teams in the last four, five years, and one of the tenets that really stuck out to me was the term efficiency. I did a lot of deep diving into current testing team practices and really thought hard and long about this and decided, and then soon saw it proven, that there were a lot of areas where we were maybe redundant, or that because of either non-communication or challenged communication between some of our other project counterparts, we were doing things double, triple and sometimes more times than we needed to, which of course became an inefficient way to test and then, therefore, deliver software.

What I've been doing over the last several years is really focusing on the day-to-day actions that an individual tester would take in how they approach their strategy and then eventual test execution. Then really fine-tuning things that were either redundant or duplicated somewhere to the left or somewhere to the right of us, and then really coming up with a plan and a good model to be able to quickly assess those inefficiencies, and then more importantly, quickly execute a plan in order to remove and hopefully completely eliminate those inefficiencies from our overall testing approach.

Joe: Very cool. You've been involved in testing for a while, just like I have, so I'm just curious to get your opinion. Has agile made testers more efficient or less efficient, do you think?

Melissa: It's a good question. I think the implication is that it should make testers more efficient. I think the reality in a lot of ways is that we get hung up with maybe some of the infrastructure of agile, and where maybe we miss the forest for the trees or vice versa. I think the outcome of agile should make everyone that's working within that agile team more efficient. Sometimes I think that might be clouded over by some of the practices of not being successful with agile.

I think ultimately if we're doing agile right and if we're really embracing its intent, it will make not only the software testers more efficient but everyone on the team more fit then.

Joe: I guess my question is then, you've worked with a lot of teams, are there any signs you could tell from a team that maybe they're not being as efficient as you think they could be?

Melissa: Yeah, and it's funny you ask that because I was just talking about this yesterday. Yeah. I think one of the areas that software testers tend to shy away from is understanding and looking at unit tests or understanding what an individual developer might be doing on his or her end to test their work before letting it loose on the testing team. One of those areas is unit tests. Usually, one of the first questions that I ask, either if I'm joining a new team or if we're building a new team, is what the unit test approach is and where those results are and if they're easily able to be accessed by anyone or everyone on the team and how readable those tests are, so that we can start looking at what has already been done from a dev standpoint and see if there's anything that we can again make more efficient from our testing standpoint.

I think that's probably one of those things and again, we tend to shy away from those. Depending on the structure of how those unit tests are written, we may or may not have the technical aptitude to understand what they're testing but I find that a collaborative dev and testing relationship is simply made by asking the question, are you doing this and can I see the results. That blends into one of the other areas where maybe a little bit of a flag is when the team itself is either not collocated or not collaborative, and you don't necessarily have to be sitting right next to each other although that's in a lot of ways desired. In distributed teams, I think that there's a lot of other tools out there where we can be more collaborative. If there isn't a lot of collaboration going on between say, dev and test and test and product and some of the other pillars of the agile team, that will ultimately cause some inefficiencies because we're not able to ask, or those types of test results are not readily available to us.

Joe: I guess this is a tricky question. In order for a tester to be able to understand unit tests, I think we're talking about the same thing. When I think of unit tests, I think of something like JUnit, really at the low level. What skills do you recommend a tester should have in order to be able to have that collaboration with the developer?

Melissa: I think one statement that I've kind of … this is one of our tag lines for teams that I've been on is that anyone in IT or software engineering should always be early adopters of technology and it includes the software testing team. In a lot of ways if we are not adopting early on new and emerging technology to include tools like automation or performance or data tools, I think that's an opportunity that is missed and will eventually be a stopping point or certainly a hindrance to more professional career development from a software tester.

I don't necessarily think that every single software tester should have intimate coding or programming expertise, but we really should understand coding basics to a certain level in order to understand what good code looks like and what the eventual outcome of that code should be. I think having that at least basic understanding of programming languages, especially from the tech stacks that we would be supporting, and having that desire to be early adopters of technology, both new and emerging, is really what makes, in my opinion, a really good quality engineer. I use that term deliberately, but having that natural curiosity and the technical acumen as well as being early adopters of tools and technology, that trifecta is what, in my own opinion, makes a really good software tester or quality engineer.

Joe: Keeping up to speed is very difficult, I think, and a lot of times it's easy to get comfortable in what we're doing. What do you do to stay up to date on the latest technologies to know hey, this is a trend that maybe I should be following?

Melissa: I have my standard onboarding when I, again, if either I'm joining a new team or I'm building a new team. I'm a big believer in not only learning from the company and the teams that you're on on a day to day but at some point in time in your career, especially for those of us that consider ourselves in the mid or later part of our career, that's really when we should be looking to our community outside of our company and project teams to really look at that.

I tap into local meet-ups. There are a lot of meet-ups that are webcasting as well, so not only local meet-ups but national and sometimes international ones. I really enjoy that a lot of the conferences will allow some free viewing of some of the keynote speakers if you're not attending the conference itself. Hearing what some of our thought leaders in our industry are saying about new and emerging technology. Always tapping into those, and these are relatively free or of little cost to the person. It's really just carving off some time out of your day or week to look at and attend those.

In addition to that, of course, there are the standard people that I follow in the industry, people that I consider thought leaders that have created this new and emerging technology then bouncing that out with reading articles, blog posts. There are a lot of really great LinkedIn groups that I will peruse a couple of times a week. I think there's a mixture of things that you can do to fill in the gaps when you're at work to be catching up on that new and innovative technology but there are also things that we can do in person or at least dialing in and taking advantage of what some of the thought leaders outside of our companies and teams are doing and being able to plug that in so being both consumers and hopefully, contributors back to our software testing communities is paramount.

Joe: I really like what you just said about contributing and I think for me that's the best way that I learn. I may not be an expert but by me presenting something, I'm learning and then I get better feedback. I think that's a big point most people don't think about. They're just thinking about consuming and a lot of times I find the benefit is actually not just consuming but actually creating.

Melissa: Absolutely. I think you hit the nail on the head with that. Certainly, because I'm a somewhat frequent speaker at conferences, both national and local, every time I present a session I learn something, because I'm asking for feedback from the attendees and most of the time I'm updating my slides based off of that feedback.

In a lot of ways you would consider that me contributing to the community but I'm also receiving because I'm actually learning as I'm delivering it and something where I may have presented on 10, 12 times I still learn every single time I present that because I'm hearing feedback from others so I absolutely agree with you. One of the best ways to learn something is to teach it or to present it.

Joe: Awesome. I have a link to some of your videos because actually this reminds me of I forgot how you worded it but you started off one of your presentations actually saying something similar to what you just said where you're getting feedback from the audience, you're very interactive with them. I think that was really cool because it wasn't really just a set of PowerPoint slides, it really was you were taking in feedback and on the spot changing your message or modifying it to get your point across, which I thought was really cool.

Melissa: Yeah. Just coming back off of a conference last week, STPCon in Dallas, and then going into STARWEST next week I definitely, I think that's one of the things that I get a lot of benefit from. I always preface. First of all, I make the statements before I speak to just alert the audience that hey, this will be your time to interact with not only me but the people around you as well and I would hope that although I'm the one that's presenting most of the content that I do set it up so that it is very much collaborative and we can have as much dialogue and feedback and questions as the time will allow. I tend to base my content and slides with a great deal of collaboration and back and forth with the attendees in that session.

I think that that's been a really good format for me again going back to the previous statement of learning while you're teaching but also for those that attend those sessions they tend to get a lot more out of it than just what was simply in that abstract for the conference proceedings.

Joe: Awesome. As you mentioned you did speak at STPCon last week and I believe your session was called The Tester's Role: Balancing Technical Acumen and User Advocacy. I hate to put you on the spot but at a high level, what is the CliffsNotes version of that presentation?

Melissa: Yeah. That was one of the sessions that I delivered. I also did a workshop and then I sat on the keynote panel, but this one was something that was near and dear to my heart. I think because it tracks the timeline of when I started my career about 20 plus years ago, where a lot of the testers were users of the product first and then they decided to take a leap into becoming more technical and took a QA role or software tester role.

We thought like users because we were users and as we fast forward that timeline to where we are currently, 20 years later, we're seeing a lot more job descriptions that emphasize heavy, heavy technical or programming or development experience, with the assumption that any developer can be taught the skill or profession of testing. I talked a little bit about that. I used the analogy of a pendulum which by nature always wants to have that equilibrium balance in the center but that there's always that built-in shift from left to right of a pendulum. Really, it's figuring out the best way to balance out the role of a software tester or quality engineer by taking into account not only the technical acumen but also getting back to the roots of the mainstream industry 20 plus years ago where we had to think like a user because we were a user.

Really balancing that there's a place for that technical acumen but not to overbalance that and then there's also a place for that user advocacy by again not overbalancing that either. Always keeping those in check and in balance.

Joe: Now that's a great point. I'll probably edit this up but I guess I'm guilty of this and that is, especially nowadays, I'm so concerned about getting laid off or fired. There is no company loyalty, I don't think, sometimes. It's almost like a catch-22 where you want to learn the application and really know it. I know some excellent testers that were like spot on, knew the applications in and out, and for some reason they're the first ones out, they get laid off. Now I'm like, I don't want to know the application. I want to know all the technical things that I can use if I ever get laid off or fired to take to another job. That transfers over. I know maybe that's cynical but what are your thoughts on that?

Melissa: Yeah. That's a good point and it's so funny. Again, I was having the same conversation with at least two people last week at the conference. I'd say yeah, I mean and we were actually talking about like how do you help manage a person. This was in my leadership roundtable so we were mostly all managers or people who had had some sort of reporting responsibilities with people that reported into them.

It was that fine balance. I mean I think that … for people in larger companies that become product and domain experts, there's a little bit more give and wiggle room for you to create a career based on your product and domain expertise. In smaller companies, like if you have a desire to go into startup mode, certainly, you're going to need to become more of a jack of all trades, where you're going to have to strike that balance more and where that product and domain expertise isn't as emphasized as maybe in a larger company.

I think that by adopting that as technologists we should be early adopters of technology and always having that mindset. I think there's a natural balance that is struck between how much technical expertise do I need and how much product expertise do I need. I really do believe that it's close to a 50-50 split to be honest with you and maybe in some cases it's a 40-60 or 60-40 depending on which way.

I think that it's a pretty good balance between the two and that's in my opinion what I would consider a full breadth of a quality engineer.

Joe: You mentioned you go to a lot of conferences, you speak at a lot of conferences. Another takeaway that I've noticed is a lot of times I get jacked up after a conference. I learn a lot of things and I'm like, yeah, I'm all on fire with testing and new thoughts but when I get back to my organization maybe I lose that steam.

Do you have any tips on how someone can energize and take what they've learned at these conferences or communities and actually inject it into their organizations?

Melissa: What I encourage, depending on how comfortable you are writing, is an informal or more narrative blog post right after. I like to carve off time the first one or two days when I get back into the office after being at that conference and write my thoughts down, and really put a timeline in, saying okay, if I'm coming back to the office on Monday from a conference, by end of day Wednesday I will have at least one blog post written on my favorite session and why. Because it really makes you … The practice of writing, especially in a blog format, lets you synthesize that information more naturally and see it wherever you're writing it, whether you're actually writing it by hand or typing it in.

I encourage that. I also encourage people who may be representing their team or company at a conference to really have a game plan with their manager or group and their company to say, I'm going to target these sessions and here's why. Then really coming back to the office and having that follow-up and saying, hey, how were those sessions? What did you learn? Did it meet your expectations? If so, how, and if not, why, and what would you change? Really having that expectation setting before you go and then what the outcome is going to be when you come back to the office. I've also had people present their summary at a brown bag the week or two after they come back from a conference so that it's fresh in their memory.

There are a couple of ways for you to be creative in doing that, but I get it. I mean I'm always energized at a conference and when you get on the plane and come back and you maybe have a weekend before you get back into the office or something similar, you lose that steam. It's really important to emphasize that, with the energy you experienced the week before at the conference, there is no reason why that energy can't carry you for weeks after you get back.

Joe: Very cool. You were a speaker at STPCon last week so I'm putting you on the spot a little right now. Is there one thing you picked up at STPCon that was new or a new concept that you hadn't thought about or you learned about that you're really excited about?

Melissa: I had the opportunity to sit in on one of Richard Bradshaw's sessions and he made a very, what I would say, very poignant statement, which is that he refuses, and these are his words and I'm paraphrasing of course, but he refuses to use the terms test and automation in the same sentence, and there's a reason for that. I thought it was really … It opened my eyes to how the use of terminology without prefacing your use of that terminology, how we get spun up a lot about that. I think whenever we say automation, whether we say it on our day-to-day team, at our company, at a conference, wherever we are, when we're referring to automation as it pertains to software testing, we assume that it is the automation of test cases in that more traditional manner.

Really, what Richard was emphasizing was that automation is the efficient use of something that was repetitive or time consuming, and so it very well could be the automation of the execution of test cases, but it is not testing. Automation in and of itself is not testing. It's checking, and that's really what it is. To lump everything together when we use the term automation, to assume that it just means the automation of test cases, is really doing automation a disservice to be honest, and I thought that he delivered that statement so well that it really got my mind thinking.

Essentially, even though I just heard him speak just a few business days ago, I really want to take that and adopt it myself, and not lump those two together, and make sure that when we are talking about automation we are differentiating and making sure that automation is not considered testing. It is what it is; it's a checker.

Joe: Very cool. I guess there are little points in that that people may disagree with, but I definitely agree. I spoke with Richard Bradshaw in episode 47 and he brought up the concept of automation in testing. It changes your frame of mind about what automation is. Automation is helping testing but it's not testing per se.

Melissa: Exactly. A few years ago when I started a testing team we came up with this great quote that I try and include in every one of my sessions, and it's: automation does not make humans less essential, it makes them more efficient. That's really what it is and how we really should be looking at it.

Again, going back to my statement of as software testers and technologists we should be early adopters of technology to include tools like automation. I really emphasized those differences because not all tools are meant for automation and not all technology is meant for test automation or testing. I think that if we don't shy away and we embrace technology we will naturally adopt it in ways that will make us more efficient and it really … I like what you said that, what Richard had mentioned, that it really shifts the mindset of how you're approaching something.

I'm always about disrupting ways in how we've traditionally or have thought for a long time and I think that that's where I see me developing both my professional and career development is looking at ways in which to disrupt the traditional way of looking at things. One of those is using the terminology automation for what it is and not necessarily for what we assume or what our perception of it is.

Joe: Great point, and a lot of people sometimes get turned off with automation testing and they get into these little things: well, that's not testing, and you get into these black-and-white scenarios. This really actually is better for automation because it gets you to think of things: what other things can we automate that aren't necessarily testing but are going to help the team? That's where you really get the benefit.

Your CI environments, setting up environments, entering test data; things that help you with testing but aren't necessarily a testing activity. I think it really helps you embrace more technology, like you said, because you suddenly look for other ways to incorporate automation and not get hung up on the testing aspect of it.

Melissa: Exactly, and I think that's where, going back to one of your earlier questions about whether agile makes a tester more efficient, one of the things that agile, or at least teams that do agile well, allows is the blurring of the roles sometimes, where okay, as a team we are responsible for all of these tasks, and some of those may only be able to be done by somebody with programming experience, therefore a developer, and some of them may only be able to be done by a product owner.

There are some things that a tester may be able to assist on, like writing code in some ways or helping pair program with ATDD or something, and so blurring those lines. That removes the silo of well, I'm only a software tester and therefore I only do software testing activities, and the same would hold true for a developer: I'm only a developer and I only write code or I only do things that roll up into my title. Blur those roles a little bit and say, what can I do? What do I have the expertise to do, and how will it ultimately be valuable to not only my discipline of software testing but to the entire project team, to help us be more efficient and productive and continuously improve?

Joe: I want to switch gears a little bit. A few of your videos, a lot of them, were on mobile testing trends. I just want to get back a little bit to trends because, I can't believe it, in three months it's going to be a new year. Are there any other trends besides mobile testing trends that you've been seeing at these conferences that you think testers should know about or get on board with before they really become mainstream?

Melissa: Yeah. I mean I think I'm hearing a lot of AI for sure, artificial intelligence. I think that there are a handful of companies that are doing it well and they're what I would say almost bleeding edge in that which means that for more of companies that are mainstream and more consumers of that technology, we should be paying attention to that.

I also hear a lot about DevOps, and I think we as a community can help with that; maybe we're using that term in a way that's custom to how we feel versus what the intent of it was, and I think a lot of us would say that DevOps is more of a mindset and a frame of mind versus a process or a methodology.

I think by really understanding the intent of what DevOps means and is, we can maybe get ahead of that curve so that we don't find ourselves in some of the situations that some companies are experiencing with agile. If you were to ask 100 representatives from 100 different companies what their definition of agile is, I bet it would be different for each one of those 100 people. Slightly different, not necessarily glaringly different, but I think it behooves us as a community, especially as early adopters, to really understand and do our due diligence when researching a topic, especially before it becomes a buzzword but certainly after it becomes a buzzword, to make sure that we fully understand it.

From an AI standpoint, that's going to be bleeding-edge and then certainly emerging technology that we'll need to make sure we have a good strategy in place for. It's a disruptor much like mobile is and was. But really understanding some of these buzzwords like DevOps and continuous deployment and feedback, and all those types of buzzwords that are coming in, and making sure that we're looking at them before they become a trend and making sure that we really understand their intent.

Joe: Thank you, Melissa. Before we go, is there one piece of actionable advice you can give someone to improve their testing efforts? And let us know the best way to find or contact you.

Melissa: Again, going off of my statement of being early adopters of tools and technology, I mean I think we should always be looking at continuous improvement. If we improve ourselves on a personal basis, that will naturally allow us to be a model for others and hopefully improve our overall SDLC, whatever flavor of SDLC or whatever method we're taking from a company standpoint. I'd say always look at ways: if you find yourself repeating a task, whether it's adding data manually or testing the same type of scenario over and over and over again, take a few seconds and ask yourself, is this something that can be automated? Don't let that be the end-all, but really say, if it can be automated, where will I spend my time, assuming that that will save me time in the long run? Maybe that is you have more exploratory testing sessions, or you focus on a particular area that you have time and time again not been able to really focus on that you really want to.

Really, not just saying can it be automated, but taking it to the next step and saying, once it is automated, what will I do with that newfound time?

I am on LinkedIn, I'm on Twitter. My LinkedIn is just under Melissa Tondi. Twitter also melissatondi, no spaces in there. I've got a blog post up there as well so you can find me pretty much anywhere that any of the social media is going on. I'd love to hear from people because I love to keep the conversation going. As much as I mentioned earlier that I contribute to the community, I really like to hear feedback and I really like to hear what new and emerging trends and approaches that the community is taking and I love to hear from them.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
