Welcome to Episode 88 of TestTalks. In this episode, we'll discuss becoming a next gen tester with the internationally known speaker, leader, writer and tester Mike Lyles. Discover what it takes to succeed with test automation in the 21st century.
If you want to stay employable in these changing times, you can’t be a factory worker tester. In this episode, Mike shares with us why the days of factory workers are over — and how you can become a next gen tester. Discover what it takes to move your testing to a whole new level. Learn how certain brain games can make you a better tester and how practices like visual validation testing can increase your code coverage. Get ready to take notes on testing tools and best practices you can put into place right now.
Listen to the Audio
In this episode, you'll discover:
- How visual validation testing can catch patterns and potential gaps that your current testing efforts might miss.
- Techniques you can use to become a better visual tester.
- How to evaluate your current measurement program for gaps and areas of improvement.
- Suggestions for training, coaching, and collaboration with the testing community that will enable your team to be successful.
- The importance of staying relevant in a fast-growing technical world.
- Much, much more!
#Testing #metrics tell a story, not a number. ~ @mikelyles #TestTalks
Join the Conversation
My favorite part of doing these podcasts is participating in the conversations they provoke. Each week, I pull out one question that I like to get your thoughts on.
This week, it is this:
Question: What testing metrics do you use to tell your project's story? Share your answer in the comments below.
Want to Test Talk?
If you have a question, comment, thought or concern, you can do so by clicking here. I'd love to hear from you.
How to Get Promoted on the Show and Increase your Karma
Subscribe to the show in iTunes and give us a rating and review. Make sure you put your real name and website in the text of the review itself. We will definitely mention you on this show.
We are also on Stitcher.com so if you prefer Stitcher, please subscribe there.
Read the Full Transcript
Joe: Hey Mike, welcome to TestTalks.
Mike: Thanks so much Joe. Thanks for having me.
Joe: It's great to have you on the show. Today I would like to talk about all your experience with testing and automation but to get started could you just tell us a little bit more about yourself?
Mike: I've been in IT for 23 years. I started out as a developer, did that for several years, moved to the PMO, did program management office work for a while, decided that was not for me, and moved back over to development as a manager. I really loved that, loved coaching people and being in a development role, and then as part of one of my training courses in development I got involved in a testing training course and it really changed me; it sparked an interest in me to be part of testing.
My organization at the time did not have a testing group, but I still implemented testing practices within our organization. Then we moved to having a dedicated organization for testing. I moved into that group and was part of the team that built it out from the ground up. After that I held several roles in and related to testing. I did test management, automation, performance, actually led some test environment groups, data management, software configuration.
Right after that I moved into a new role as a test architect for another company for about a year. Where I'm at right now, I'm working with folks leading and managing teams that are involved in automation. We've done some performance work and also just test execution in general.
Joe: Anytime I hear that someone was originally a developer and then got into testing, I have to ask: what drew you to testing? What made you want to get involved more with testing?
Mike: I really like that with testing you can analyze things to the nth level, and I didn't get that opportunity as much with development. I really enjoyed development, but, you know, when I was a kid I took things apart and wanted to see how they worked, and I feel that curiosity led me to the testing role. I really love testing things. I'm constantly testing things whether I'm actually doing it as a job or not. I'm, you know, constantly finding something broken or not working the way that it should, throughout life and especially with apps on our phones and computers now. I consistently catch myself just testing things in life.
Joe: Very cool. As I was stalking you I noticed you're also speaking at STPCon this year.
Mike: Yes, yes.
Joe: You have three sessions. I'd like to touch a little bit on all three sessions, if that's okay?
Mike: Okay, okay.
Joe: The first one is visual testing, and I was actually introduced to you by Moshi from Applitools, so is this visual testing using Applitools, or is it just visual testing in general?
Mike: You know, the idea came to me when a friend of mine and I were talking about Brain Games, the TV show. Brain Games has a lot of interesting episodes, if you've seen it, where they try to trick you. They'll show you a picture, then they'll show you another picture and ask what changed, and you think you know, and then all of a sudden they tell you that the whole time you were watching there was a clown in the background, and you didn't even notice because you were so focused on that one area.
That got me thinking: I think we do that as testers. If we're testing one area, and we've tested that thing so often, then when something new comes along we put our focus over on the new thing and forget that something might have changed in the area we were really familiar with. You get so used to it, it looks similar, and you don't catch the small changes.
I know with Applitools, their tool actually does a lot of that catching of things people's eyes usually don't see, that the naked eye just doesn't catch. My course on visual testing has done really well. I've done it just a couple of times for a couple of conferences, and what I find is that people are really shocked in the class to find that they aren't as good as they think they are. I think it's an eye-opening experience for testers, because once you start feeling confident, that confidence that you can do everything without any error and that you're going to find everything you need to find, that's when you let your guard down and things start happening that you didn't expect.
I think the course has been really, really exciting, and I try to bring in new things every time I do the class at a different location or a different conference. Recently we talked about how just being in a stressful situation makes it very difficult for you to handle what's going on while that stress is being handed to you.
We had some folks try to play some games, including a free solitaire app we downloaded for them, which took some concentration. I had some volunteers I explained the game to while one person was outside. When they came in, I wanted the group to really, you know, be loud and obnoxious, make some noise, turn on their phones, play music, do whatever, while this guy was trying to do it, and then compare him to another guy who was trying to do the same puzzle, the same effort, without any bother at all. What we noticed was that people perform better when they're not under that heavy weight.
We kind of look at it not only from what you see, but also from how the brain works. We spent some time on the neurology. I'm a psychology minor, and I think psychology has a lot to do with testing and how we work in IT. I think it's benefited me along the way.
Joe: Awesome, yeah, really interesting. I definitely agree with you. I think a lot of us have our … everyone has biases, and we all focus on certain things, and if you have a certain tool, sometimes it will highlight things that you never would have even looked at or noticed. I think that's a good point. Are there any other things or techniques you think people could use to help them with their visual validation activities, like recognizing patterns or gaps in the coverage they may have for their tests?
Mike: I think a lot of times folks need to, I guess, step back from what they're working on, spend some time in another area, and come back. I think that might be a way to keep yourself fresh. I've been told a lot of times that if you stare at the screen long enough your eyes will get tired, and sometimes it's best to get up, look around, blink a couple of times, and clear your mind so your eyes will refresh.
I think as testers we get so caught up in, this is exactly the way the screen's going to look, that field is there, I see the field the way it should be, that what we don't recognize is the field may have moved just a slight, you know, centimeter across the screen. Maybe a word has wrapped around, or a word is not the same size or the same font. Maybe that's not a big deal for some applications, but for other applications it might be a very big deal.
I've seen that happen in my own testing. I've seen situations where programs do screen scrapes and look at the actual application you have; maybe a point of sale is depending on things being in the right location and the right area. If a program is looking at it and trying to map the screen, it'll say it's not the same and start crashing, and then you have to start investigating why that happened.
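(Editor's note: here is a minimal sketch of the screen-comparison idea Mike describes, using Python and Pillow rather than any particular commercial tool. The file names are placeholders, and a real visual-validation tool such as Applitools uses much smarter perceptual matching that can ignore expected dynamic regions; this shows only the bare pixel-diff concept.)

```python
# Bare-bones visual check: compare a current screenshot against an approved
# baseline and report whether (and roughly where) anything changed.
# "baseline.png" and "current.png" are hypothetical file names.
from PIL import Image, ImageChops

def visual_diff(baseline_path: str, current_path: str) -> str:
    baseline = Image.open(baseline_path).convert("RGB")
    current = Image.open(current_path).convert("RGB")

    if baseline.size != current.size:
        return f"Layout changed: {baseline.size} vs {current.size}"

    # getbbox() is None when the images are pixel-identical; otherwise it is
    # the bounding box of every pixel that differs.
    changed_region = ImageChops.difference(baseline, current).getbbox()
    if changed_region is None:
        return "No visual change detected"
    return f"Pixels changed inside region {changed_region}"

if __name__ == "__main__":
    print(visual_diff("baseline.png", "current.png"))
```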
I think it's being aware that just because something cosmetically looks okay, there may be things depending on it being in the right position or the right size. I think that makes a big difference, and like I said, I think sometimes it helps just to step away. A lot of times with my writing and articles, I feel it's best to just walk away for a while and come back, because then you're not stuck in the middle of it and you can say, did I really write that? You look at it one more time and say, okay, that wasn't really how I meant that, so it's good to step away sometimes.
Joe: Awesome. So speaking of psychology and almost neuroscience, another topic a lot of people talk about, and one I think can be abused, is metrics. Sometimes if you have a metric, people will gamify it. I know one of your other workshops is based on metrics. Are there any common metrics you think every tester should use or know about?
Mike: That's a tough question. You know, I've spent a lot of time with several people talking about metrics, and I've been speaking on metrics for a long time. When I started, I was new to it, just starting the testing group, and as kids do, kids learn from experience. They touch the hot oven one time and then they learn from that. I think in my situation I definitely was touching the hot oven by trying to come up with all these lists of metrics and looking at, what metrics do I need, what metrics do I not need.
What I realized was, it depends, you know, and it's really more about, what is your story? What are you trying to tell? My course is called Metrics Tell a Story, Not a Number, and that's indeed what I'm trying to say: it's not about reporting a number, or putting out metrics, or trying to meet the norm that everybody else is using. So many people come up with, you know, how many test cases did I run, what percent complete am I, how many failed, how many passed? I mean, those tell you something; I just don't think they tell your stakeholders what they need to know.
I spent a lot of time with Michael Bolton, who graciously gave me as much time as I wanted, more than he should have, on metrics, and also with Paul Holland. Both of them are really passionate about being able to tell the story without getting hung up on the numbers and everybody else's cookie-cutter metrics.
I think it's really about working with your stakeholders. I've done this with several stakeholders I've worked with, and they're pleased by it; they like to hear the story. I heard Keith Klain once say that if a CIO comes to you and says, tell me the status of testing, you're not going to say seven. I've always thought of that: you're not going to just throw a number out there, and yet we still try to do it with our reporting.
I think it's best to give a picture of where testing is. Give them an overall story they can understand, so they can say, now I understand where you are and how things are going. It's tough to say these set metrics are the right ones, you know; people talk about defect removal efficiency and the percentages and how many have I run.
I've looked at test effectiveness, and people trying to say, okay, I want to move the needle, so I want to find defects earlier because they cost less to fix early, and there's a lot of discussion on that. I used to be on that side of the fence, where I would say, you know, find them earlier and they cost less. I'm not really sure I believe in that anymore. I think in my situation, we need to find out which metrics are important to the group.
What are the metrics we're trying to tell the stakeholders, and more importantly, what's behind a metric like, I found all the defects in this cycle in development, or, I found it in early testing and didn't find it in UAT or right before production?
I think the problem with running numbers like that is you end up with a large set of numbers, and someone says, okay, I found 90% of the defects way back in the first cycle of testing, but what they don't tell you is that the other 10% were the most critical defects. The story's lost in that number, because you don't hear that the real big issues weren't found early. We found 90%, but, you know, the 10% were the ones that were critical. I think that's where we get lost in the numbers.
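(Editor's note: a small, hypothetical illustration of the point above. The defect counts are invented; the idea is simply that the same data gives you a reassuring headline number and a very different story once severity is broken out.)

```python
# 100 invented defects: 90 low-severity ones found in the first cycle,
# 10 critical ones found late, right before production.
from collections import Counter

defects = (
    [{"phase": "first cycle", "severity": "low"}] * 90
    + [{"phase": "UAT", "severity": "critical"}] * 10
)

found_early = sum(d["phase"] == "first cycle" for d in defects)
print(f"Found in first cycle: {found_early / len(defects):.0%}")  # 90% -- the headline

late_severities = Counter(d["severity"] for d in defects if d["phase"] != "first cycle")
print(f"Found late, by severity: {dict(late_severities)}")  # {'critical': 10} -- the story
```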
Joe: Such a great point, and I guess … I don't know if you have any tips on how we can convey that message to the managers we report to? The reason I ask is that I've worked with a lot of companies, a lot of groups, and I know one group that has a metric of automated tests versus manual.
It came down from on high, and this guy wasn't even a tester, that they had to hit 80% automation coverage, and it is the worst metric in the world, because people just start creating really stupid tests that bump up their automation number. It doesn't make any sense, yet they have this dashboard, across all the teams, with this number, and it's like you said, seven. It's seven, oh, it's not 80, you're in trouble, what's wrong? So how do you change that? Or how do you help facilitate a better understanding of what metrics are really trying to do? Maybe you have just 20% coverage, but it's 20% of everything that's risky. That's probably better than 80%, I would think, so I don't know what your thoughts are on that.
Mike: You can send me in two directions here. I'll go down the first one, with the automation. I worked for a company for a short period of time that was so focused on automation numbers: I want to get 2,000 scripts written, and whatever it takes, it has to be done by this date.
It got to the point, exactly like the situation you described, that the automation folks started writing scripts and splitting things out, splitting test cases in half and making two of them, so they could meet the quota and get to that goal. I kept saying to the management over us, when we get done with this, what does success look like to us? Is success that we met a number we were trying to reach? Because I feel like that's all we were trying to reach. And exactly, that's what it was.
We didn't have a successful delivery of the product. We just met a number and met a goal within that organization. We didn't improve the overall product. We didn't show any value. You know, with automation, I have become a believer in the discussions about automation. Automation has a place within testing, a heavy place, but it doesn't replace testing. In fact, there's a lot of discussion that it's not testing, it's checking, and that it's about doing things for people so that they can do the more hands-on, exploratory, in-depth testing that you might not be able to automate.
What I look for automation to do is this: if I measure automation, it's not how many test cases I wrote or how many scripts we executed this week, which I've had to do and report. It's more about how much time my tester got to spend digging further into another path of testing that he wouldn't have gotten to do had he had to run the piece that automation is now supporting. It's taking that time off his plate so he can go do the other work. Did we double up his work? Did we get 200% of the execution work done because of that? I think that's where most organizations fail.
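(Editor's note: a quick sketch of the kind of measure Mike is describing here, time handed back to testers rather than script counts. All the numbers below are hypothetical.)

```python
# If automation runs the regression suite for us, the interesting number is the
# tester time freed up for exploratory work, not how many scripts executed.
automated_cases = 400          # regression cases now run by automation
manual_minutes_per_case = 6    # rough time to execute one case by hand
runs_this_week = 3             # automation ran the full suite three times

hours_freed = automated_cases * manual_minutes_per_case * runs_this_week / 60
print(f"Tester hours freed for exploratory testing this week: {hours_freed:.0f}")
```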
With that, I'll segue to the second answer. I think with metrics it's very difficult, and in the course I'm teaching at STP in April, I want to bring that out to the folks and explain, you know, how do you handle that? My boss says I've got to run my numbers. I've got to give my percent complete, my number of tests executed, how many passed, how many failed, my automation coverage, my automation numbers, percent of automation, all of that. How do you change that? It's very similar to when you feed a child: the first time you feed them macaroni, they don't know if they like macaroni. It looks strange to them. Once you touch it to their lips and they taste it, they're like, oh, this is great, and they eat it. I've had to do that twice.
I think that's what you have to do. You can't just go to your boss and say, you will eat this, you will take these metrics, because I've listened at conferences, I've listened in trainings, and I know how it should be, this is how we should do it. When you do that, they're going to go, absolutely not, you're not putting that in my face and I'm not going to eat it.
If you slowly show them the benefit, show them how it works, show them, you know, hey, how about we add this to our reporting next week, then they start seeing it. All of a sudden one stakeholder says, that's an excellent number, I like that better than your 2%, 10%, 8 of 10, all of that. Then you start seeing your management saying, okay, bring it in, bring in more.
Joe: I definitely agree; that is so frustrating. I think I am that guy that goes, I talk to everyone every day, people from Microsoft, I speak with, you know, Mike Lyles, and they're all saying this, and you're wrong, and this is why we should do it. Then they kind of tune me out. I definitely see your point there, but it's hard.
Mike: Yes, yes. You get labeled as someone who listens to all the big names, and at first it kind of irritated me, but after that, you know, I have a lot of respect for the folks that I've met. I've been very fortunate to meet some really good people in testing who have really taken time out. I feel that in testing, more than development, I've met people who will just take some time out if you reach out to them. It doesn't matter where they are on the ladder, they will take the time to talk to you. I've appreciated all the folks who have done that for me.
Joe: I agree. It's a great community, and like you said, for some reason, testers, I guess everyone is, but testers by nature tend to be really curious, and they're always learning, and by learning they usually like to share, in my experience.
Mike: Yes, yes.
Joe: I guess a few more points here on metrics. I don't want to keep bashing metrics, but it's just such an area where, you know, I was speaking to Todd Gardner in episode 79, and he mentioned he's been on projects that had 100% test automation coverage and still failed. The reason why is no one ever talked to the customer to say, “Hey, would you actually buy this if we built it?” I just guess it's a common thing that you must encounter all the time when you're giving this workshop, these types of stories. Do you have anything similar that you've encountered or heard?
Mike: I find that the room is almost always divided. You'll have people in there who are automation engineers. Actually, it's divided three ways. You have people in there who are automation engineers, hard-core automation people, who feel that their product is the savior to “manual testing”.
Then you'll have folks who are saying, you know, I don't need automation to test for me; automation is checking for me, it's helping me, it's building in regression, doing the regression testing mostly and some other checks for me, while I'm able to do exploratory testing, context-driven testing, other practices that help me dig deeper into unscripted areas of the application, where I can really experiment with things and, as James Bach says, go galumphing, move around and just look around the application and try to find spots where there are problems. That's something automation just won't do for you. You can't build that in.
Then you've got the people in the middle, and those are the ones you really want to help pick a side. Those are the people who say, okay, you know, I believe automation is helping me check the applications, it's really helpful to my organization, but I also believe I have to do a lot of my manual integration and work as well.
I think those folks are more open to hearing what you have to coach and teach them. Thinking back to your question, you know, how do you help them with that in a metrics class? What I find is folks are really unsure how they're going to tell their story with metrics. They're unsure how they can send their message.
What I always try to do is give them some steps for how to go to their management, and the senior management above them, and explain, here's why we need it this way. It's really more about coaching them on how to tell the story, because I tell them, if you came here looking for me to give you the top 10 metrics to use, you won't get that, because I don't know your organization and I don't know, you know, the critical things that you need to be telling your stakeholders.
One thing that's important to one stakeholder may be different for another. I can tell you the best tactics and methods for how to get in there, get their attention, and show them the value.
Joe: Awesome. I definitely agree. You know, I love test automation, I'm all about test automation, but this 80-20 rule of having 80% automated actually backfires on them, like you said, because the people in the sprint only focus on their automation, and no one actually pulls up the application and looks at it and tries things and thinks of things. That's the most important piece: if this is a medical application, you'd better, you know, think of things and try it as a real user and try your edge cases, or we're going to be in trouble. But this metric almost drives bad behavior even though it may look good on paper. Metrics can almost be dangerous sometimes.
Mike: You know, it's funny that you say that, because one of the things I bring out in my class is how metrics change people. The reason I say that is, we had a metric one time that was put into place early and forced on me, where we were going to track how many test cases are written per day by a tester and how many are executed per day. It was a requirement in the group I was in. It was given to me: your team will write this many test cases per day, they will execute this many test cases per day. For example, 25 per day was what was expected of them.
I've always taught this in my metrics class: if you measure people with metrics, you're immediately failing, because now you're changing their behavior because of that metric. They will change. People will change if they know they're being measured like that.
We had a situation with an application that required a third party to be integrated. The third party needed to be up. 25 tests were sitting there; the tester went and ran test number one, and it came back as a failure, but it was because the third-party application was not up and running. They logged that as a defect: you know, application not running, I can't get to it, it failed.
They knew that the next 24 were going to fail because the application was down. They couldn't get to it, but they knew they had to get their 25, so they went and ran all 25 tests and logged the defects. That was my way of going to management and saying, do you now see why I'm telling you that using metrics against our team, against people, is not going to work?
That slowly faded away and became a non-metric for our group.
Joe: That's a great example, and it actually … that almost breaks people down, where they're just like, I'm just going to do this metric, and it's just so frustrating. I could go on and on about metrics, but I think we've covered enough for now. Let's switch gears. Your last workshop is the one you're giving at STP, it's called STPCon, right?
Mike: Yes.
Joe: It's … Leading Next Generation Software Testers?
Mike: Yes.
Joe: Just by that title alone, and I didn't even read the bullet points, what do you mean by that? What does this workshop cover?
Mike: You know, one of the things that I saw, and I don't remember where I got it, but most of my classes and courses hit me when I'm watching a TV show or seeing a movie. Brain Games really got me with visual testing. The test manager survey and some courses I did on that years ago came from seeing the different types of test managers and test management professionals that were out there at a conference.
With leading the next generation of testers, it really got into my mind when someone was presenting, on a TV show or somewhere I was looking, about how millennials are infiltrating the workforce, and within so many more years, you know, the baby boomers and Gen X and the different age groups and generations are going to be moving out and millennials will be taking over.
They were talking on the news channel about how, you know, you have to treat millennials differently, not that it's a bad thing, it's just that generations behave differently. I started doing some research on how you work across generations. It was more about looking at generations in general than just testing, but I did blend that into the course. I have some tables in the course where I say, okay, if you have a team and you've got, you know, baby boomers working on your team, this is what is going to motivate them. For the most part, you know, studies have shown, this is what your millennials are going to be motivated by.
I saw a stat that really stood out to me and made me laugh. It said that almost 50% of millennials said they would rather give up their sense of smell than an item of technology. That blew my mind. Then I thought, I would probably not give up my iPhone if someone told me I had to give it away, so I can understand that.
I looked at, you know, how that differs: if you've got a team of folks working with you, and three people are from the millennial generation, one person is a baby boomer, and one person is from another generation, how do you manage them differently when you're one on one with them, and how do you integrate those teams together? I really wanted to find out. I did some interviews and a lot of research on this, and I've got a lot of that in the training course itself.
Then I also wanted to coach some of the people there on this: you've been in testing, but if you're leading and managing, if you're a manager in testing now, it may have been years since you were actually hands-on in testing. You may not have been there when we were doing mobile testing or web testing or some of the new things that are out there now. Your team may be doing things that you didn't have to do. It's good to keep up with that and understand where technology's going and where testing has to go to keep up with it.
Then most importantly, the one thing I would stress there is, I think we've changed as the world has changed and testing has changed. It used to be that if you were in testing, you had a test manager, that guy didn't put his hands on the application or that lady didn't put her hands on the application. Now, I think with where we are today, to be a successful leader and a manager in a testing group, I think you have to understand what your team's doing and be able to put your hands in there with it and be part of that, you know, hands-on work.
I've really had a lot of fun with that course. I've taught it just once, and I'm doing it again with STP this time as a larger workshop in April. I'm going to do more on how to build the team, how to do team building, how to make sure your team gels together. I'm really excited about what we're going to do there to coach people on how to be leaders within their organization. It'll be more of a leadership course, but it still has a lot of coaching on what I learned as a tester and a manager of testing teams, and here's what I can help you with and where you can go to keep yourself growing and learning.
Joe: Awesome. Great stuff there. You know, I'm so tempted to go off on my millennial rant, but I'll leave that alone. One point you brought up that I think is interesting, and I don't know if you agree with this, but I've noticed lately with more Agile teams there are higher developer-to-tester ratios. They usually have something like one tester to six or seven developers, so I think a tester almost needs more leadership, coaching, and collaboration skills. They're almost shepherding a bunch of developers through a sprint to get quality software out the door. Is that something you cover or something you would agree with?
Mike: You know, that's a great point, and I'm glad you brought it up, because I think it's something that we see. A lot of folks will say, Agile is not going to change the way I do testing, but it should, because in an Agile methodology world you've got smaller releases, you've got folks really cranking out a lot of different releases. Whereas back in the old waterfall days, you know, you waited until development was done, then they handed it over, then testing did their work, and we tested it until it was ready to go, and then it was released.
With Agile there are so many things going on over and over and over. You want to be more integrated with that team, and that communication and collaboration with the team is more critical now, I think, than it has ever been in the history of testing. When I did my test management evaluation, I had almost 300 people take a survey, and as part of that survey I asked, you know, what's the most important skill for test managers. The first one, obviously, was test management skills, but number two was communication.
I think that was because of just being able to blend in and work with the development team. Then you've got to get that development group more closely aligned with how you're doing your work. You end up having developers doing some testing, and you end up having testers working more closely with the development team.
I saw five trends to look for in 2016; I think SmartBear had sent that out. They talked about how, you know, with testers and developers working together, developers will be more involved in testing now, and that we should be looking to have our testing tools be more developer friendly. And I'm seeing that. I see tools that are now advertising, you know, your developers can use this as well; I had that happen to me just a month ago. I think that lends itself to where Agile is allowing us to be, but I think it's beyond Agile. I don't think it matters whether you're an Agile organization or not; as an organization you really need to be working with your development team so that they understand what you're trying to do and you understand what they're trying to do. I think that collaboration is more critical now than I've ever seen it be.
Joe: Awesome, and I definitely agree with you. I definitely see that trend also, with testers and developers using the same ecosystem, same tools, same IDE, same languages, just to help facilitate the collaboration you get when everyone's using the same tools, and there's less context switching than when they're going back and forth between different languages and tools.
Mike: Yes.
Joe: I usually ask, are there any books or resources you'd recommend to someone to help them with their testing efforts? I guess one would definitely be to check out your workshops at STPCon. I believe that's in April?
Mike: Yes. First week of April.
Joe: Awesome. Also, as I was stalking you I noticed that you're working on a leadership book. Are some of these topics something you're going to be covering in that book? And when will that book be available?
Mike: I am. You know, I decided back in 2002 that I really wanted to write a book on leadership, and there are so many of them that it's very difficult to figure out how you're going to be different, because you need to have that one thing that's different, that gets their attention.
I've got a lot of things … I've read an author named Jeffrey Fox, who has a book called How to Become CEO. Excellent book if you've not read it; it's a very short book, and I've actually met him and interviewed him for an article a couple of years back. What he did is, his book wasn't a book you had to read cover to cover. You know, you could go to chapter twenty or chapter five or chapter thirty and just jump around, because it was like little blogs within the book.
I started putting together a lot of notes, and I felt it would really be helpful to people to have it that way. That's what I've done. I've got all my notes, which were scattered, compiled online now, but I've got to get them put together in some form and get them published this year.
That's my goal, to get it out there, because I think a lot of folks can benefit from hearing a lot of that coaching advice and mentoring. It's more than managing and leading; it's about, you know, how you work well with people and how you improve yourself constantly.
Joe: Okay, Mike, before we go, is there one piece of actionable advice you can give someone to improve their software testing efforts? And let us know the best way to find or contact you.
Mike: I'm reachable on LinkedIn and Twitter. I even have a Facebook page, the Mike Lyles page, out there. The advice I would give to folks is, don't be a factory worker: that person who goes to work, punches the clock, and does the job. There's nothing wrong with factory workers, my dad was one, but don't be that person in testing who goes in, punches the clock, does the work, and then punches out and goes home and doesn't think about work because they don't have to.
Be the person who goes to work every day and does the work, but then when you're home at night or spending your weekend at home, do that research, go out and read blogs, follow the right people on Twitter who are talking about testing daily. Do weekend testing events. Go to uTest and practice with them. Learn these tools, take the free downloads of the tools so that you can actually get hands-on, because almost everybody that makes these tools gives you a free trial version. You can get experience right there, hands-on.
There's so much material out there. There's so much training, so many things you can do, and you could really have your own in-home education in testing and really be good. If anyone ever needs advice on that, I would be glad to tell them the things I've learned along the way, the paths I've taken and had to back out of to go another way.
I've learned a lot just by studying, going to conferences, integrating with the right people at the conferences, getting to know the right names. Like I said, those people are going to be very helpful so I think just constant education in testing makes us really good at what we're doing. There's so much that we can reach out and use.