About This Episode:
In today's episode, we have a special treat for you. The brilliant Greg Paskal, a software testing and automation expert, joins us again for his ninth appearance on the show. His wealth of knowledge and experience always brings valuable insights to our conversations.
This time, we delve into a wide range of topics, starting with the importance of skill in development, automation, and testing work. Greg explores the role of AI as a tool and its limitations, cautioning against relying solely on AI to solve all problems. Greg also raises concerns about the quality of content AI consumes for training.
We also discuss the shifts in the current job market and the challenges highly skilled individuals face. Greg offers advice on staying positive and proactive during unemployment, including improving skills and learning new things. Listen up!
Exclusive Sponsor
Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.
We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.
Take the first step towards transforming your future and our community's. Check out our done-for-you awareness, lead generation, and demand packages, and let's explore the awesome possibilities together.
About Greg Paskal
Greg is a natural innovator, pioneering new approaches across Quality Assurance and Test Automation landscapes. Greg enjoys mentoring others to excel as craftsmen in manual and automated testing.
Author of Test Automation in the Real World and countless technical publications, Greg can be heard on the TestTalks podcast with Joe Colantonio. Creator of METS, the Minimal Essential Testing Strategy, Greg’s approach is recognized and taught by ASTQB as an effective manual testing strategy.
Speaker at numerous conferences, including the Automation Guild, StarEast, StarWest, Pacific Northwest Software Quality Conference, and QA Trailblazers. Greg founded the Open Test Technology Forum, encouraging collaboration and focusing on greater quality across the SDLC. Learn more about his work at CraftOfTesting.com, METSTesting.com, RealWorldTestAutomation.com, and RecognizeAnother.com.
Connect with Greg Paskal
- Company: www.testingintherealworld
- LinkedIn: www.gregpaskal
- Twitter: www.GregPaskal
- Github: www.gregpaskal
Rate and Review TestGuild
Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.
[00:00:25] Joe Colantonio Hey, it's Joe, and welcome to another episode of the Test Guild Automation Podcast. Today, joining us once again, we have Greg Paskal. It's his 9th time joining us. The reason why he's on the show: he's an expert, and we love having him. He always brings a lot of insight, and today we're going to catch up on what he's been up to with software testing and automation, and some key takeaways. He's recently attended and spoken at some testing conferences, which he'll share. And I think he spoke about the seven fundamentals of a successful test team, which we'll probably dive into as well. If you don't know him or you haven't been listening to the show for long, Greg is a natural innovator, pioneering new approaches across quality assurance and test automation landscapes, and he mentors others across the board. He also excels at craftsmanship in manual and automated testing. He's the author of Test Automation in the Real World, which I will have links to in the show notes, along with countless technical publications. Greg can be heard on previous Test Guild podcasts, and he is the creator of METS, the Minimal Essential Testing Strategy, which we've covered in previous episodes as well. He's also spoken at numerous conferences, including the Automation Guild, StarEast, StarWest, the Pacific Northwest Software Quality Conference, QA Trailblazers, and a bunch more. He also founded the Open Test Technology Forum, where he encourages collaboration focused on greater quality across the software development lifecycle, which he's one of the experts in. You can learn more about him at craftoftesting.com, and I'll have a link to that, which also has links to all his other websites and resources. Also, Greg may be available for hire, even joining a team.
If you're looking for someone who's an expert in software testing and test automation, or someone to drive initiatives like that, definitely check him out as well. I'll have a link there to contact him for more information also. But anyway, let's get to the show. You don't want to miss it. Check it out.
[00:02:25] This episode of the TestGuild Automation Podcast is sponsored by the Test Guild. Test Guild offers amazing partnership plans that cater to your brand awareness, lead generation, and thought leadership goals to get your products and services in front of your ideal target audience. Our satisfied clients rave about the results they've seen from partnering with us from boosted event attendance to impressive ROI. Visit our website and let's talk about how Test Guild could take your brand to the next level. Head on over to TestGuild.info and let's talk.
[00:03:01] Hey, Greg, welcome back to The Guild.
[00:03:05] Greg Paskal Hey, Joe. It's good to see you again. It's been a little bit. I always love spending some time with you.
[00:03:10] Joe Colantonio Awesome. Same here. We love having you. Like I said, I think you're the only one that's been on the show nine times now. This is the 9th time, which is crazy. Thank you for that. And giving back to the community by doing things like this over a long time span is really a credit to all that you've been doing for everyone. Thank you.
[00:03:26] Greg Paskal Love to do it.
[00:03:28] Joe Colantonio All right, Greg, so let me start off by asking: what's going on? What's new? You're usually doing different things. What's the current thing you're focusing on, and how are things going for you?
[00:03:40] Greg Paskal Well, I'm really in between work right now, and I'm finding the market is really tough out there. But I look at that as an opportunity. So I've been diving into IoT work, I've been learning C++, and I designed my first PC board and had it produced, and I'm really loving it. I'm jumping into VS Code a little bit, so I'm learning how to use that. And one of the newest, coolest technologies is GitHub Copilot. Have you heard of this thing?
[00:04:09] Joe Colantonio I have. Yeah.
[00:04:10] Greg Paskal And oh my goodness! I'm one of those people that kind of needs to see it firsthand, because there's always hype out there about this stuff. And so I wanted to give it a first go-around. And I'm really impressed by it. So that's a lot of what I've been up to.
[00:04:25] Joe Colantonio Copilot, AI. Let's get right into it.
[00:04:29] Greg Paskal Yeah.
[00:04:30] Joe Colantonio You've been around in the industry for a while like I have, and you've seen things come and go, hype and not hype. You just mentioned GitHub Copilot and that you needed to see it in action. Is that how you feel across the board about AI? What are your thoughts on AI and where we're heading now with ChatGPT and all this madness?
[00:04:46] Greg Paskal Well, I'm sure a lot of your listeners know who Jason Arbon is. Jason and I are really good friends. And man, I just had the incredible experience at the Pacific Northwest Software Quality Conference of spending about five hours with Jason one evening, and we just sat and talked. If any of you have ever sat with Jason, the way this guy thinks about things is incredible. I always love it when I'm at a Star conference or whatever; we tend to meet up at a conference, and that's exactly what happened. To me, he's kind of the go-to guy when it comes to AI work in testing, and so I had a lot of questions for him. I went to his seminar and we talked through it, and I said, come on, help me understand, I'm trying to separate the hype from what's real. And he gave me some good things to think about. Some of the aspects of AI and neural networks are so complex, they're really hard to get your arms around. He admitted that we're not fully sure exactly how some of this actually still works. But when I came home, I decided, well, I'm one of those people. I like the idea that a human is involved in the work of testing. None of us want to feel like our work is in jeopardy, and I do believe humans have a very important part to play in this. I tend to be kind of like, "Ah! AI! Stay away!" But in the end, I also need to be objective enough to hear what's real there, because there's a lot of hype on this right now. And that's what led me to get into it. After I spent a little time, I wrote Jason and said, here's my initial takeaway after spending ten hours with this tool, and maybe your viewers would be interested in this too: AI, at least the pieces I've seen so far in something like Copilot, is not for a novice. It's something a skilled developer or maybe an automation engineer uses along with their existing skills. It gives some good coaching.
There were some things I had to tweak, not a whole lot, but enough that I recognized it didn't understand the scope of what I was trying to build. It was trying to lead me down the wrong path, and I had to make that course correction. I think that's important to know. Some folks might see this as kind of a magical wizard, a crystal ball where code is going to come out, but I don't think that's really going to be the case. It's still going to take people who have skills in their development, automation, and testing work to evaluate: is the advice I'm getting good, and how is it assisting me to go forward? Think about a carpenter, right? If we went to my garage right now, we'd find I have about five different hammers. I've got a nice finishing hammer, and I've got one that I use when I'm really doing demo work or taking things apart. And they both achieve success in what they do. I think AI is going to be like that. You will use this tool to fulfill things that might be very helpful to you, but will it solve all your problems? Probably not. You're going to have to direct and guide it. But where you'll be amazed is where it's able to aggregate other information it's found and provide it as a hint of what you might want to do next. That's the big win I found in it. I like it.
[00:08:00] Joe Colantonio I totally agree. I use it all the time, more like you said, like an assistant or a sounding board, or as someone described it to me before, like a friend. Sometimes it's good advice, sometimes it's bad advice. It's up to you to say, okay, I don't know if I agree with that one. Or, I didn't think of that; it's not exactly what I was thinking, but let me go down that rabbit hole and pair it with my own thought. So it's almost like a collaborative kind of session with it.
[00:08:25] Greg Paskal It's a great way to put it.
[00:08:26] Joe Colantonio Yeah. What's your main concern there? Something Jason told me once that really caught my attention was: hey, if this is going to start generating code, what do you need to do? If you have more and more code, you need to test it. So there's probably going to be a need for more testers. I know he has a testing background, and that's why he has that angle. Any thoughts on that?
[00:08:46] Greg Paskal I think from a testing perspective, or let's jump back even further to an AI perspective. My question to Jason was, this thing is consuming content; obviously it's not dreaming this up completely on its own, not yet at least. But if it's consuming content, and I think about sites like StackOverflow, honestly not one of my favorite sites because you're going to get all kinds of advice on there that's good, bad, and ugly. If it's using that as a baseline, is it presenting not such a great baseline for us to move forward from? We talked through that a little bit, and I think he agrees there's this reality that we can't be oblivious going into it. Some people will be, and they will probably produce things that you wouldn't want to trust your credit card information to, or your Social Security number, or anything that would be really critical data. And I still believe that, because it's obviously taking what I'm submitting and doing some sort of processing on it, it's probably using external networks and storing some of that information so it can grow its knowledge base. So that's kind of my takeaway right now. Have you ever heard of the Gartner, what do they call it? The Gartner Hype Cycle, Joe?
[00:10:04] Joe Colantonio Yeah. Yup.
[00:10:05] Greg Paskal Wow. This thing is fascinating. At the conference, they talked about it a lot. If I could explain it to your viewers, it's this classic curve: the peak up here is really tight, and then it's like Schlitterbahn, it really goes down and you're just on this roller coaster ride. Right now, AI is right at that hype peak. As a quality engineer, that's good to know, because you need to be aware of what's a hyped trend right now. And if you don't think AI is hyped, you're naive; it's totally there, guys. You have to understand that everybody thinks this is going to do a whole lot of things, but a reality will eventually hit of what it can actually do. That's normal; all technologies go through it. Self-driving cars are on a different part of the curve right now. So if you don't know the Gartner Hype Cycle, it's really worth looking up, because I think it'll give you a good tool as a test engineer for being aware of where the tech coming through actually stands.
[00:11:04] Joe Colantonio Absolutely. I'll try to have a link for that as well. A great, great resource there also. And I see your concerns about what it's been trained on. I've been seeing some newer tools come out that use AI like ChatGPT behind the scenes, or other technology, but give a citation for where they found the information. So you can say, okay, show me where you got it, and there's a link, so you can make sure it's not making things up or hallucinating, which is really helpful. Or you can say, oh, this is garbage, because it's citing a source I know isn't actually credible.
[00:11:31] Greg Paskal Yes, of course.
[00:11:33] Joe Colantonio Now, you mentioned you were at the latest Pacific Northwest Software Quality Conference in Portland. I believe you gave a talk on some of the fundamentals of a successful test team. We don't have time to get into all of them here, but are there any key ones you want to point out before we dive into some other topics?
[00:11:51] Greg Paskal Well, the first fundamental I talk about in that class is language. If we don't have a common language we speak as a team, then we're going to have trouble. I gave the example at the conference of the words bug and defect. Now, where I was employed previously, those two words meant two different things, and I think that's trouble. For the majority of the quality community, those terms don't mean two different things; they mean the same thing. One is the slang for the other. So I'm in this position where I believe it's important to identify the terms the team uses. In this case, they had decided to make one of those terms mean something separate from what's actually taught in the ISTQB content, which I really love. And that introduced risk, ironically, because if you hire a new quality engineer who comes in and begins to use that term the way most of the industry does, you're not talking apples to apples anymore. So the first tip I usually talk through is a common language as a team. Maybe you go over the basics; a good example is positive, negative, and boundary testing. These are fundamentals all the test engineers I work with should know well enough that they could give an elevator speech on them and teach them to anybody. It's almost a requirement. But how often do we hear about things like happy path testing? I referred to this in my class. I don't like that term, because happy path really only leaves one other metaphor, which is sad path. And ironically, most defects aren't found at either of those ends. Believe it or not, they're found in the middle, and that's why we have boundary-testing approaches that are taught. When was the last time you heard someone talk about melancholy testing, which would be the only way you could almost make the metaphor work?
And this is a case where language introduces the potential of a risk into the lifecycle through the words that we use. I got to spend some time with our development engineers and introduce the terms we use. They really liked the idea. They could see how important it was that we standardize on good terms, and the metaphors matter, right? Joe, you and I know we speak in these weird pictures all the time so that we can help teach a concept. So ensure that as a teacher, you're using a metaphor that works well to teach a concept and doesn't limit it. Language matters, and that's one of the first tips I talk about. I get into automation and some other things too: how to pick good automation candidates, how to build some beginning test plans, and prioritization. I've taught this class for quite a few years now. It's always well-received, and it was well-received at the conference as well.
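To make those fundamentals concrete, here is a small runnable sketch of positive, negative, and boundary tests. The `is_valid_age` validator and its 18-to-65 rule are hypothetical examples of mine, not from Greg's class; the point is how the boundary cases probe the edges where, as he notes, most defects actually live:

```python
def is_valid_age(age):
    """Hypothetical validator under test: accepts integers 18 through 65."""
    return isinstance(age, int) and 18 <= age <= 65

# Positive test: a typical in-range value should pass.
assert is_valid_age(30)

# Negative tests: clearly invalid input should fail.
assert not is_valid_age("thirty")
assert not is_valid_age(-5)

# Boundary tests: exercise the values on and just outside each edge,
# since that's where defects tend to cluster.
assert is_valid_age(18) and is_valid_age(65)          # on the boundaries
assert not is_valid_age(17) and not is_valid_age(66)  # just outside
```

Happy-path testing alone would only cover the first assertion; the boundary pairs at 17/18 and 65/66 are what catch off-by-one mistakes in the middle.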
[00:14:34] Joe Colantonio This is what I like. You told me what it was about, but what were some takeaways people came up to you with afterward? Did anyone say, oh, wow, I didn't think of this, or this is something I'm going to implement when I get back? Anything that really resonated with them that maybe you weren't expecting, or that more than one person highlighted?
[00:14:53] Greg Paskal Yeah, a really common one that came up here as well is what to automate. I talked about the idea of not putting numbers to automation. How often have you heard, we're going to automate 80%, or whatever? I don't think that's a good approach going forward. You should automate the things that are the right things to automate. If that's 80% or 90% or 20%, that's the right answer. It's not a numbers game when it comes to automation. We talked about test planning, and I introduced the METS concept, the Minimal Essential Testing Strategy. We've talked about that on your show before, and for folks that are interested, you can go to metstesting.com; I've got a whole site on it. That is a test planning process that leads you to build prioritization into your tests. You identify a number of categories based on the functionality of an application, and then ask what would be the critical, high, medium, and low things that you would test. The way I coach folks when it comes to automating is to consider the critical tests you've identified first as your automation candidates, and you might even find some there that still aren't good candidates. But start there and move through your criticals and highs, and you may find that your mediums and lows just aren't critical enough to automate, because you have to maintain whatever you build, right? For whatever reason, when you give that strategy to test engineers, they seem to breathe a sigh of relief. At least they've got a plan and a strategy now, and it's not, well, I'll automate 80% of everything. For those quality managers out there using those numbers, I really want to encourage you to step back from that. It's kind of madness to just pick a number when those may not be the right things to automate, and now you've added overhead and maintenance. Anything you automate, you'll need to maintain going forward.
So you put that effort somewhere more worthwhile, or maybe you automate more than that number because it makes good sense. That was one that definitely came back, and it's come back many times, Joe. People find it kind of eye-opening.
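As a rough illustration of the triage Greg describes, you could tag each test with a METS-style priority and pull automation candidates from the critical end first. The category/priority labels follow what he outlines, but the code, names, and sample data are my own sketch, not official METS material:

```python
from dataclasses import dataclass

# Priority ladder, most important first (labels per the METS discussion).
PRIORITIES = ["critical", "high", "medium", "low"]

@dataclass
class TestCase:
    name: str
    category: str   # functional area of the application
    priority: str   # one of PRIORITIES

def automation_candidates(tests, max_priority="high"):
    """Return tests at or above the given priority, most critical first.

    Mediums and lows are deliberately left out by default: anything
    automated must also be maintained.
    """
    cutoff = PRIORITIES.index(max_priority)
    picked = [t for t in tests if PRIORITIES.index(t.priority) <= cutoff]
    return sorted(picked, key=lambda t: PRIORITIES.index(t.priority))

# Hypothetical sample data for illustration.
tests = [
    TestCase("checkout completes", "payments", "critical"),
    TestCase("login succeeds", "auth", "critical"),
    TestCase("profile photo upload", "account", "medium"),
    TestCase("password reset email", "auth", "high"),
]

for t in automation_candidates(tests):
    print(t.name)
```

The key design point mirrors Greg's advice: the cutoff is driven by priority, not by an arbitrary "automate 80%" quota, so the medium-priority upload test simply never enters the candidate list unless you widen `max_priority` on purpose.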
[00:16:51] Joe Colantonio This might sound weird, but do you see working at big companies as a disadvantage? I don't know why this came to my mind. You have so much breadth and depth that when you go against what people are saying should be done, and you say that's wrong, it almost seems like people with your experience come across as a liability, rather than, we should listen to this guy, he knows what he's talking about. As opposed to just following orders and saying, oh, I automated 10,000 tests today, and they're like, oh, look at this guy, he automated 10,000 tests. Stupid stuff like that. Does that make any sense? Is it just a cultural thing depending on where you're from or where you're working?
[00:17:28] Greg Paskal I think it's a good point. I heard a great talk on empathy at this conference, believe it or not. What an interesting thing to come up at a testing conference. Andrea Goulet, I don't know if you've ever had her on your show, Joe. She would be incredible to have.
[00:17:43] Joe Colantonio No. Cool. Good to know.
[00:17:43] Greg Paskal She did a talk on empathy: how do you bring that into a technology organization? From a quality perspective, I thought it was interesting to hear a soft topic. Her takeaway was, if you think about the developers that have written the code, what's one of the most powerful things a test engineer has in their hands? They have the ability to call somebody's baby ugly, right? The code, the thing they poured their guts into. Anybody who's had any sort of artistic endeavor, whether it be writing code or painting or whatever that might be, knows what it's like to pour yourself into something and have somebody come by and say, this sucks because of this. That's really insensitive. And Andrea brought up a really good point: we have no idea what time frame that person was given, or whether they'd been given any training at all. So I like this idea of empathy in the workplace, and that we speak about it, because it matters. When I teach the ISTQB class, I often take my test engineers through this exercise where I have them draw pictures. I say, you can draw a flower, a car, a house, a couple of things. And I have them place them all up on the wall, so we've got 20 or 30 of these things up there. And then I say, okay, I want you to pay extra special attention to how you feel. And I start to go through and I'm like, look at this flower, these petals are clearly not the same size, and what flower do you know only has four petals? This is dumb. And then I look at the car. I intentionally go in and try to say some things that are ridiculous, because I want them to internalize what that feels like. It's important. This is the empathetic aspect I think Andrea talked about, which is: remember, you're still dealing with a real person there that developed that code. And the best thing you can do to reduce risk is to build a relationship there. That's better than any test case you're going to run: build communication and trust there.
So when you've got to share something hard, you can come in and say, hey, I found this thing, can we work through this? I'm not sure if it's a defect or not. And maybe it's not that big of a deal for him or her to fix, and you can go forward. These are the sorts of skills that I think really separate those that do this as a craft from those that are just kind of getting the job done. So I think there's a lot of opportunity to grow there. But you're right, the work we do can be, in a way, threatening. Can you believe, Joe, how long I've been working in testing and automation now? I almost can't believe it myself, because I figure I'm going to get on my skateboard in a moment and go do some things. But things have progressed. It can be threatening when you come with some of these suggestions, because many quality engineers are given directives like, we're going to automate 80% of the things, or whatever. I think that sounds really good, maybe, in a board meeting or somewhere where you have to have metrics. But remember, as the representative of the quality organization, this role takes enormous courage and integrity to do well. It really does. I've been writing about this to our community almost weekly on LinkedIn. The things we have to say are sometimes very hard things to share with folks, and they can also be very celebrated, something we can stand up on. But as soon as we compromise our integrity for the sake of getting numbers done, we've begun to walk away from the very thing we were hired to do. This is not easy work. It's hard work. But I like it, because it means I need to come with the best skills I've got and communicate, in kind of a caring way, what I have discovered as a risk, so that we can address it and they can decide whether they want to ship the code or fix it. Either is okay. But I need a place where I can do that, and do it well.
[00:21:28] Joe Colantonio Great advice. Absolutely. So, Greg, you mentioned another topic about IoT.
[00:21:34] Greg Paskal Yes.
[00:21:35] Joe Colantonio I've been hearing about IoT for years and years, and I always say, oh, it's going to take off. I guess it's on the hype curve too; I don't know where on it it is now. But where are we with IoT? Is it a skill automation engineers should still be concerned about or thinking about? Anything special about it that you think people should know?
[00:21:52] Greg Paskal Yeah. One of the speakers, I can't remember his last name, his first name was Costa, gave one of the talks at the conference, and he said this is a skill set that's hard to come by in our field. Now, that's good for those of you listening, because what that means is we've identified a field that doesn't have a lot of expertise in it. And it makes sense: IoT means working with embedded systems. You're working with systems with no screens; you almost need the skills of a good troubleshooter and a developer to work in IoT. I found that my testing skills, my development skills, and even my electronics skills (most folks don't know I have an electrical engineering background as well) all come into play in this space, because you're dealing with literal IO, inputs and outputs, sometimes at a chip level, coming out as a signal of plus or minus 5 volts, or 3 volts in some cases. Talking with Costa, I found I was already naturally moving toward where some of the gaps are, and having a background in it helped. Since I'm dealing with chips and boards, the latest board I'm working on, my prototype, is made up of a microcontroller and some LEDs and resistors and some other components. To test them, I actually needed to write some software to test the IO. All it does is output to a terminal whether certain things are happening the way they're supposed to. You can see how different that is from surfing with Chrome or something and checking that our data persisted; here, we're wanting to see it persist within a small chip, that sort of thing. I like the field. It's very technical, and I'm such a nerd for that; I enjoy it. But for those of you that like that kind of thing, there is great opportunity here.
And I'm exploring that right now as I'm in the market looking for work: maybe this is the time for me to make that transition more into IoT testing as well, because I have that background. I think the IoT space, the Internet of Things, is very interesting. It's everywhere, it's all around us, and it's incredible to work with. And I've got to tell you about a technology I learned about called LoRa, L-O-R-A. It just blew me away. The circuits I'm working on here use WiFi, and have you ever considered that WiFi, for the most part, is contained within your house? It's meant to transmit high-speed data over kind of a short distance. Well, LoRa is a technology that can transmit data over a mile, or multiple miles, in some cases hundreds of miles, through some very small chips. It uses a different way to communicate that information, and that was brand new to me. Here's a technology that's been around this long, and it took me that long to find it. I built one; I couldn't believe it. I drove about a mile from my house, and this little chip sitting on my desk was communicating with me a mile away in my car. So these are some of the technologies available for those who might be interested in getting into the IoT space. It's fascinating.
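Greg's board-test harness is C++ running on the device itself, but the shape of it can be sketched in desktop-runnable Python. Everything here is hypothetical (the pin numbers, the expected levels, and the `fake_bus` stand-in are mine); on real hardware, `read_pin` would wrap an actual GPIO read, such as an Arduino-style `digitalRead`:

```python
# Pin number -> expected logic level after the board powers up.
# These pins and levels are made up for illustration.
EXPECTED_LEVELS = {2: 1, 4: 0, 5: 1}

def read_pin(pin, fake_bus={2: 1, 4: 0, 5: 0}):
    """Stand-in for a hardware GPIO read; swap in real IO on-device."""
    return fake_bus[pin]

def check_io(expected):
    """Read each pin, compare to the expected level, print PASS/FAIL."""
    results = {}
    for pin, want in sorted(expected.items()):
        got = read_pin(pin)
        results[pin] = (got == want)
        print(f"pin {pin}: expected {want}, read {got} -> "
              f"{'PASS' if got == want else 'FAIL'}")
    return results

results = check_io(EXPECTED_LEVELS)
```

This mirrors what Greg describes: the harness does nothing fancy, it just reports to the terminal whether each input or output behaves as it should, which is often all the visibility you get on a screenless embedded board.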
[00:25:04] Joe Colantonio That is awesome. I bought some property, so we built like a little mini farm, and I can never get a WiFi camera to go as far as I want it to. I have to walk really far to get to the gate to open it up for people. I should look at something that has those capabilities, because that sounds like it would help.
[00:25:18] Greg Paskal Absolutely.
[00:25:18] Joe Colantonio What language did you use to create that program to test that board that you talked about?
[00:25:25] Greg Paskal C++.
[00:25:26] Joe Colantonio C++, right?
[00:25:28] Greg Paskal Yeah.
[00:25:29] Joe Colantonio What language do you think, I know C++ or C is probably the skill, but is Python used at all for IoT as well?
[00:25:36] Greg Paskal Yes, it is. In the IoT space, Python is a really popular language. I mean, Python is a great language; it's a scripting language. I find working in C++ and debugging this code is really hard, some of the hardest debugging I've ever personally done. I literally have to hook up another circuit board through a serial connection, and I have to peek inside the physical memory locations to determine what's happening. It's easily some of the hardest debugging I've ever done. But when you work in an interpreted language like Python or Ruby, which is what I've been using for the last eight years or so, you get spoiled by what it means to put breakpoints in places and watch code execute its way through. When you're working in an embedded system, where the code gets compiled and pushed inside, it can be a lot harder. But absolutely, I think Python is a great way to go. I got started with some of this stuff, like a lot of folks, with the Raspberry Pi, and then the Arduino IDE, which is just a development environment for microcontroller systems, led me down this road. Now I'm working with a great chipset called the ESP32, and man, I love it. It's so cool. I'm experimenting with it more and more, and blending that with GitHub Copilot has been very fascinating. So yeah, it's kind of a new hobby, and it's something I really love to work on.
[00:27:02] Joe Colantonio I also love that you're always positive. There are a lot of people that find themselves in a situation now where they've been unemployed for a while. I think LinkedIn just laid off around 700 employees, and I don't know if anyone heard about Unity, but they laid off like half of their staff.
[00:27:19] Greg Paskal Yeah.
[00:27:20] Joe Colantonio A lot of layoffs going on, I know. So just curious, from your experience, what could someone do to stay positive and not get discouraged? If I was laid off years ago and hadn't gotten a job since, how would I keep it up? Because I've never heard you down in the dumps. You always keep moving forward, always learning. Any encouragement you'd give to anyone else?
[00:27:39] Greg Paskal Well, I'd be faking it if I told you I don't feel down from time to time. I do. But I have allowed that to help me focus on our community, where I can make a difference. I'm very intentional with how I slice up the day. I've applied to over 75 companies now, and the job market is very different right now than it's ever been in my career. There are a lot of folks you'll never hear a peep back from. I've had a number of interviews where it sounded like they were going to go forward, and then just crickets. Those things can really put a damper on your spirit for sure, and make you wonder, do I have anything valuable to offer anymore? That's just not the case. The way I've tried to approach it, Joe, is to consider, well, I've got this season to improve my skills, so I'm going to really pursue that. I love what I've learned in C++. It's a language where some people are like, oh my gosh, you did that? And I like it. I enjoy the stretch. And I've been writing a lot. I wrote a new paper. It took me six months to write this paper, called 360 Degrees of Risk. It's now published on LinkedIn and on my blog, RealWorldTestautomation.com. That came from this time of really saying, hey, how can I bump up my game here and step back from what I've been observing over so many years of testing? That paper is all about risk in areas that we often don't consider at all. I think it's great hanging out with the community. I've had some incredible conversations with folks, good friends like Randy Rice and Jason, and a number of people I've known on LinkedIn for a while have reached out just to check in. Chris Trimper and some of our good friends like that, and you and I both know him well, he has really been a godsend to me and has just checked in. I say, find your community and let them know what's going on so they can be an advocate. You might be surprised.
They may just check in every once in a while and see how you're doing, because it is a strange time out there in the market right now. There's a lot of fear that AI is taking our roles away. Again, I don't think that's the case. I don't see that. But as with anything, when it's your family and you've got to feed them, those things will come knocking late at night, and I worry about it. And then I'm like, all right, I think you're going to be okay. Just keep moving forward, try to pour into our community, and do the best you can. That's how I try to handle this. Just pour into this group of people. We care about them a lot. We care about the field we work in.
[00:30:13] Joe Colantonio That drives me nuts, because there are a lot of people like you, at a high level with all this expertise, who are still looking for jobs, and it makes me lose my trust in the industry, or my faith. So if you're listening, make sure you hire Greg and let me know you heard about him here, because it's driving me nuts. It really bothers me that people like you are still looking for a gig. I don't get it. But, you know.
[00:30:32] Greg Paskal Yeah, I appreciate that, Joe. It is what it is.
[00:30:35] Joe Colantonio Greg, so any other insights from the conference before we wrap things up with my last killer question?
[00:30:41] Greg Paskal Yeah, I don't want to forget this. I actually was just on a call a couple of hours ago with a great group, the Murano group. They work with some ladies in Rwanda, Africa, and they're using them to do testing. They've been taking these folks who really need a career and a craft, equipping them to do it well, and bringing clients to them to do testing. I love that. I was sharing with the CEO of that company just a couple of hours ago how much I enjoyed the work they did. He was asking me about conferences, and I said, what stood out about this conference, the Pacific Northwest conference, was that it's all volunteer. Nobody's really getting paid the big bucks. They're all volunteering to do it. This conference, I've got to hand it to them, they wound up paying their speakers, covering the cost of my hotel, and covering my travel. Those are hard things to come by. As someone who speaks at conferences, I've often had to cover that cost out of my pocket, out of Greg's pocket. So having a group do that was great. But I think because there were so many volunteers, we found that everybody was pretty willing to jump in, help one another, and share what they had learned. With all the speakers who got up, you got a sense that this was something they really did every day, and they were just sharing the parts they had, even though they may not have all of the picture figured out. This is the piece they did have. I like that a lot. I think as a community, there's no school of quality engineering. We learn this through podcasts like the one we're doing here, Joe, through Test Guild, so why not be a community that relies on sharing with one another? Let that be something unique about this field. It's a great field to make a living in.
So that stood out from this conference, and I'm excited about going again, because I love the spirit of the folks I met who were very willing to give, like Jason and Andrea, Costa, and so many of the others I met. Some of them have been on your show. Some fantastic people.
[00:32:46] Joe Colantonio Great event. They've been doing it forever, the Pacific Northwest Software Quality Conference. If you're in the Portland area, you should definitely check them out. Actually, when COVID hit, they did one online that I ran for them. Tariq contacted me. So, great organization, definitely a great tip, for sure. Okay, Greg, before we go, is there one piece of actionable advice you can give to someone to help them once again with their automation testing or any other software testing efforts? And what's the best way to find and contact you, or hire you, Greg? How can we get you employed ASAP?
[00:33:18] Greg Paskal Yeah, you bet. Well, I always tell folks I'm really easy to find on the web. Just Google Greg Paskal, G-R-E-G P-A-S-K-A-L, and you'll be surprised by all the things that come up. You can find me through Gregpaskal.com very easily, or LinkedIn. That's how you get a hold of me. And I'm really looking for a great world-class company. I really want someone that cares about their test engineers and about really building out a world-class automation effort. But I'm kind of the whole package. I love all that space. I like to speak into it. My advice is, as a community, challenge yourself to grow. Each Monday, probably for the last two or three weeks, I've been writing out to our community on LinkedIn. By the way, if we're not connected, please find me. I'd love to connect with all of you on LinkedIn. I've been challenging our community to try to bump up a little bit each week. Assess your skills, and pick one small thing you can apply an hour to this week that can make you a better tester or a better automator by the end of the week, and keep looking for those opportunities to grow. We need to have the heart of a learner, and that's an important skill that you need to mature. If you're not doing that, be intentional about your growth. You'll find that eventually you'll probably get a chance to pour into others and mentor them. I really enjoy the opportunity to mentor young engineers. When you get that chance, when someone says, hey, I'm willing to mentor you, you've got to own it. You've got to own that. This is another part of my talk I had earlier with my friend Dan: you can lead a horse to water, but you can't make him drink. When it comes to quality engineering, if you really want to grow, find someone who can help you. I mentor people online, but I always require them to own it and set it up, because I want to see their intentionality, so that we're putting the effort where it makes a lot of sense and where their time is well spent.
So if you're given that, run with it, go with it, because it can be valuable to your family down the road in the areas of finances and great career growth. And that's probably what I would tell you: stay humble, try not to let everything go to your head, and pursue it in a way that you're a sponge to grow and willing to give to others. You will not go wrong with that.
[00:35:42] Thanks again for your automation awesomeness. Links to everything of value we covered in this episode can be found at testguild.com/a475. And if the show has helped you in any way, why not rate and review it in iTunes? Reviews really help the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.
[00:36:18] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the FAM at Testguild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider and want to offer real-world value that can improve the skills of or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.