About this Episode:
Slow loading times can drive users away from even the coolest websites and applications. But how can you pinpoint where your product's performance issues lie? In this episode, Dave Westerveld shares the fundamentals of performance testing, including tests and tools to use, monitoring, types of performance tests, and more. Discover what you need to know to get started on your performance engineering journey. Listen up!
TestGuild Performance Exclusive Sponsor
SmartBear is dedicated to helping you release great software, faster, so they made two great tools. Automate your UI performance testing with LoadNinja and ensure your API performance with LoadUI Pro. Try them both today.
About Dave Westerveld
Dave Westerveld is an experienced software tester who has been working in the industry professionally for over 13 years. He has worked on scientific computing software that ran advanced mathematical calculations scaling up to compute across thousands of CPUs in parallel, has been involved in various automated and manual testing initiatives, and currently works at D2L, one of the leading online learning platforms. As you can imagine, with COVID-19, online learning platforms have seen massive increases in load.
Dave loves learning new things about testing and is always trying to grow his skills. One of his favorite ways to do that is by teaching others. He shares some of what he learns on his blog (https://offbeattesting.com/blog/). He has also created a number of software testing-related courses, including a course on getting started with performance testing. The world of software testing is changing, and Dave wants to do everything he can to help testers keep their skills up to date so they can continue to help produce good-quality software.
On the personal front, he lives in Ontario, Canada, and has a wife and 3 kids.
Connect with Dave Westerveld
- Blog: www.offbeattesting.com
- Twitter: @offbeattesting
- LinkedIn: dave-westerveld-25339a42
- GitHub: djwester
Full Transcript: Dave Westerveld
Intro: [00:00:01] Welcome to the Test Guild Performance and Site Reliability podcast, where we all get together to learn more about performance testing, with your host Joe Colantonio.
Joe Colantonio: [00:00:15] Hey, it's Joe, and welcome to another episode of the TestGuild Performance and Site Reliability podcast. Today we'll be talking with Dave all about the foundations of performance testing. He also has experience in other forms of testing, like API testing, that we'll probably touch on as well. Dave is an experienced software tester; he's been working in the industry for over 13 years now. He's worked on scientific computing software that runs advanced mathematical calculations that can scale up to compute across thousands of CPUs in parallel. Dave has also been involved in various automation and manual testing initiatives and currently works at D2L, which is one of the leading online learning platforms. And as you can imagine, over the last year with COVID and everything, online learning platforms have seen a massive increase in traffic and load. Dave has some awesome real-world experience on how to keep growing and learning performance testing strategies in this crazy, crazy world we find ourselves in. So I'm really excited to have Dave join us today. He's created a number of software testing-related courses, including a course on getting started with performance testing, which we'll be diving into today. You don't want to miss this episode. Check it out.
Joe Colantonio: [00:01:22] This episode was brought to you by SmartBear. Listen, load testing is tough. Investing in the right tools to automate tests, identify bottlenecks, and resolve issues quickly could save your organization time and money. SmartBear offers a suite of performance tools like LoadNinja, a SaaS UI load testing tool, and LoadUI Pro, an API load testing tool, to help teams get full visibility into UI and API performance so you can release and recover faster than ever. Give it a shot. It's free and easy to try; head on over to SmartBear.com/solutions/performancetesting to learn more.
Joe Colantonio: [00:02:05] Hey, Dave, welcome to the Guild.
Dave Westerveld: [00:02:09] Thanks for having me on, excited to be on here.
Joe Colantonio: [00:02:12] It's awesome to have you. So Dave, as I was looking through LinkedIn (I usually do random searches for keywords like performance testing), I was really excited to see your course. I think you published it in 2019, so can we dive into it a little bit today? Before we do, though, Dave, is there anything I missed in your bio that you want the Guild to know more about?
Dave Westerveld: [00:02:29] No, I think that covers it. I've been testing for quite a while, deep in the industry, and I love learning about testing of all sorts, like you mentioned. So I'm excited to talk a little bit about performance testing here today.
Joe Colantonio: [00:02:41] Now, I've been looking at your blog as well, Offbeat Testing, and a lot of the posts are about API testing. So I'm curious: a lot of people, when they get into testing, all they do is focus on the UI, and I love how you're focusing more on backend automation and performance testing. So why focus on this type of testing rather than UI testing?
Dave Westerveld: [00:02:57] Yeah, it's a great question. I've started to think of myself as a full-stack tester. We talk about full-stack devs, or front-end and back-end devs, and I think a full-stack tester is a pretty valuable thing. I see all these different forms of testing as specialties within testing: performance testing, API testing, security testing, test automation, all these different things. I think a well-rounded tester will have some experience in each of them. In some ways, I've just wanted to dive into these different things; I've learned them as I've needed them for my job. I've certainly spent a lot of time in the last couple of years really diving into API testing, and I actually finished a book on API testing with Postman a little while ago. So I've niched down a little bit in terms of API testing. But with all these forms of testing, there's something to at least knowing the basics, so that when you come across it, when you see the need for it, you've got the ability to at least contribute and be part of the conversation.
Joe Colantonio: [00:04:03] Absolutely. And I love this concept of the full-stack tester. Another thing I've been seeing is a lot of talk on LinkedIn. I don't know if you know Paul Bruce; he used to work for SmartBear. He talked about how he doesn't like the term nonfunctional testing because it minimizes these things as nice to have but not necessary, and I think performance testing falls into that bucket. So why do you think people should focus on other forms of testing, like performance testing?
Dave Westerveld: [00:04:26] I like that too. I mean, in the industry it does get called nonfunctional testing, but things like performance testing are crucial to the success of modern software. I talk about this in my course: Google has stats on how quickly people leave a site if it's not performing. And even recently on my team, we were working on some new stuff, we had it out, and under certain circumstances it ended up not being that performant. It meant that people couldn't use it; people just couldn't use it. We actually had to turn some features off. So if you're looking at functional versus nonfunctional testing in that scenario: functionally, the feature worked, but we had to turn it off for a few of our clients just so they would be able to use the other features effectively. I think that illustrates the importance of performance testing. It doesn't matter how well your app does functionally if you just can't interact with it. API testing definitely falls in that realm too. It's one of those things that's hidden away and you don't realize what's going on, but if you don't have a well-designed API, you can end up causing a lot of problems for your end users, problems that could be avoided if we focused a little more on these nonfunctional aspects of testing.
Joe Colantonio: [00:05:46] Cool. So I ask everyone this question: what is performance testing from your point of view? I think sometimes people get tripped up thinking it takes a dedicated performance testing team that has to create this lab with all these servers. And, you know, back in the day it was about putting a load on the server; it wasn't really from a user front-end perspective per se. But now, with these JavaScript-heavy applications coming along, it really has to do with front-end performance as well. So for you, where you're working now, what do you classify as performance testing?
Dave Westerveld: [00:06:11] Yeah, that's a great question, because every term needs to be defined, right? Everyone has their own interpretation of what different terms mean. I think in software testing we see that a lot, but I don't think that's unique to software testing. When it comes to performance testing, the way I think about it, at the end of the day, is that a customer focus is the best way to approach it. What does it feel like for a customer using the site? That's the core thing I would try to look at with performance testing. The company I'm at right now has a performance testing team whose job it is to drill into the details and figure some of this stuff out. They've got, well, a lab (I mean, everything's in the cloud nowadays), but they've got a lot of expertise in that, and we rely on that team. Just this week, well, last week, I guess, I was in a meeting with them, going through and understanding some of the performance issues we were having. Using those specialties is great, but I think at an individual team level or individual tester level, there's a lot you can do from the end-user perspective that can give you insight without needing the details of, OK, how exactly does this influence my CPU usage, my memory usage, all these things that are in some ways maybe less important now that many applications are in the cloud and you can scale up compute resources as needed, and you've got other strategies you can use to deal with that. But use it to try and help yourself understand where the bottlenecks are, where the things are that are slowing it down, so you know why end users of the software are feeling like it's slow.
Joe Colantonio: [00:07:51] So Dave, you just mentioned something interesting: a lot of people think these resources are infinite, but they do cost money when performance gets bad, I guess, if you're consuming a lot of them, so it's something you need to watch out for. What are the things you could look for, then, that aren't related to resource utilization, if you feel like you have that pretty much handled?
Dave Westerveld: [00:08:07] Yeah, that's a good question, and it's a good point too. Just because you can scale things up, your operations team might not be so happy if you need to start scaling up thousands of dollars' worth of computing resources. I think one of the things you can really look at is that end-user impact. There are measurements you can do, like you said, on the front end. What does it look like even just at a UI level? When I click on this, how long does it take before the page is fully rendered, or how long does it take before the DOM is active and I can start to interact with the page? You can measure some of those things. And I think a maybe underutilized performance tool is monitoring. You can build those metrics right into your site and see them in production, and that's in some ways the holy grail of performance testing, right? It's really hard to do performance testing in production, because if you're stressing your system or giving it a big load, you're going to impact users on the system. So you can't test in production, which means a lot of performance testing is done in test labs and such, which is obviously a compromise. But those test labs don't have the same setup; they don't have the same resources your production system does. So you've got to do a lot of extrapolation to figure out whether results from your test lab are relevant in production. We've actually done this on our team: we have some performance metrics built right into our telemetry, so that when a user does certain actions, we can see how long it takes for that action to resolve and for them to be able to interact with the page again. And so we can see in production how things are performing. Obviously, there are a lot of challenges with that too, right? If a user is working on a mobile network, or if they're far away from things, out in the forest or something, there's going to be slowdown just due to network speeds and a lot of other factors we don't have control over. But I think that's a very interesting thing to think about when it comes to performance testing: the ability to test in production.
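The page-render and DOM-interactive timings Dave mentions can be read straight from the browser's Navigation Timing data. Here's a minimal sketch in Python using Playwright; Playwright isn't mentioned in the episode, and the URL is a placeholder, so treat this as just one way to try the idea out:

```python
# Rough sketch: read render/interactivity timings from a page load.
# Assumes Playwright is installed: pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

URL = "https://example.com/"  # stand-in for the page under test

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="load")
    # Pull the Navigation Timing entry the browser recorded for this load.
    timing = page.evaluate(
        """() => {
            const nav = performance.getEntriesByType('navigation')[0];
            return {
                domInteractive: nav.domInteractive,  // DOM ready to interact with
                domComplete: nav.domComplete,        // document fully parsed
                loadEventEnd: nav.loadEventEnd,      // load event finished
            };
        }"""
    )
    browser.close()

for name, ms in timing.items():
    print(f"{name}: {ms:.0f} ms")
```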
Joe Colantonio: [00:10:15] A few questions keep popping up in different areas, so hopefully I don't sound scatterbrained here, but I want to dive a little bit into the outliers. Before I get to that, though, I'm just curious to know how this initiative started. It sounds like, apart from your performance team, you're part of what is probably a development team, and you're a tester who started this push for performance from a user perspective. Was it you taking the initiative? Was it from the top down, saying, hey, testers, you need to get more involved in performance? How did this become part of what almost seems like a cool culture you have going on there?
Dave Westerveld: [00:10:45] Yeah, I think it's a bit of a mix, actually. There was a recognition that we were having some performance issues; this was even before COVID, a couple of years ago. We have a broad range of clients. Some of our clients have millions of users, millions of students; some of them have a couple of thousand. With some of our higher-end, heavier-usage, higher user-count clients, we were finding that certain things were not that performant. So at that time we said, OK, let's get a couple of people to form a performance testing team that can dig into some of this stuff and understand what's going on, you know, specialize in those details. But then we realized that you can't just throw one team at a problem and expect it to go away. It's got to be a cultural change across the company. With a separate performance testing team that people may or may not know is there, may or may not be able to use, you get the silos that can happen. So we started thinking about that in terms of the telemetry stuff I was talking about. That was an initiative that came from our team. I'm not going to say it was just me; I was one of the people saying, let's do this. But certainly, as a team, we were thinking about how we could better understand what's going on, because we had a client saying, hey, this isn't working for me, it's too slow, and we were looking at it like, oh, it's fine on our test sites. What's going on? How are you using it differently? So we were trying to understand what they were doing. Let's put some telemetry in so we can understand their workflow a bit better. And while we're doing this, let's also measure the performance. Then we can actually see, OK, you've got slowdowns at certain times of day, maybe when a lot of people are using the system at the same time. Or maybe we can understand it across your monthly cycle: in educational institutions, which are our clients, there are certain cycles to when you grade different users and things like that. So it was a bit of a mix. There was a recognition that things were, in some cases, getting escalated up to leadership, and leadership said, OK, we need to do something about this. But there was also, at a local team level, us saying we want to solve this problem too. We're here to make our clients happy, and to do that, we need to understand where their pain points are as best we can.
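The telemetry Dave describes, timing user-facing actions as they happen, can be approximated with a thin wrapper around each action. A minimal sketch; the action name is hypothetical and a real system would ship these measurements to a metrics backend rather than a log:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("telemetry")

def timed(action_name: str):
    """Decorator that records how long a user-facing action takes."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                # A real system would send this to a metrics backend,
                # tagged with client and time of day; here it just logs.
                log.info("action=%s duration_ms=%.1f", action_name, elapsed_ms)
        return wrapper
    return decorator

@timed("grade_submission")  # hypothetical action name
def grade_submission():
    time.sleep(0.05)  # stand-in for real work

grade_submission()
```

Collected in production, these durations can then be sliced by time of day or by client to spot the usage-cycle slowdowns Dave mentions.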
Joe Colantonio: [00:13:10] I guess what adds extra complication in your particular situation is that with a normal SaaS application, you usually know what the flow is going to be, you know how users are going to use it. On other platforms, say for health care companies, different hospitals implement the software differently, and it sounds like that's the case with yours as well. So how do you know when something is just an outlier versus actually an issue with your software? You know what I'm saying? You have all these customers, all these different use cases. Even if you tested in production, you'd never find the issue because it's so unique to that customer, I would assume.
Dave Westerveld: [00:13:41] Yeah, it is. There's actually a fair bit of similarity between education and health care. Obviously, in health care the outcomes are a little more immediately important, right? If you mess up with someone in education, they're not going to die on the spot. But they're somewhat similar in the sense that there's a lot of regulation around both. Most educational institutions are governmental of some sort, so there are certain regulations around how things work, obviously not to the same degree as in health care, for obvious reasons. But there's a similarity, and there are a lot of different educational institutions that implement things in different ways, as you said. So trying to understand the outliers is really difficult. It's one of the challenges. I've been in meetings with some of our clients where they were concerned about the way something was working, and they were like, why didn't you test for this? Well, we have thousands of clients that all do this in different ways. It's just not possible to test for everything. So to some extent, we do have to be a bit reactive: when you see a problem, you react to it. To some extent, that's just unavoidable. Now, that's not exactly the ideal way we want to go about it, so we're also trying to have that hybrid approach of understanding ahead of time as well. I think that's where this performance culture we were talking about a little earlier comes into play. When you're creating something as a team, we need to think about the fact that there are users who are going to use it in, I guess, heavy-usage ways. We can't just assume that because it's working functionally when we try it out on our test site with 50 users or so in a course, it's going to be fine. We've started to realize that, even just in terms of, say, our definition of done: when is a feature done and ready to be shipped? We need to have performance testing included in that. We need to have some kind of load testing, an understanding that it will scale and work well. And if we don't have that, we have to say no, it's not ready to ship yet, because we just don't know if we're going to run into major problems when we send it out.
Joe Colantonio: [00:15:55] So now that you're speaking about it, this area is really regulated, so it's not like you could just throw a monitor on someone's internal network to see what's going on with the performance. Do you build anything into the system itself, like logs, to make it easier to know when there are performance issues and track them down?
Dave Westerveld: [00:16:10] Yeah, I mean, we do the best that we can there. As you said, there are obviously privacy issues you need to watch out for: making sure we're not logging private information, personal information. GDPR applies; we operate in Europe as well, so we have to make sure we're compliant with those regulations. So there are some challenges there, but we do have some monitoring tools we can use to drill in and help us understand the workflows. You can't always, but it can help sometimes. I think it's a really important thing to think about for testing in general: how these tools can help us better understand what our clients are doing, so that we can find those edge cases, those things our clients do that surprise us.
Joe Colantonio: [00:16:52] Absolutely. I love focusing on the edge cases, but sometimes people haven't even got the base case down. So if someone's listening to this, maybe they don't have a huge performance team behind them. Are there any questions testers can ask before they get started on a project? I know it's part of your definition of done, and I think that's awesome, but are there also some questions they can ask to help guide them before doing performance testing?
Dave Westerveld: [00:17:16] Yeah. So we don't have anything explicitly defined on our team, maybe, but I think that's a good idea when you're getting into performance testing. I feel like it's a bit of an overwhelming field for someone who's newer to testing or to performance testing: the idea of these loads, how we generate all this stuff, it can get really complicated, and then there are all these results we get out, and how do we analyze them? For those who are newer to the field or newer to performance testing, it's a bit intimidating; there's just a lot of stuff there. It's a big field. So I think when you're getting started, some of the main things you want to drill in on are just understanding what the impact is on a client, thinking through that aspect of it. When you're getting into the field, you have to realize that as a tester, you're not going to know much, and that's fine. You're starting something new; you're not going to know much. But you can actually still contribute something with your curiosity, just by asking questions. How is this going to impact the end user? What is this going to look like? You have to know your particular context, right? What is this going to look like for the kinds of clients we have? In this podcast, I've been talking a lot about educational clients because that's my particular context right now, but if you're working in e-commerce, for example, you're going to ask some different questions. You're going to drill in on some different things. You're going to think about what happens on Black Friday, what happens at Christmastime, what happens when we get these different kinds of scenarios. And I think really bringing that customer focus, something a lot of testers are good at, into it: what is this going to feel like for our clients, and how are we going to be able to do this? And then also those questions of curiosity. What happens if we're successful? Sometimes we release a feature, and it's a great feature, but we don't think about, OK, what happens if this feature is actually really successful and people really like it, and all of a sudden we have a whole bunch of users? And then we're like, oh no, it doesn't work when we get a hundred thousand or a million users. Just asking some of those kinds of creative, probing questions can be a really helpful way to get into the conversation around performance.
Joe Colantonio: [00:19:38] So once they know what they're going to do with performance and they start doing it, there are numerous types of performance tests that can be done. How do you know when to do one over the other? What do the different types of performance tests do? Should you do all of them? Maybe talk a little bit about that.
Dave Westerveld: [00:19:53] Yeah, I'll preface this with: I'm not a performance testing expert. I'm a software testing expert who has learned performance testing. I want to say that because I'm going to talk about some different kinds of performance tests, and I may or may not use the same definitions you've heard in other conversations, for people listening to this. So I'll try to define what I'm talking about as I go along with these different types of tests. But I think there's hopefully some helpfulness in a conversation like this, where someone who is not a deeply embedded expert in the field can still talk a bit about it, and for those who are newer to the field, maybe it makes it a little more accessible. That's partly why I put together a course on it. Even though I'm not a lifelong expert in performance testing, I just want to share where I'm at and what I've learned along the way, to help those who are maybe a step or two behind in the learning process. So yeah, there are a lot of different kinds of performance testing, and that's one of the things that can be challenging about it: when you say performance testing, what do you mean? I've talked about it in terms of the end experience for users, but when you actually drill in, how do you test for those different things? How do you actually figure out what's going on? You can put different types of loads on the system. I think that's a helpful way to think about it: I'm creating some kind of load on the system to help me answer questions about how the system is going to respond if those loads happen in real life. And performance testing is still testing you design. If you were doing functional testing, you would design your test, then run it, then look at the results and feed that back into your test design, maybe make another test based on those results, or maybe find a bug that needs fixing in the application. Same thing with performance testing: you look at what questions you need to answer with this test. So if you're in a context like e-commerce, as you mentioned earlier, where there are certain days when you're probably going to see a massive spike in usage, then you might want to make what I would call a spike test. You come up with a load profile that spikes up your users really fast: it starts from a certain baseline level of users on your system, or calls on your system, or whatever, and then spikes that up quite quickly, to levels that are maybe 100 times your baseline, over a very short period of time. If you're trying to answer that question, you may create that kind of load profile. Another thing you could look at is stress testing your system. You take your system, put it under stress, and find out where the failure point is: at what point does the system stop behaving normally and start behaving in problematic ways? With something like that, you could just ramp up the number of calls on your system, ramp it up linearly over time, and see, OK, where does it fail? That's the point at which I may want to investigate: is that an acceptable failure point? How it fails at that point is another helpful thing to look at, right? Does it fail gracefully? Does it cause major problems?
Does it break other things? And then you can have what I call endurance testing: a fairly heavy load over a long period of time. For our company, we saw that when COVID hit, and here we are, 15 months later, still at loads quite a bit higher than our pre-COVID levels. So endurance testing asks: how does your hardware, your system, perform over long periods of time under load? And there are other things too, like scalability testing. We talked about that with the cloud: if you can throw more resources at it, how well does it scale under those circumstances? Does it respond quickly enough? Do you have things in place that will let you know you're suddenly spending a lot more money, so you can see if you need to go in and do something about it? All these different loads you can generate; it really depends on what question you're trying to answer.
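To make the spike and stress profiles Dave describes concrete, here's a minimal load-generation sketch using only the Python standard library. The target URL and phase sizes are made up for illustration; dedicated tools like JMeter handle ramp-up profiles, think times, and reporting far better:

```python
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "https://example.com/"  # hypothetical endpoint under test

def one_request(_):
    """Time a single GET; return elapsed seconds, or None on failure."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(TARGET, timeout=30) as resp:
            resp.read()
        return time.perf_counter() - start
    except OSError:
        return None

def run_phase(name, users, total_requests):
    """Fire total_requests with `users` concurrent workers; report p95 and errors."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = [r for r in pool.map(one_request, range(total_requests)) if r is not None]
    errors = total_requests - len(results)
    p95 = statistics.quantiles(results, n=20)[18]  # 95th percentile
    print(f"{name}: users={users} p95={p95 * 1000:.0f}ms errors={errors}")

# Spike profile: a modest baseline, then a sudden jump in concurrency.
run_phase("baseline", users=5, total_requests=50)
run_phase("spike", users=100, total_requests=1000)

# A stress profile would instead ramp `users` up step by step
# until error rates or response times become unacceptable.
```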
Joe Colantonio: [00:23:56] Absolutely. There's so much more we could talk about, but that's why you created your course, Performance Testing Foundation, which can be found on LinkedIn. I noticed there are over twelve thousand learners there. Have you gotten any common questions from people taking the course?
Dave Westerveld: [00:24:09] Often the questions I get are just around some details of the course, I guess. But one piece of feedback I've heard, something people have appreciated, is that the course uses JMeter, obviously a well-known performance testing tool, but I really don't focus the course on the tool; I focus on the principles, the things we've talked about here today. I also talk about statistics, because if you're going to get into this stuff and understand it, you need to know just a little bit about statistics, so I cover a little of that. I've certainly had people ask questions about how to put it into practice in their day-to-day job, and that's something where all these different contexts change exactly how you apply it. But that's what I want to do: lay out, OK, here's the foundation for how we can think about performance testing. And honestly, learning the different tools is in some ways, I'm not going to say the easier part, because it can be quite challenging to learn those tools, but if you've got that foundation, you should be able to apply it. If you want to use BlazeMeter, great. If you want to use JMeter, great. If you want to use whatever other tool, SmartBear tools, great. You can figure out those tools and apply the same principles from one tool to the next.
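On Dave's point about needing a little statistics: averages hide outliers, which is why performance results are usually reported as percentiles. A quick illustration with made-up response times:

```python
import statistics

# Hypothetical response times (ms) from a test run; one slow outlier.
samples = [120, 130, 125, 118, 122, 127, 119, 121, 124, 2400]

mean = statistics.mean(samples)
median = statistics.median(samples)
p95 = statistics.quantiles(samples, n=100)[94]  # 95th percentile

# The mean (~351 ms) suggests a much slower system than most users see;
# the median and p95 together tell a more honest story.
print(f"mean={mean:.0f}ms  median={median:.0f}ms  p95={p95:.0f}ms")
```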
Joe Colantonio: [00:25:25] OK, before we go, is there one piece of actionable advice you can give someone to help with their performance testing efforts? I think you just gave one, but maybe another one. And what's the best way to find or contact you?
Dave Westerveld: [00:25:35] If you're trying to learn anything, and this isn't particular to performance testing, you've got to do it. You have to try it. You can go check out my course, or other courses on the Internet, or read books, and those are great things, but you've got to actually do some performance testing, whether that's finding a test app somewhere you can use, pulling up a tool, trying it out, seeing what you can do and what you can learn. If you really want to learn something, you have to actually do it. You'll find out there's a lot of stuff you don't know, and then you can circle back to the resources and look at them with fresh eyes: OK, I was trying to do this; how do I do that? So I think having this learning loop of reading, studying, listening to podcasts, taking courses, but also practically applying it, is important, because I can read a book and it sounds great, and then I actually try to do it and I'm like, oh, I don't understand this yet. So that's a really important thing: actually getting your hands dirty, doing the work of performance testing, trying it out in some way.
Joe Colantonio: [00:26:42] Dave, the best way to find or contact you?
Dave Westerveld: [00:26:44] Yeah, the best way to find or contact me: you can find me on LinkedIn, just search for Dave Westerveld, I'm sure I'll come up. I have my blog, offbeattesting.com, and I have the same handle on Twitter as well, @offbeattesting, if you want to reach out to me there.
Joe Colantonio: [00:27:00] Thanks again for your performance testing awesomeness. If you missed anything of value we covered in this episode, head on over to TestGuild.com/p66, and while you're there, make sure to click on the Try Them Both Today link under the Exclusive Sponsors section to learn more about SmartBear's two awesome performance testing solutions, LoadNinja and LoadUI Pro. And if the show has helped you in any way, why not rate and review it on iTunes? Reviews really do matter in the rankings of the show, and I read each and every one of them. So that's it for this episode of the TestGuild Performance and Site Reliability podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack performance testing awesomeness. As always, test everything and keep the good. Cheers.
Outro: [00:27:45] Thanks for listening to the Test Guild Performance and Site Reliability podcast. Head on over to TestGuild.com for full show notes, amazing blog articles, and online testing conferences. Don't forget to subscribe to the Guild to continue your testing journey.
Rate and Review TestGuild Performance Podcast
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.