Testing our Performance Test Tools with Nicole van der Hoeven

By Test Guild

About this Episode:

Test tools make it easier for us to spot defects in our applications, but who tests the test tools themselves? In this episode, Nicole van der Hoeven, a performance testing expert and Developer Advocate at k6.io, shares how she chooses the right performance tool for the job. Discover how to avoid "best tool" syndrome and instead think in terms of a tester's toolbox. Listen up!

TestGuild Performance Exclusive Sponsor

SmartBear is dedicated to helping you release great software, faster, so they made two great tools. Automate your UI performance testing with LoadNinja and ensure your API performance with LoadUI Pro. Try them both today.

About Nicole van der Hoeven

Nicole is a performance engineer with ten years' experience in breaking software and learning to build it back up again. She has worked as a performance testing consultant in Australia and Europe, and she has spent the last few years helping teams all over the world scale up their load tests on the cloud. She currently lives in Maastricht, the Netherlands, and works remotely as a Developer Advocate at k6.io. Her favorite part about load testing is its interdisciplinary nature, and she enjoys drawing from seemingly disparate fields like economics, video production, linguistics, and productivity to reframe load testing concepts.

Connect with Nicole van der Hoeven

Full Transcript


Joe [00:01:58] Hey Nicole! Welcome to the Guild.

Nicole [00:02:01] Thank you.

Joe [00:02:02] So it's awesome to have you. Nicole, is there anything I missed in your bio that I really botched that you want the Guild to know more about?

Nicole [00:02:09] Not that you botched it, but I just wanted to say I don't think I'm very smart. I think what I am is very stubborn. On a personal note, I'm Filipino, I'm a woman, and I did economics, not computer science. So I just wanted to do a shout-out to everyone out there with nontraditional backgrounds who were stubborn and somehow managed to find their way into tech.

Joe [00:02:35] Absolutely. I didn't have a degree. I went to three semesters at Berklee College of Music for guitar. My friend gave me questions for an interview back in the day, before the tech bubble. So I actually got in knowing nothing about computers, so I have to agree with that. I don't know if I'm stubborn; desperate is more what I was like. But what got you into computers, then?

Nicole [00:02:55] Actually, I lucked out. I found a mentor. You've probably heard of him: Stijn Schepers.

Joe [00:03:02] Well, nice.

Nicole [00:03:03] Yes, he was my first boss in IT and he taught me everything from scratch, actually.

Joe [00:03:10] Nice. So what drew you to performance testing, of all the things you could have gone into in IT? I know your mentor is a performance engineering expert as well, but was there anything specific that made you say, "Not only did I try this, but of all the things in IT I could have gotten into, this is the one I enjoy"?

Nicole [00:03:24] So I was always interested in tech. My brother is a developer and my father was an engineer, so it was a very techie family, which, you know, is kind of uncommon in the Philippines. But I knew very early on I didn't want to do pure development, because I love people, and I feel like testing is a perfect bridge: you have to be able to talk to developers and speak their language, but also to the business, and you have to keep end-users in mind. It was just a hodgepodge of different disciplines, and I like the mixture.

Joe [00:04:00] I don't mean to keep harping on the linguistics, but I'm just fascinated by anyone who can speak more than one language. I just remember sitting in front of my grandparents with no idea what they were saying; none of my grandparents spoke English. So what motivated you to learn more than one language, and how does that relate to performance testing or tech? How might it help your perspective?

Nicole [00:04:19] Yeah, I think it's actually really related. Well, I can't really take credit for learning English; that was kind of decided for me. In the Philippines, you really need to speak English. Everybody does at school, so that was something I got for free. But I really took to it. And of course, it helps that I have an Australian husband; we've known each other for seventeen years at least. But I mean, languages are languages, right? In performance testing, maybe you don't get the in-depth knowledge that developers would, but I think it's way more important to be exposed to a breadth of technologies and protocols and languages. And that's what I love about it. I may not be able to speak everything natively, but I know enough of a lot of languages, and I love people. So I feel like every new language opens the door to a lot of other people that I can talk to and who might actually listen to me.

Joe [00:05:28] Absolutely. Yeah, I love that. And I love how you mentioned learning all these different languages gives you an idea of what's out there and different perspectives. It kind of relates to test tooling, in the sense that there are a lot of test tools out there that people can learn and spend time on. How do you determine which tools you should focus your time on, given the limited time you have?

Nicole [00:05:49] I try to look at all tools, at least briefly. But you're right, we're limited by time. There just isn't enough time to get to know every single tool on the market incredibly well, so I honestly let the market decide a little bit. There are some tools that keep getting mentioned; for open source tools, stars on the GitHub page are sometimes a good indication of that. And I try to look at the feature lists as well. I think that as a load tester, it's really important to have your own sort of framework for remembering all of these tools, because I don't know about you, but my memory is not great. I'm finding as I get older, and I'm not that old yet, that I still forget things I knew really, really well not too long ago. So I think it's really important to take notes and have some sort of list of features that you look at, so you can say, "Okay, this tool does that. This other tool doesn't, but it does do this other thing." That way you have it in your head.

Joe [00:06:56] Absolutely. I think with testing tools, people can get themselves into trouble, because you could put an unrealistic amount of stress and load on a machine and set off a lot of alarms saying, "Oh, our application is unstable." So when it comes to choosing a tool, how do you know what kind of load to start with when learning it, and what should newbies look for in a performance test tool?

Nicole [00:07:18] Well, the first thing is how usable it is. We always look at the end-user experience in testing, right? But when we're looking at test tools, we are the end-users, and we never think about testing the test tools themselves. We should have requirements. That's what I was getting at with the framework: we should have a set of requirements and assess the tools based on those. We load testers like to think that we're tool-agnostic, but are we really? Because I think load testers are just as likely as anyone to stick to things they already know. And once you start saying, "Well, that's what I've always used," that's a problem, right? That should be a huge red flag. There are things we don't look at for test tools, like what the CPU or memory utilization of a tool is when you're running it, or how easy it is for a developer or a tester to use. We kind of take test tools for granted and use them to test our applications, but we don't actually test the very beginning of that chain.

Joe [00:08:32] So how would you recommend they test the beginning of that chain? Do you do a proof of concept, run it through, and have criteria where you go, "Okay, yes: pass, pass, pass, we can proceed with this as the tool we'll officially use within the organization"?

Nicole [00:08:46] Yeah. I like to have one site that I always use, and one business process that I always script in every tool I want to spend time on. The reason I stick to one site is that you can then compare a script in one tool to the same script in another tool directly. That's a direct comparison, because if you test different sites all the time, well, you're comparing apples and oranges. So I always start with a proof of concept. I always look at things like: what sort of parameterization options does it have? Does it do dynamic correlation? Does it accept test data? I have a list of my requirements, and I think every load tester should have their own.

Joe [00:09:33] Awesome. I think you actually have a list of tools on the market that you've analyzed and compared. But I guess it all depends on your company, though. If Nicole says, "This is a Nicole-approved performance test tool," will that be the right tool for me in, say, the banking industry? Does it matter? Are all these tools applicable to all situations, or are all teams different?

Nicole [00:09:56] Yeah, absolutely. All teams are different. All industries are different. Even the specific testers, developers, and engineers involved are different. They all have an effect on which load testing tool should be used. So I definitely don't think anyone can ever say there's one tool that's the best, and I think anyone who says that is looking to sell you that tool. You should look at your own situation and approach it more as a toolbox. Testers should have a toolbox of different tools. You wouldn't use a hammer when you need a screwdriver. I think that's the best approach.

Joe [00:10:38] I wish I had a red flag I could throw up, because I'm guilty of always going, "You should use LoadRunner." And the reason I say that is I started out using LoadRunner 20 years ago. Over 20 years, oh my gosh. Anyway, I liked it because it had everything built in: monitoring, scripting, reports. It's end to end. In my experience with open source tools, a lot of times you need to cobble together all these open-source pieces to create one end-to-end performance solution. Is that the case? Am I doing it wrong, or do you have any recommendations?

Nicole [00:11:07] I wouldn't say you're doing it wrong. I mean, I still use LoadRunner quite a bit, and I actually learned load testing using NeoLoad, so I'm not against proprietary tools at all. It's just that I think they appeal to a certain market segment, and, let's face it, they cost money. If you're in a big company where money is not an issue, or they've already paid for these tools, then I think there's a lot of value in having the entire ecosystem already there. I think the appeal of LoadRunner is not just LoadRunner itself, but the fact that you can tie it into Performance Center and LoadRunner Analysis, as opposed to just VuGen. It's the whole system that you're paying for, right? But I personally have tended towards open source tools, because that's where I see the innovation happening these days. I think it's way easier to push a commit to GitHub than to package up a very nice, polished standalone application, deploy it to all your customers, and get them to download it. So there are some very technical reasons why open source projects just develop faster. And that's not even taking into account the fact that there's a community out there that can contribute to them, too.

Joe [00:12:32] So speaking of open source solutions, I would think the first one people gravitate towards or think of is JMeter, and I think you originally started with JMeter as well. Then you went over to k6, which I'd like to dive into. But first, any thoughts on JMeter and why someone may want to start with it? And then I'd like to compare it against k6 to give people an idea of what the differences might be.

Nicole [00:12:55] Sure. I've actually done a video and an article on this, because it is something that I've been thinking about too. I love JMeter. I want to correct you a little bit and say I haven't moved to k6, because I am still looking at JMeter. I have several versions of it installed and I'll install the next one as well. Just because you try out a different tool doesn't mean all other tools are dead to you. But there are a few things about JMeter that were starting to wear thin with me, and one of them is actually the thing that drove me to it in the first place, and probably the same for a lot of people: the UI. We have this notion that if something has a UI, it's easier to get started with, because you can just mess around and look at all the menus. Maybe that's true at the start, but I was finding that it caused a lot of problems later on. I don't know if you've ever used it for load testing while collaborating with someone else, but it is a pain to version control. If you're trying to collaborate with somebody who's scripting in the same test plan, then you either both have to be looking at the XML, which is so unwieldy, or you have to make sure you're on the same version of Java, the same version of JMeter, and the same plugins at the same versions. If one of those things is not the same, you can't even work together. It's not a tool that I think is best for continuous testing or collaborative testing.
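For readers who have never opened a .jmx file in a text editor, here is an abridged, illustrative sketch of the XML JMeter saves for a simple thread group. The element names are real, but the structure is heavily trimmed and attributes vary by JMeter version; it only hints at why diffing and merging these files under version control gets unwieldy:

```xml
<!-- Abridged sketch of a JMeter .jmx test plan; real files nest far deeper. -->
<jmeterTestPlan version="1.2" properties="5.0" jmeter="5.4.1">
  <hashTree>
    <ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup"
                 testname="Thread Group" enabled="true">
      <stringProp name="ThreadGroup.num_threads">50</stringProp>
      <stringProp name="ThreadGroup.ramp_time">60</stringProp>
      <elementProp name="ThreadGroup.main_controller"
                   elementType="LoopController"/>
    </ThreadGroup>
    <!-- every sampler, timer, and assertion adds more nested XML here -->
  </hashTree>
</jmeterTestPlan>
```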

Joe [00:14:34] Great. I like how you said JMeter isn't dead to you. I guess it's an Italian thing, "Oh, you're dead to me." But JMeter is not dead to you; you have both tools. So how do you know when to choose one over the other? Can you give some reasons why it may be hard to use JMeter in certain situations? It seems like if a company is shifting left and trying to get developers more involved in performance testing, then a tool like k6 might be a better option. Thoughts on that?

Nicole [00:14:59] Yeah, so one good reason to use JMeter is if you need distributed load testing right out of the box for free, because I don't know of any other tools that do that. JMeter has that built in at no cost. And yes, you do have to provide your own infrastructure, but that's still pretty amazing. If you need that, go for JMeter. I also think it's important to think about the developers. I know it's weird because this is a testing tool, but a testing tool shouldn't just be used by testers. Everybody should be involved in writing and contributing to tests, whether functional or nonfunctional. And one really good reason for using k6 versus JMeter is the developer experience. It's really attractive to developers because its scripting language is JavaScript. So many developers already know how to code in JavaScript that they're much more likely to contribute tests or write their tests earlier in the cycle. So I find that k6 is much better suited to modern, Agile, continuous development and testing frameworks.
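To make the developer-experience point concrete, here is a minimal k6 script sketch: plain JavaScript, run with `k6 run script.js`. The target URL is k6's public demo site, and the 50-user, one-minute load is an arbitrary illustration rather than a recommendation:

```javascript
// Minimal k6 load test: 50 virtual users for one minute.
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  vus: 50,        // concurrent virtual users
  duration: '1m', // total test duration
};

export default function () {
  // Each virtual user loops over this function for the duration.
  const res = http.get('https://test.k6.io');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1); // think time between iterations
}
```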

Joe [00:16:18] Great. You did mention k6 uses JavaScript. Do you see any limitations between languages? I know some tools use Java, some use Go, and some are just record and playback.

Nicole [00:16:25] Well, I would say JMeter is still primarily GUI-based, so you don't actually have to code to use it. What I find more limiting is what it runs on, because it still runs on Java. k6, for instance, runs on Go under the hood. So the scripting language for k6 is JavaScript, but it uses Go, and that actually makes a big difference. I know one of your previous guests... I love the episode with Stephane Landelle from Gatling, because he was talking about how important it is to test your test tools and to think about breaking past that one-thread-is-one-virtual-user paradigm. To step back a little: with a lot of load testing tools, including JMeter, one virtual user is generally simulated using one OS kernel thread. I think that's an outdated approach, and Stephane Landelle points that out as well, because it's really inefficient. When you have a sleep in a virtual user, and you normally want some sort of think time or pacing, that thread just does nothing; it's a waste of resources. So, do you actually play board games?

Joe [00:17:47] I don't know.

Nicole [00:17:49] I do. I love board games. I have several hundred.

Joe [00:17:53] It doesn't surprise me.

Nicole [00:17:55] One of my favorite categories of board games is worker placement. Basically, that just means, and I promise this is related to load testing, games where you assign workers to different tasks. You might assign one to be a farmer, another to take care of cattle. I was looking at strategies for doing well in these types of board games, and I think it really relates to the performance of applications, because the first thing you want to do is get more workers: you spend a worker to train more workers. I think it's the same with application performance. The first thing people look to do is multi-threading, parallelism, increasing the number of threads, and when you run out of those, you increase the number of cores. But that's physically bounded, and that's the problem with the one-thread-to-one-virtual-user paradigm: there's going to be a limit, and what do you do after that? So I think the Gatling way is a significant improvement because of its message-based architecture. It's not dependent on how many virtual users you're simulating; they basically use a small number of kernel threads to generate messages, not users, because it's really the load that's important. The number of virtual users can be a bit arbitrary. So I do think that's an improvement. But I still think Java as a language has a lot of performance issues. For instance, Java-based tools have the JVM to contend with. When you have the Java virtual machine there, that adds a management overhead, so you don't just have to think about tuning your application and tuning your test tool; you also have to tune the JVM. It's like a rite of passage for all JMeter users at this point: you must have already run a test, gotten an error like "Java heap space exceeded," and had to go back into the configuration files and change that. That's not something you have to do with a language like Go, which is compiled ahead of time, doesn't need to be interpreted, and runs directly on the native hardware.
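For anyone who hasn't been through that rite of passage yet: JMeter's heap size is set in its startup script, and the usual fix for a `java.lang.OutOfMemoryError: Java heap space` failure is to raise it there or override it per run. A sketch, assuming a Unix-like shell; the shipped default (around 1 GB in recent versions) and the exact file layout vary by JMeter version:

```bash
# Option 1: edit the HEAP line in bin/jmeter (bin/jmeter.bat on Windows),
# which in recent versions looks something like:
#   HEAP="-Xms1g -Xmx1g -XX:MaxMetaspaceSize=256m"
# and raise it for larger tests:
HEAP="-Xms2g -Xmx4g -XX:MaxMetaspaceSize=256m"

# Option 2: override per run without editing the startup script:
JVM_ARGS="-Xms2g -Xmx4g" ./bin/jmeter -n -t testplan.jmx -l results.jtl
```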

Joe [00:20:20] How much experience does someone need, though, to think about these types of situations when choosing a tool? You give really good reasons why, but I don't know if the typical person starting out is thinking about threads or resources that way.

Nicole [00:20:32] Yeah, I don't think you really need to know the reason why; I'm just sharing what I know because I wondered why and was curious. But where it started for me was after a few thousand users, sometimes even 1,000 users, depending on the exact scenario you're running with JMeter, I was seeing my tests reach a bottleneck, and not even a bottleneck related to the application; it was just with JMeter. I don't think you necessarily have to know why that is, or the differences between the languages, but you have to do some tests to know that it's there.

Joe [00:21:13] Absolutely. So let's switch gears a little bit. I think you started a new gig at k6 maybe a few months ago now. I know time flies for me as I get older, so it could be half a year or even a year. But as you were making that transition, you were blogging about what you learned at k6, like every week. Can you talk a little bit about that process? I think it's actually helpful for people going through a similar type of transition.

Nicole [00:21:37] Sure, it was actually about three months ago now, and it seems crazy to think of it, but it was actually every day that I was blogging about it. At k6, when you join the team, everybody, not just testers and developers, even if you're a manager, does what's called a week of testing. It's meant to be completely internal, a week where nobody actually helps you or shows you the ropes, because they're trying to recreate the experience of a brand-new user. I loved that idea; I think more companies should do it. The whole point is that at the end, you're supposed to tell everybody what you thought and what you had issues with. I thought that, one, it would be a good way to impress my new employer, and two, it could be a good way to get some content out there about how to use k6 for someone who is completely new. I mean, I maybe wasn't starting completely from scratch, because knowledge of other load testing tools transfers to a certain degree. But I thought it was really helpful to expose all of the issues I encountered, and I did encounter some. To k6's credit, they let me publish it without censoring me. There were things I pointed out that I didn't like or couldn't figure out how to do, but in general, I was actually pretty surprised at how much I was able to do.

Joe [00:23:14] Yeah, I love it. I think you actually said they wanted you to be honest. So do they take the information from all these new employees and use it to help improve their application? I think it's a great idea if that's what they do.

Nicole [00:23:23] Yeah, they do. And it kind of depends on the person. We had a developer who wanted to focus just on the Windows user experience, which was great because we don't have a lot of Windows users. And then you have someone like me, who was already coming in with a lot of biases from other load testing tools, and they loved that too. They wanted me to compare it to JMeter in particular, because that's probably the one I'm most familiar with. It's just a very open culture; they're very open to listening, taking notes, and incorporating feedback. A few of my suggestions have already been worked on.

Joe [00:24:04] Awesome. So we've talked about all the different tools. But from your experience using multiple tools, are there any first principles that you think every performance engineer should know or have in their toolkit, regardless of what they're using?

Nicole [00:24:18] Yeah, we're talking a lot about tools right now, but actually, I think the most important part is not even about the tool. I think not just you, but your team and all the stakeholders involved should be absolutely clear on why you're testing, because you might think your purpose is to make the application faster, but maybe the business thinks it's to prevent people from going to social media and complaining about them. And yes, one could lead to the other; slowness has typically led to very bad publicity on social media. But it's a subtle shift, because then it's not just about making it faster. It's not just about response times; it's maybe also about front-end performance. And even after that, the workload modeling is so important, because even if you can execute exactly what you intend in the load testing tool you're using, there's still a chance that you get the model wrong. Maybe there are irrational behaviors that real users exhibit that you didn't account for, and those will cause the application to fail.
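As a small illustration of what workload modeling looks like in a tool, here is a hedged k6 sketch using staged ramps. Every number is made up for illustration; in practice the stages should be derived from real production traffic data:

```javascript
// Staged workload model in k6: ramp up, steady state, spike, ramp down.
// The durations and targets are illustrative only.
export const options = {
  stages: [
    { duration: '5m',  target: 100 }, // ramp up to 100 virtual users
    { duration: '10m', target: 100 }, // steady state
    { duration: '2m',  target: 300 }, // spike, e.g. a promotion going live
    { duration: '5m',  target: 0 },   // ramp down
  ],
};

export default function () {
  // Requests modeling the real business process would go here.
}
```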

Joe [00:25:31] Okay, Nicole, before we go, is there one piece of actionable advice you can give to someone to help them with their performance testing efforts? And what's the best way to find or contact you?

Nicole [00:25:39] Sure. I think this is a bit of an unusual piece of advice, but I would say take notes. Test your test tools, and take notes on what you find, because you will forget, and you will have biases later that you'll need to keep in check. Part of being a load tester, one of the best parts, is being able to continually try new tools and come up with your own requirements as a user for once. That's the only way you'll know that you're choosing the right tool for the job and that you're improving. Oh, and the best way to contact me is probably Twitter. I guess you'll put a link so I don't need to spell it out, but I'm n_vanderhoeven. I also just started a weekly live stream of k6 Office Hours on the k6 channel on YouTube with my coworker Simme, whom you've also had on the show. I like questions. I'm new to live streaming, but we can learn together, and I'll answer any questions if I can.

 

Rate and Review TestGuild Performance Podcast

Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
A person is speaking into a microphone on the "TestGuild News Show" with topics including weekly DevOps, automation, performance, and security testing. "Breaking News" is highlighted at the bottom.

99 Testing Resources, Automation Certifications, DORA Report and More TGNS139

Posted on 10/28/2024

About This Episode: What free certification can make you stand out in a ...

Attachment Details Paul Grossman Halloween TestGuild Automation Feature

The Headless Tester (Halloween Special 2024) with Paul Grossman

Posted on 10/27/2024

About This Episode: Welcome to a special Halloween edition of the TestGuild Automation ...

Naveen Krishnan TestGuild DevOps Toolchain

Exploring AI and Cloud with Microsoft’s Naveen Krishnan

Posted on 10/23/2024

About this DevOps Toolchain Episode: Today, we have an exciting episode for you. ...