Performance Testing Using Locust.io with Rahul Solanki

By Test Guild

About this Episode:

If you're into Python and performance testing, you won’t want to miss this episode. Rahul Solanki, a technical leader at BlueConch Technologies, will share his experience using the performance testing tool Locust.io. Discover the benefits of using Locust and a proven approach and framework to help get you started. Listen up!

TestGuild Performance Exclusive Sponsor

SmartBear is dedicated to helping you release great software, faster, so they made two great tools. Automate your UI performance testing with LoadNinja and ensure your API performance with LoadUI Pro. Try them both today.

About Rahul Solanki


Rahul has more than 10 years of experience in performance testing and engineering and is currently a Technical Leader – QA Automation at BlueConch Technologies. His roles and responsibilities include providing end-to-end performance engineering solutions covering performance testing, monitoring, and bottleneck analysis. Rahul has tested various enterprise solutions and has done performance engineering in domains including healthcare, retail, insurance, and gaming. He has used various performance testing tools, including Locust.io, JMeter, and LoadRunner.

Connect with Rahul Solanki

Full Transcript Rahul Solanki

Joe [00:01:52] Hey Rahul! Welcome to the Guild.

Rahul [00:01:55] Hello, Joe. Thank you so much for having me here.

Joe [00:01:58] Great to have you.

Rahul [00:01:58] It's great. Yeah.

Joe [00:01:59] Really excited to have you on, especially about this topic. I was really excited when someone from your team contacted me about talking more about Locust.io. Before we get into it, Rahul, is there anything I missed in your bio that you want the Guild to know more about?

Rahul [00:02:11] I think you covered pretty much everything. I'm working for BlueConch Technologies, and at BlueConch Technologies we help vendors productize their ideas, build commercial off-the-shelf software products, and transform legacy products. So that's a quick introduction to the company.

Joe [00:02:26] Perfect. So I assume, working at your company, you need to have experience with other performance tools like JMeter and LoadRunner. So how did you come upon, or stumble upon, Locust.io? And maybe we should dive into what Locust.io is in general.

Rahul [00:02:39] So yeah, thank you so much for the question, Joe. JMeter was the open-source solution that we had been using so far, since most of our clients prefer open-source solutions. JMeter was good for us until a certain point, when we had a requirement to execute a crazy amount of load, around 2,500 requests per second. We used JMeter for that test, but we had a few challenges: we had to set up an infrastructure with a number of load generator machines in a master-slave configuration, so a number of slaves were set up for that particular test. We tried to identify whether we could have avoided that situation. We had to, because the test results we were getting out of that particular configuration were not something we could rely on. The results were inconsistent and we were getting a number of timeout issues. When we observed the server side at that time, we saw that the requests coming into the server were really low compared to what we were sending, and that's how we identified that the client was facing a bottleneck. We also looked at the resources on the client side to identify what was going wrong, and we saw that CPU and memory utilization were really high, which was causing the issues on the client side. So we Googled a few options to identify other tools we could use to test this better, and that's when we stumbled upon Locust.io. Locust.io is essentially a Python-based tool that executes on gevent coroutines, unlike JMeter, which is based on a thread-based architecture. Since JMeter is thread-based, it ends up occupying a lot of resources. When you compare it with JMeter, the number of resources that Locust occupies is around 70 percent less. So we were really excited to understand what the capabilities of Locust were, and that's how we started exploring it. We configured a few of our tests using Locust and observed that it was really easy to execute our distributed tests even with a low number of slaves. A similar kind of master-slave setup is required in Locust, but the resource utilization on each slave is comparatively very low when you compare it with other open-source tools like JMeter. That's why we moved forward with Locust. Since it's a Python-based application, and Python is a very strong scripting language, we saw a number of advantages scripting-wise, and we were able to easily integrate it with version control tools like Git. That's how we went ahead with Locust and created a wrapper over the existing Locust framework to satisfy our requirements.
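
For readers who want to see what Rahul is describing, here is a minimal sketch of a Locust test file using Locust's standard HttpUser API. The /api/items endpoint and the target host are hypothetical placeholders for whatever system you are testing.

```python
# locustfile.py - minimal Locust test (hypothetical endpoint)
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    # each simulated user waits 1-3 seconds between tasks
    wait_time = between(1, 3)

    @task
    def get_items(self):
        # hypothetical endpoint; replace with your own API path
        self.client.get("/api/items")
```

You would run it with something like `locust -f locustfile.py --host https://your-app.example.com`, then open the Locust web UI to set the user count and spawn rate.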

Joe [00:05:24] Nice. So I didn't know it was actually based on a different architecture. I just thought the benefit of it was Python. So it sounds like JMeter has been around for a while, but you're saying at scale, because it uses a thread model, it's harder to scale up to a bigger performance test?

Rahul [00:05:38] Locust basically has a completely different user simulation model, which is based on events and an asynchronous approach using gevent coroutines. So what is a gevent coroutine? gevent is a Python-based networking library, and it uses greenlets, which are very lightweight coroutines, and that enables Locust to execute the performance test using very limited CPU and memory resources. So that's the reason why the two tools have completely different architectures, and that's why it is very easy to execute a high number of users with Locust.
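
As a rough illustration of why greenlets are so much cheaper than OS threads, here is a small standalone gevent sketch (not Locust code itself, just the underlying concurrency model it builds on): thousands of coroutines can "sleep" concurrently inside a single process without the per-thread overhead a thread pool would carry.

```python
import gevent

def fake_user(user_id):
    # gevent.sleep yields control instead of blocking an OS thread,
    # so thousands of these can run concurrently in one process
    gevent.sleep(1)
    return user_id

# spawn 10,000 lightweight greenlets; memory overhead stays small
jobs = [gevent.spawn(fake_user, i) for i in range(10_000)]
gevent.joinall(jobs)
print(f"finished {len(jobs)} simulated users")
```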

Joe [00:06:16] Nice. So another question I have is, with JMeter you probably don't need to know a lot of Java to use the tool. But for Locust, do you need to know a lot of Python in order to use the tool itself for performance testing?

Rahul [00:06:28] Okay, so it depends upon the kind of complexity that you have in your tests. Basically, if you only need to script a test and exercise a particular API or a particular HTTP scenario, it would be very quick and easy. Scripting-wise, if you compare it with JMeter, since JMeter has a UI, you can directly record and go ahead with it. But in Locust, you need some basic knowledge of Python. You don't need to be an expert, but you need some basic Python knowledge to go ahead with scripting, and it can be done quickly as well. Based on your expertise in Python, you can implement whatever scenarios you need.
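
As a slightly more realistic sketch of the kind of scripting Rahul mentions, here is a hypothetical login-then-browse scenario. The endpoints, credentials, and response shape are made-up assumptions for illustration; the on_start hook and the self.client session are standard Locust features.

```python
from locust import HttpUser, task, between

class ShopUser(HttpUser):
    wait_time = between(1, 5)

    def on_start(self):
        # runs once per simulated user; hypothetical login endpoint and payload
        self.client.post("/api/login", json={"username": "demo", "password": "demo"})

    @task
    def browse_products(self):
        resp = self.client.get("/api/products")
        if resp.ok:
            products = resp.json()
            if products:
                # group all product-detail requests under one name in the stats
                self.client.get(f"/api/products/{products[0]['id']}",
                                name="/api/products/[id]")
```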

Joe [00:07:03] Nice. I assume you could do a lot with that because it is Python. You have the full power of Python and all the different libraries you could pull in. There are probably things you could do with it programmatically that you may not be able to do in JMeter.

Rahul [00:07:13] Yes. Basically, JMeter has wide coverage. It has been around for years, it has a wide number of plugins, and it does almost everything. Locust, being a newer tool in the market, has certain limitations. But if you want to reproduce things you had in JMeter, like the customizable ramp-up and ramp-down model you can implement in JMeter using different plugins such as the different types of thread groups, you can do that in Locust too. Again, you need some expertise in Python, and you need to pull in the Locust libraries, modify them, and go ahead with it. So that is possible. But the thing Locust does better is save your resources. That's the major advantage you get with Locust.
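
For the customizable ramp-up and ramp-down Rahul refers to, recent Locust versions expose a LoadTestShape class whose tick() method returns the desired user count and spawn rate over time. The numbers below are arbitrary example values, not anything from the episode.

```python
from locust import LoadTestShape

class RampUpHoldRampDown(LoadTestShape):
    """Example shape: ramp to 500 users over 5 min, hold 10 min, ramp down 5 min."""
    ramp_time = 300      # seconds
    hold_time = 600
    peak_users = 500
    spawn_rate = 10

    def tick(self):
        t = self.get_run_time()
        if t < self.ramp_time:                        # ramp up
            return max(1, int(self.peak_users * t / self.ramp_time)), self.spawn_rate
        if t < self.ramp_time + self.hold_time:       # hold at peak
            return self.peak_users, self.spawn_rate
        if t < 2 * self.ramp_time + self.hold_time:   # ramp down
            remaining = 2 * self.ramp_time + self.hold_time - t
            return max(1, int(self.peak_users * remaining / self.ramp_time)), self.spawn_rate
        return None  # returning None stops the test
```

Dropping a shape class like this into your locustfile overrides the user count and spawn rate you would otherwise pass on the command line or in the web UI.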

Joe [00:08:00] Obviously saving resources on your node machine. So I guess it saves you money as well, is that what you're saying?

Rahul [00:08:05] Yes. Both JMeter and Locust are open source, but to set up a test involving, let's say, thousands of users, or a scenario where you want to execute a thousand requests per second, you might need a test rig of, I'm just giving a random number, 10 different slave nodes for JMeter. If you want to execute the same test using Locust, you would probably be good with three or four nodes. So that saves your infrastructure cost.

Joe [00:08:35] Nice. Now, you answered this earlier when you said a lot of your clients prefer open source. Being older, I have experience with LoadRunner. I love LoadRunner. It's an enterprise, full-stack solution, has everything built-in. It's beautiful. So are you using Locust and JMeter just because of the licensing of vendor tools like LoadRunner and similar commercial tools? Is it what the clients are asking for? Is that the gist of it?

Rahul [00:08:57] The basic reason is that it is open source, and the other is the customizability that you get with an open-source tool. You get to play around with the libraries and implement whatever custom code you want to build. You can write wrappers on top of it so that we get exactly the kind of analytical data we need. LoadRunner gives you a lot of data, but sometimes stakeholders request a very specific kind of data, and if you want to pull in that data and create a customized dashboard for it, that is where open-source tools come into play. So that is also one of the reasons why we would select it. And yes, it definitely saves cost, so people prefer that.

Joe [00:09:39] So I love having options of many different tools. For this particular tool, what was your approach to learning it, and to exploring what it did, and then seeing if there were any gaps or if it was going to work for your situation? Can you tell me a little bit about your approach to trying out new tools, or just analyzing different tools and seeing which would probably be the best?

Rahul [00:09:57] So basically, we analyzed this tool along with JMeter and LoadRunner. LoadRunner, due to its licensing issues, was not considered, so the other open-source tool available was JMeter, and with JMeter we were good to go. It was satisfying most of our requirements. But when you want to execute a specific test with a very high user load, it was not meeting our requirement. That is where we started to look into other options. When I Googled around, I saw that JMeter is mostly compared with Locust these days. That increased my curiosity about Locust, and I tried to identify what advantages Locust brings. Basically, the advantage is that you write it in Python, Python being a very strong scripting language, and we are able to customize the user behavior. Whatever user behavior we want to implement, we can simply write code for it. I mean, definitely you need some kind of Python expertise for that, but that's how you can go ahead with it.

Joe [00:11:00] So I love the programmatic piece of it, because I think it lets you shift left earlier and get developers more involved.

Rahul [00:11:06] Yes.

Joe [00:11:06] Did you have a chance to look at any other tools like Gatling or k6? Because they use a kind of similar approach. I don't know if you had time to compare it to those. I think they're a little more similar to Locust than to JMeter, if I'm correct.

Rahul [00:11:19] I did look at k6, which is based on JavaScript, but just on the fact that Python is involved in Locust, with the advantages of object-oriented programming and everything else Python brings, we went ahead with Locust.

Joe [00:11:37] You also mentioned JMeter has plugins for everything. It has been around forever and ever, and that's a good thing and a bad thing, because it can be overly complex to install and configure JMeter. It's like, what the heck is this? The other thing with Locust, I would think, is that you're probably missing some functionality, which you alluded to. So as you were learning, was there any key missing functionality that you had to create in order to really benefit from Locust?

Rahul [00:12:00] So in JMeter, basically, there is a built-in plugin, a listener, that enables you to write data into a database. You can directly go and dump data into any kind of database. That particular thing was missing in Locust, because Locust, as it is, does not provide any way to dump data into a database for future access, for comparison, for baselining. That is where I found a need, which we then addressed by creating a wrapper over it and building that capability.
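
The wrapper Rahul describes isn't public, but the general idea can be sketched with Locust's event hooks. Locust 2.x exposes an events.request hook that fires for every sampled request (older versions used separate request_success/request_failure events); the SQLite table here is just an illustrative stand-in for whatever database you would actually write to.

```python
import sqlite3
import time

from locust import HttpUser, task, events

# illustrative results store; any database client could be used instead
conn = sqlite3.connect("results.db", check_same_thread=False)
conn.execute(
    "CREATE TABLE IF NOT EXISTS requests ("
    "ts REAL, request_type TEXT, name TEXT, "
    "response_time REAL, response_length INTEGER, success INTEGER)"
)

@events.request.add_listener
def save_request(request_type, name, response_time, response_length, exception, **kwargs):
    # persist every sampled request so later runs can be compared and baselined
    conn.execute(
        "INSERT INTO requests VALUES (?, ?, ?, ?, ?, ?)",
        (time.time(), request_type, name, response_time,
         response_length or 0, 0 if exception else 1),
    )
    conn.commit()

class DemoUser(HttpUser):
    @task
    def ping(self):
        self.client.get("/")  # hypothetical endpoint
```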

Joe [00:12:33] I know a common thing with Selenium is that people struggle because Selenium is just an API for functional automation, so people build a lot of wrappers on top of it that use Selenium under the covers but add all that other stuff to make it easier to use. Did your company, or you yourself, create a solution or framework on top of Locust that enhances it for performance engineers who maybe don't want to develop their own stuff out of the box like that?

Rahul [00:12:56] Yes. As part of the performance engineering vertical at BCT, we have developed a framework. We call it the greatest (??) framework. It basically provides a dashboard, and on that dashboard you have all the analytical data: response times, requests per second, transactions per second, the error rate, and all the baseline data, which is compared with the last test result. The underlying tool can be selected along with that; any open-source tool can be swapped in, either JMeter or Locust. Now we give clients the option of whether they want to go ahead with JMeter or with Locust. JMeter is generally preferred by a client who wants the test developed really quickly, who wants quick test results, and who just has a point-in-time requirement; once that's done, that's done. But when there is a requirement for a continuous development cycle, like if you are going for a shift-left approach where tests are executed frequently, we generally recommend going with Locust. Why? Because initially it might take some time to develop those scripts, and you might need some Python expertise to understand how they need to be developed. But consider the maintenance phase: as development goes ahead, the responses and the APIs keep changing very frequently, and the test data keeps changing very frequently. In that case, you need to modify your scripts. When you have to modify a JMeter script, you need to make a number of different changes in the UI, and it has that underlying XML script where you need to go and make those modifications. Instead of that, if you have it in code, you can make point changes. And especially if you're working in a team, different team members can push those changes into version control. It becomes very easy when you consider it in a shift-left approach.
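
The BCT dashboard itself isn't something we can show, but the baseline-comparison idea it rests on is easy to sketch. Assume a previous run's per-endpoint averages were saved to a JSON file and the current run was executed with `--csv results`; the file names and the 10 percent regression threshold below are hypothetical, and the CSV column names can vary slightly between Locust versions.

```python
import csv
import json

BASELINE_FILE = "baseline_averages.json"   # hypothetical: {"/api/items": 120.0, ...}
CURRENT_STATS = "results_stats.csv"        # produced by: locust ... --csv results

def load_current_averages(path):
    averages = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # column names may differ slightly between Locust versions
            averages[row["Name"]] = float(row["Average Response Time"])
    return averages

def find_regressions(threshold_pct=10.0):
    with open(BASELINE_FILE) as f:
        baseline = json.load(f)
    current = load_current_averages(CURRENT_STATS)
    regressions = []
    for name, base_ms in baseline.items():
        cur_ms = current.get(name)
        if cur_ms is not None and cur_ms > base_ms * (1 + threshold_pct / 100):
            regressions.append((name, base_ms, cur_ms))
    return regressions

if __name__ == "__main__":
    for name, base_ms, cur_ms in find_regressions():
        print(f"Regression in {name}: {base_ms:.0f} ms -> {cur_ms:.0f} ms")
```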

Joe [00:14:40] Nice. So I guess when you're working with clients, do you create the initial framework or tests and then they take it over? Or do they hire you and you do the performance testing for all the different iterations they have to do?

Rahul [00:14:51] Yeah, we do the performance testing ourselves, or we provide this framework, which enables them to execute the tests by themselves as well.

Joe [00:15:00] Nice. So it sounds like so far, in the plus column for Locust: it saves you money if you have a big load test with a bunch of different generator nodes up and running, and it also sounds like it's easier to maintain if you're making a lot of changes. Are there any other ROIs you found from working with Locust on your different projects, where using this tool was a big benefit?

Rahul [00:15:23] So, yeah, when you consider it from the maintenance perspective, I found it really easy, because modifying a piece of code is really easy and it gets reflected everywhere you're using it. So I would definitely prefer that over a UI-based tool where you need to go and make modifications at every point. That is a key advantage I have found here, along with the resource utilization, of course.

Joe [00:15:46] I guess, because it's Python-based, does it easily integrate with a team's CI/CD, Jenkins-type environment?

Rahul [00:15:51] Yes. It is very easy to integrate it with Jenkins, get it dockerized, and execute the tests in containers, so that integration is also provided. There are different plugins available for that, and we can write wrappers over it to make this functionality work.
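
As a sketch of what a CI integration might look like, here is a hypothetical gate script that a Jenkins (or any CI) job could run: it launches Locust headless, writes CSV stats, and fails the build when the error rate or average response time exceeds a threshold. The target host, user counts, thresholds, and the "Aggregated" totals row name are assumptions that may need adjusting for your environment and Locust version.

```python
import csv
import subprocess
import sys

CMD = [
    "locust", "-f", "locustfile.py",
    "--headless", "-u", "100", "-r", "10",      # 100 users, spawn 10/s
    "--run-time", "5m",
    "--host", "https://staging.example.com",    # hypothetical target
    "--csv", "ci_results",                      # writes ci_results_stats.csv etc.
]

MAX_ERROR_RATE = 0.01   # 1%
MAX_AVG_MS = 800

def main():
    subprocess.run(CMD, check=False)
    with open("ci_results_stats.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["Name"] == "Aggregated":     # totals row in recent Locust versions
                requests = int(row["Request Count"])
                failures = int(row["Failure Count"])
                avg_ms = float(row["Average Response Time"])
                error_rate = failures / max(requests, 1)
                if error_rate > MAX_ERROR_RATE or avg_ms > MAX_AVG_MS:
                    print(f"Performance gate FAILED: "
                          f"error rate {error_rate:.2%}, avg {avg_ms:.0f} ms")
                    sys.exit(1)
    print("Performance gate passed")

if __name__ == "__main__":
    main()
```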

Joe [00:16:08] I guess one of the cons, and maybe I'm wrong, is finding information on how to use Locust. That's less of an issue with JMeter, since it has been around for so long you can probably just Google it. So do you find that an issue or a barrier to entry, because there is not as much information out there as there is for JMeter?

Rahul [00:16:25] That definitely is part of the challenge. However, they have provided a lot of notes in the documentation on their website, and that pretty much covers most of the things we require.

Joe [00:16:35] I guess my next question is, before I recommend any open-source tool, I look at the community. So how active is the community? Have you ever had to put in a help desk ticket to try to get some features added, or is it easy to contribute to? What are your thoughts on how active they are as a community?

Rahul [00:16:49] So the Locust community is pretty active. I mean, I haven't posted any questions of my own, but from the way I follow the forums, I see that the community members are very helpful and they respond pretty quickly.

Joe [00:17:04] So I'll ask two more questions. As an engineer, you've been working for 10 years in performance engineering. One thing I like, and it's kind of silly, is when people actually enjoy the tools they use. So as a performance engineer, do you find it fun to use Locust? And maybe would that engage more people to use it compared to JMeter, which to me seems more clunky?

Rahul [00:17:23] Yeah. So for me, if you ask me, then I prefer to code so it's easier for me to understand the code and make changes to it. So that's why I would rather prefer Locust.

Joe [00:17:36] The framework you have, how can people get their hands on it? Is it open source? Or is it just something that your company provides as part of a service when they hire you to do performance testing?

Rahul [00:17:47] It's something that the company provides when they hire us for performance testing. So if someone wants to get their hands on that, then they can contact us AskMeter.

Joe [00:17:56] And I'll have a link to that in the show notes as well. Okay, Rahul, before we go, is there one piece of actionable advice you can give someone to help them with their performance testing efforts? And what's the best way to find or contact you?

Rahul [00:18:07] Okay, so one thing to focus on: you need to identify how you can execute your load like a real-world scenario, like the actual production workload. I mean, definitely no tool can simulate the exact production workload, but we need to find ways to simulate the spikes that occur in the system, especially in domains like retail, where there are huge spikes at times, especially in today's coronavirus situation, where people are using online platforms like crazy and user load has been increasing exponentially. That is when we as performance engineers need to up our game and try to simulate actual user scenarios.
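
One simple way to approximate a production traffic mix in Locust, offered here only as a hedged illustration, is to weight tasks in proportion to what your analytics show. The endpoints and the rough 70/25/5 split below are made-up numbers standing in for whatever your real workload model says.

```python
from locust import HttpUser, task, between

class ProductionMixUser(HttpUser):
    """Task weights roughly mirroring an assumed production mix:
    ~70% browsing, ~25% search, ~5% checkout."""
    wait_time = between(1, 3)

    @task(14)
    def browse(self):
        self.client.get("/api/products")        # hypothetical endpoint

    @task(5)
    def search(self):
        self.client.get("/api/search", params={"q": "shoes"})

    @task(1)
    def checkout(self):
        self.client.post("/api/checkout", json={"cart_id": "demo"})
```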

Joe [00:18:50] Awesome. And Rahul the best way to find or contact you?

Rahul [00:18:53] It would be LinkedIn or my Instagram. Yeah, LinkedIn.

 

Rate and Review TestGuild Performance Podcast

Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}