If you're new to performance testing, one of the first concepts you'll need to grasp is the concurrent user. It may come up, for instance, when a manager comes to you, the performance tester, and asks how many concurrent users your site or application can handle. It sounds straightforward enough, but many people don't really understand what the term “concurrent user” means from a performance testing perspective.
Concurrent Users are not Simultaneous Users
The best explanation I can offer is that concurrent users are connected to your application and are all requesting work at some regular interval, but not all at once and not for the same thing. Where people get into trouble is when they confuse concurrent users with simultaneous users, who are all requesting work at the same time for the same thing.
Let's Imagine a Performance Testing Scenario
If we revisit the Joe's Gas Station scenario that I've used in previous posts (what is throughput), let's imagine that concurrent users are all the users at the gas station, regardless of what they are doing. That would include the people pumping gas, the travelers using the restrooms, the customers ordering coffee, the people waiting to get their cars serviced, and so on; basically, the whole gas station ecosystem.
Say you had two customers using the restroom, five customers ordering coffee, three customers pumping gas, and two customers getting their cars fixed. That would equal 12 concurrent users. Since all those users are doing different activities, different resources in your environment will be utilized.
Simultaneous users would be only the customers who are pumping gas at a given time. Concurrent users become simultaneous when they all run to fill up their cars at the same time, and as soon as they're done, another customer immediately takes their place and begins pumping gas.
Typically, when you're running a normal load test you want to mimic the realistic activity that is encountered on your application, not just extremely rare conditions like everyone rushing to the gas station to fill up their cars before a blizzard or hurricane. This is important because you typically want to set up your load tests to emulate realistic scenarios, with a given number of users who create a defined number of transactions in a specified time period.
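To make the idea concrete, here's a minimal sketch of concurrent (not simultaneous) users, using Python threads as stand-in virtual users. The activity names and timings are invented for illustration and don't come from any real system or tool: each user works at its own interval, on its own activity, with a pause between requests.

```python
import random
import threading
import time
from collections import Counter

# Hypothetical activities, echoing the gas station analogy.
ACTIVITIES = ["pump_gas", "buy_coffee", "use_restroom", "car_service"]

completed = Counter()
lock = threading.Lock()

def virtual_user(duration_s: float) -> None:
    """One concurrent user: requests work at a regular interval, not all at once."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        activity = random.choice(ACTIVITIES)    # users do different things
        time.sleep(random.uniform(0.01, 0.05))  # simulated "work" (a request)
        with lock:
            completed[activity] += 1
        time.sleep(random.uniform(0.05, 0.15))  # think time between requests

# 12 concurrent users, like the 12 customers at the gas station.
threads = [threading.Thread(target=virtual_user, args=(0.5,)) for _ in range(12)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(dict(completed))
```

At any instant, only some of the 12 users are mid-request; the rest are in their think time. That mix is what "concurrent" means here, as opposed to all 12 firing the same request at once.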
Load Test vs Stress Test
Another way to think about this is to compare a load test user to a stress test user. During a load test you try to emulate real-world conditions; in a stress test, you push the limits of your system to see where it breaks.
It's important that your performance test runs a realistic scenario, with the normal number of users that are on your system doing the normal amount of work. You want to make sure that your concurrent users are creating the number of transactions that are typical in a given time frame.
I've seen many instances where engineers have created false concerns by running performance tests with a few users and no think time, flooding the servers with more transactions in five minutes than you would typically see in a day.
Issues like these can be avoided by understanding what a concurrent user is in the context of a performance test and how the number of transactions they create shapes the results of those tests.
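A back-of-the-envelope calculation shows how much think time changes the load a fixed number of users generates. The response time and think time below are invented round numbers for illustration, not measurements:

```python
# Each virtual user loops: one transaction, then waits out its think time.
users = 10
response_time_s = 1.0   # time one transaction takes to complete
think_time_s = 30.0     # pause a real user takes between transactions

with_think = users / (response_time_s + think_time_s)  # transactions per second
without_think = users / response_time_s                # same users, no think time

print(f"with think time:    {with_think:.2f} tx/s, "
      f"{with_think * 300:.0f} transactions in five minutes")
print(f"without think time: {without_think:.2f} tx/s, "
      f"{without_think * 300:.0f} transactions in five minutes")
```

With these numbers, the same ten users go from roughly 0.32 transactions per second to 10 per second, about a thirtyfold increase, which is how a small no-think-time script can flood a server with an unrealistic load.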
Performance Testing Concepts
Hope this helps you. This is just one of many critical performance testing concepts I'll be covering in future posts. For now, check out my top performance testing podcast episodes, featuring some of the best performance testing experts in the world: