Mobile Performance Testing with Sofía Palamarchuk

By Test Guild

About this Episode:

Want to know how to start testing mobile performance earlier in the dev cycle? In this episode, Sofía Palamarchuk, co-founder and CEO of Apptim, shares her proven mobile performance method. Discover how mobile performance testing differs from web performance testing, the three components of mobile app performance testing, user experience testing, and more. Listen up!

TestGuild Performance Exclusive Sponsor

SmartBear is dedicated to helping you release great software, faster, so they made two great tools. Automate your UI performance testing with LoadNinja and ensure your API performance with LoadUI Pro. Try them both today.

About Sofía Palamarchuk


Sofía is a Director and Board Member of Abstracta and the co-founder and CEO of Apptim. After beginning her career as a performance engineer, in 2015, she led Abstracta’s expansion to the United States, heading up business development and working with companies like Shutterfly, CA Technologies, and Singularity University. After seeing the challenges that mobile development teams face, in 2019, she embarked on a mission to transform the way global mobile teams create quality apps. When she’s not busy leading a startup, you can find her kitesurfing or supporting projects that promote more women in technology.

Connect with Sofía Palamarchuk

Learn more about the free PAC performance events

Since its beginning, the Neotys Performance Advisory Council has aimed to promote engagement among performance experts from around the world and to foster the sharing of relevant, value-added content on the topics on the minds of today's performance testers. Some of the topics addressed during these councils and virtual summits are DevOps, Shift Left/Right, Test Automation, Blockchain, and Artificial Intelligence.

Full Transcript


Joe [00:02:03] Hey Sofía! Welcome to the Guild.

Sofía [00:02:06] Thank you, Joe. I'm pretty excited to be here. Thanks for the invitation.

Joe [00:02:11] Very cool. So Sofía, before we get into it, is there anything in your bio that you want the Guild to know more about?

Sofía [00:02:16] It is pretty good. I do like kitesurfing in the Bay Area, in case we have new listeners who want to join us. I'm also vegan, so I love cooking. And I try to spend as much time as I can with my friends and my family. But, you know, running a startup means that you have to put in a lot of effort during the first years, and that's the stage we're at. But I'm really happy to be here sharing my knowledge on this topic I'm very passionate about, which is mobile performance.

Joe [00:02:44] Awesome. Yeah, I've had almost 400 interviews, and I don't think anyone has mentioned being into kitesurfing. So it's literally getting a kite and you just…is that it? I don't even know what it is.

Sofía [00:02:53] Oh, yeah. It's a great question. I'm happy I'm the first one. Actually, I think this sport became really popular in the last, let's say, four or five years. And basically, yeah, you have a kite that you use to push yourself across the water. So instead of just surfing, where you only need a board, you have both a board and a kite, meaning that you actually go faster and you use the wind to push you. So we go out into the water when it's really windy, and then you have different boards, but it's a very exciting sport. A lot of women are also getting to know it more. And that's something that I also enjoy doing in my free time.

Joe [00:03:34] Very cool. It sounds very scary to me, but it sounds fun for sure. Cool. So Sofía, you know, I like to define things before we dive in. A lot of times people hear terms like load testing and performance testing, but they don't necessarily associate them with mobile performance testing. So how do you explain what mobile performance testing is compared to, say, performance testing on a web app? Are they the same thing?

Sofía [00:03:56] Not really, but yeah, the truth is that when you talk about performance testing, typically everyone thinks about load testing. You can sometimes use the same words to talk about load testing, but actually it's more than that. When we talk about performance testing in general, we can talk about the performance of our back ends or of our client apps. And then you can dig deeper into whether your app is a web app or a mobile app, what the differences between those types of systems and the infrastructure underneath are, and what types of performance tests you can run. So it's more than just one area, but typically and historically, I think load testing has gotten most of the attention, and for good reason, because there were a lot of performance issues in that part of our systems. But in the last years, from my own experience as a performance engineer, a practitioner, and a computer engineer, I actually moved towards the front end of apps, and specifically mobile, where, you know, a lot of new people are getting onto the Internet using their mobile phones even before desktop. So we saw that there was a huge need there for tools and ways of helping mobile teams test their apps and be more efficient at it. And we started with mobile performance by thinking about the app on the device and testing on the client side. So when we refer to mobile performance, or mobile app performance as people call it when they search for it, we're thinking about how to improve the user experience and the client-side performance. So it's a bit different from what most of us were used to, which is load testing and stressing our back ends, or web testing, which is about how to deliver your content faster over the Internet through a typical client-server architecture. Mobile has a lot more challenges, and I think that's why it's pretty interesting and requires specific tools and methodologies for testing. We can get deeper into that later, but basically, there's a difference between the client and server side, and if your client side is a native mobile app and not a web app, there are things you have to take into consideration when testing performance.

Joe [00:06:18] Great. So we'll definitely dive in. You just mentioned something about native apps compared to a browser app. So when we're talking about mobile performance testing, are we just talking about native apps, like for iPhone or Android? Or are we also talking about web apps that scale when they're in a browser on a mobile device?

Sofía [00:06:34] Both. When we talk about mobile, it's any content that you deliver to mobile. You can have web apps, where users access a website that's mobile-friendly, hopefully. Then you have hybrid apps, which are apps that contain web views inside the native app, so they're kind of a mix of both worlds. And then you have the pure native apps for Android and iOS, which are the platforms in the market today. For all of them, there are different things you can test when it comes to how they perform on mobile. In my experience, and lately in my own startup, we've been focusing first on native apps and testing their performance, because we saw there was a huge need for testers to access information about how their native apps are behaving. Typically this is information that is available in profiling tools or in other parts of the performance cycle that not every tester has access to or knowledge of, so trying to democratize that information is what we're doing right now. And the truth is that there really are a lot of tools available to know whether your website is mobile-friendly or what optimizations you can do to deliver content faster to your mobile users, so we didn't see that as a huge issue today. When I give talks about this, I actually recommend tools built by Google and other companies that already have experience in web performance testing, and I'll just point you there. But for native apps, there are more challenges that come into play, and that's where we've put our focus right now.

Joe [00:08:15] Nice. So you did mention that a lot of times you're focusing more on the user's front-end experience, not necessarily a traditional load on the system. So given the lack of tooling, or that people may not be familiar with the tooling, what kind of tooling should be in place when you're doing mobile performance testing?

Sofía [00:08:30] Yeah, that's a great question. And typically, even before getting to the tooling, the main question I think we as testers should ask ourselves is: what are we trying to find, and what are we trying to solve here? Because again, we have a lot of companies that come to us asking whether they can simulate hundreds of users for their mobile app with Apptim, which is one of our first tools. And truly, I go back to them and say, you know, there are so many tools in the market today for load testing, which is what you need, and you can simulate traffic as if it were coming from your mobile app. You don't really need a specific tool for that today. If what you're trying to do is simulate a real-world scenario where you have hundreds or thousands of users accessing your app, you need a load test, and there are specific tools for that. Open source, you know, everyone knows about JMeter and Gatling, and many others are out there. So if that's your need, then you should use those tools. You can actually complement them with front-end performance testing tools. And answering your question about what's available today: let's say I'm testing my app and its flow. That's what I see as a tester. I'm trying to load something or move from one screen to the other, and the experience is really slow. The next question is, okay, I need to find out where and why it's getting slow. And for that, we need to understand that there are three main factors that will affect the user's experience. One is, of course, the server response times: the back end, the database, every service that we're using, and how long those requests take to get back to the actual app. Then there's the network: again, thinking about mobile, we're usually using our phones outside, connecting to LTE, 3G, or Wi-Fi, and that, of course, also affects the user experience and how fast certain data can be loaded. And then we also have the front end, which is the app itself, and how it uses the resources of the device where it's installed. If you think about a smartphone, it is a computer, but with fewer resources available than any PC today. So we can think about these three main factors first and say, okay, if I'm having a slow experience while I'm using the app on my phone, then I need to start using tools to understand where the problem might be and what's taking more time. And that's where I think testing not only the back end but also the front end can help us. As for the tools available, as I mentioned before, there are profiling tools that help you debug how your app is using the resources of the device where it's installed, and by resources I mean CPU usage, memory usage on the device, power usage, everything you can think of that is actually shared with other apps. So we have to be conscious about that. We're not going to release an app that consumes a lot of CPU, because we're probably going to drain the battery on our users' phones and they're going to be really mad. These are tools that developers typically use. For Android and iOS you have native profiling tools: the profilers provided by Android Studio in the case of Android apps, and in the case of iOS, inside Xcode, which is the IDE used to develop iOS apps, you have Instruments, a module provided by Apple to access this information. So this is typically what you will use.
And of course, you can complement this with proxies when you want to capture traffic and understand how that factor impacts the user experience. It's a lot of investigation and research into what could be causing the problem, but typically you will be both profiling your app and looking at the network, to at least understand the main root cause of where this slowness might be coming from, and then you go deeper into that specific area.
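To make the client-side profiling part of this concrete, here is a minimal sketch of how a tester might sample device resource usage over adb during a manual test session, in the spirit of the data described above. This is not Apptim's tooling; it assumes adb is on the PATH with one device connected, and the package name and sampling interval are hypothetical placeholders.

```python
# Minimal sketch: poll memory and CPU stats while a tester exercises the app
# manually. The package name below is a hypothetical placeholder.
import subprocess
import time

PACKAGE = "com.example.myapp"  # hypothetical package under test

def sample_meminfo(package: str) -> str:
    """One snapshot of the app's memory usage from dumpsys."""
    return subprocess.run(
        ["adb", "shell", "dumpsys", "meminfo", package],
        capture_output=True, text=True, check=True,
    ).stdout

def sample_cpuinfo() -> str:
    """One device-wide CPU usage snapshot from dumpsys."""
    return subprocess.run(
        ["adb", "shell", "dumpsys", "cpuinfo"],
        capture_output=True, text=True, check=True,
    ).stdout

if __name__ == "__main__":
    # Sample every 5 seconds for about 2 minutes while the manual test runs.
    for _ in range(24):
        stamp = time.strftime("%H:%M:%S")
        with open("resource_samples.log", "a") as log:
            log.write(f"--- {stamp} ---\n")
            log.write(sample_meminfo(PACKAGE))
            log.write(sample_cpuinfo())
        time.sleep(5)
```

Battery and rendering counters can be collected the same way (dumpsys batterystats, dumpsys gfxinfo), while Android Studio's profilers and Xcode Instruments expose the same kind of data with much richer UIs.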

Joe [00:12:43] So there's a lot there. I think what makes it even more complicated to me is the environment, like you said, the network. How do you emulate a network? How do you emulate a certain OS on a certain device if you don't have a lab? What do you need to have in place, environment-wise, in order to set up a proper mobile performance testing environment?

Sofía [00:13:03] Exactly, yeah. That gets at something that is tricky. And I will say that today, in general, mobile testing as a practice, even before we think about performance, has evolved a lot in the past years. All the automation frameworks are available, and it's actually getting better, I think, with the latest releases of Appium, which is the open-source framework for automating tests on mobile. And with that evolution came the fact that today you can rent devices from a cloud provider and access them remotely. These are real devices located somewhere in the world, and you can, in a way, configure the network you want that device to be connected to. And usually I will suggest, depending on your business needs and the specific scenario you want to test, using network conditions as slow as those your real users might face. You have ways to simulate different networks for the devices you're connecting to by controlling the bandwidth and the latency of those networks, which are the two factors of the network that, when they change, might affect how fast the data is being transferred. So just as you simulate virtual users when you're doing a load test, you can simulate the network on real devices, both by connecting to cloud services that provide you with the devices and the ability to simulate the network, and also locally: if you have your own phone, you can plug it into your computer with a proxy and control and change these factors of the network, so you can simulate being connected to 3G or 4G or Wi-Fi and see how that also affects the performance of your app.
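As one simple local approximation of this, the sketch below starts an Android emulator under a throttled network profile using the emulator's standard -netspeed and -netdelay options. The AVD name is a hypothetical placeholder; cloud device farms and proxies offer comparable bandwidth and latency controls for real devices.

```python
# Minimal sketch: launch an Android emulator with a slow network profile so
# client-side tests run closer to 3G conditions than office Wi-Fi.
# Assumes the Android SDK "emulator" binary is on PATH; the AVD name is a
# hypothetical placeholder for one configured on this machine.
import subprocess

AVD_NAME = "Pixel_4_API_30"  # hypothetical AVD name

# "edge"/"gsm" throttle bandwidth and add latency; use "full"/"none" for an
# unconstrained baseline run to compare against. This call blocks while the
# emulator is running.
subprocess.run([
    "emulator",
    "-avd", AVD_NAME,
    "-netspeed", "edge",   # EDGE-like bandwidth cap
    "-netdelay", "gsm",    # GSM-like round-trip latency
])
```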

Joe [00:14:50] Have you ever seen anyone combine crowd testing with performance testing, so you can monitor what real users are doing on their devices, kind of trying to emulate what would happen once the app is officially released?

Sofía [00:15:02] It's funny that you ask, because we are currently working on a partnership with a crowd testing platform so they can use our solution. And they actually had the same idea. They have thousands of testers in different locations around the world, testing locally with their real networks and devices, but they lack the tooling to collect data automatically about how the app is behaving performance-wise. So this is something we're just now looking into, as a way of thinking about automating the tests, which you can also do. But also, what about an actual tester located in Asia, or another tester in the Americas? They're already testing on their local network, and they might not be very savvy with profiling tools, but they are testing a lot. Let's capture that data and make it part of the tests they're doing today, which might be functional tests, and have that view of what the user experience is and how the app is behaving on the devices they're testing.

Joe [00:16:05] So going back to what you said, there are key components to mobile app performance: you mentioned the back end, the network, and the client. If someone has a web app as well as a mobile app, when they're doing a traditional performance test, should they also incorporate mobile? Can mobile have an impact on the back end, or can web have an impact on the back end, that you may not see if you're just testing a mobile device in isolation for performance issues?

Sofía [00:16:29] Yeah, exactly. I would actually recommend that for web as well. The same applies if you're testing a web app: while you're running your load test, you can be running Selenium scripts, or you can do it manually, but you have something on the client side exercising the browser, and you can measure what the response times are for the user during the load test. And there are things that will impact both. I usually say this is kind of advanced. First, start with your load test; focus on the server and what you can optimize there. Then you can look at the front end, and then you can run both in parallel to see the end-to-end experience. It's the same for a web application and mobile: you might want to run a load test simulating mobile users and, at the same time, have a way to run a test on the front end, capture everything that's happening there, and compare results as well. There are some things that can be correlated and other things that are completely separate. But I think there is a use case there as well for an end-to-end performance test covering both your back end and your front end.
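For the web version of this combined test, a minimal sketch of the client-side half could look like the following: a Selenium script timing one user journey while a separate JMeter or Gatling scenario puts load on the back end. The URL and element locator are hypothetical; the point is simply to time the same transaction with and without load and compare.

```python
# Minimal sketch: measure client-side response time for one journey while a
# separate load test stresses the back end. URL and locator are hypothetical.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    start = time.perf_counter()
    driver.get("https://example.com/login")  # hypothetical app under test
    WebDriverWait(driver, 30).until(
        EC.visibility_of_element_located((By.ID, "dashboard"))  # hypothetical element
    )
    elapsed = time.perf_counter() - start
    print(f"Login-to-dashboard took {elapsed:.2f}s while the load test ran")
finally:
    driver.quit()
```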

Joe [00:17:40] So Sofía, who normally does the performance testing on mobile devices? It sounds like if it's native, the tooling is already in the developer's IDE for things like CPU and memory consumption. Are developers responsible, or do you still need a tester or a performance engineer involved in that case?

Sofía [00:17:56] Yeah. In our experience, this is mostly something that performance engineers are getting into more and more. You have companies that have a centralized team of performance engineers, and they might even have a specific team focused on front-end performance, which is the one that tests the websites. They are getting into mobile, and they are also using the profiling tools available today. Those are the companies we see that have an actual structure for performance testing. Then there are the developers; of course, it really depends on how much time a developer has and how important it is for them, and whether they have an issue they need to solve. But typically, the most successful teams we've worked with have a dedicated person, usually from a performance engineering team, or a test engineer who is getting into performance, and those are the ones who start doing these tests. What we are seeing is that there is a technical barrier for any tester, even manual testers who are testing their apps on their phones, to access this data. If we rely only on the profiling tools, that involves installing the IDE, having access to the code of the app, and many things that testers don't always have. And that's where we found we could provide a way, without the need to access the source code of the app or install heavy tools or analyze a lot of data, to show this information in an easy way to the tester, and to the developer as well, because when you are reporting a bug or something from the user experience perspective, you want to give as much information as possible to the developers. So I think that's where we found there was at least one specific need we could solve with our solution, which basically captures everything that's happening during a manual or an automated test on the mobile client side and then generates a report that you can share with anyone. So we are trying to democratize this, so that it's not only those teams with dedicated performance engineers looking into it; basically, our vision is that anyone developing a mobile app should be able to access this data as fast as they can.

Joe [00:20:22] That's great. Segue: that was one of my next questions. What does your company Apptim focus on? To me, I haven't seen it yet, I haven't used it yet. It sounds like it's not just a dashboard, but a place where everyone can go to find out the status of the performance of their applications, I guess data collection or…What does it do, before I make guesses about it?

Sofía [00:20:42] That's a very good guess. Yeah, we have a lot of clients that call it the one place for their client-side performance. These are typically mobile teams where the developers and testers work together. Our solution, in a way, monitors the app while you're running the test, and it works both with manual testing and automated tests. So in our case, we have users who are functional testers, like the crowdsourcing platform we talked about before. They don't have any automation. They don't have any background in performance. They're just running their test cases, and they have a tool that we provide, which is a desktop app they use to monitor and record everything that happens in the test. They also get evidence, not only performance data: they have a video of everything they did, they have logs, we capture crashes. We do more than just performance. But the way it's used by these functional testers is through a very easy and friendly UI, so they don't have to dig deep or deal with a lot of dependencies. We tried to make it super smooth for them. That's the first product we launched. It's free, actually, so we want everyone to use it and just get the word out there that performance is something you can start testing early on. You don't need to be very sophisticated to gather this data, and actually any tester on the team can use it. That's one of the solutions we have. The other one is mostly used by test engineers or QA automation teams that already have a lot of tests automated for functional testing. They're probably running in the cloud in the CI/CD pipeline, and that's where we have a solution they can add to their current tests. So instead of only getting the results from the functional tests, they also get all the insights and performance data that Apptim collects. In a way, we're adding a layer of data to all the regression tests they're running, so they can also see over time whether there's any change in performance that might be due to a change in the new app version that was released. So we have both options today: one is more for local, kind of manual testing, and the other one runs in the cloud with real devices, and you can automate most of the process, get the reports as an output, and then see the trends in the performance of your app. That's something that is very interesting as well: seeing how we might be affecting certain aspects of our app as we add more features. And of course, as with any application or website, if you don't test it, then you don't know until someone complains. So that's exactly what we're trying to do by automating the performance testing and testing on different devices, because that also affects performance, since they have different hardware and the resources are not the same. This is something we're looking into improving as well, kind of continuous performance testing.
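As a generic illustration of what adding a performance layer to an existing automated regression suite can look like (this is not Apptim's actual API), the sketch below uses a pytest fixture to snapshot Android rendering and memory stats over adb before and after each test. The package name is a hypothetical placeholder, and parsing and trending are left to the pipeline.

```python
# Minimal sketch: attach raw performance evidence to every automated test run.
# Assumes pytest, adb on PATH, and a connected device; the package name is a
# hypothetical placeholder. Not Apptim's API, just a generic illustration.
import subprocess

import pytest

PACKAGE = "com.example.myapp"  # hypothetical package under test

def device_snapshot() -> str:
    """Collect raw rendering and memory stats for the app via adb."""
    gfx = subprocess.run(
        ["adb", "shell", "dumpsys", "gfxinfo", PACKAGE],
        capture_output=True, text=True).stdout
    mem = subprocess.run(
        ["adb", "shell", "dumpsys", "meminfo", PACKAGE],
        capture_output=True, text=True).stdout
    return gfx + mem

@pytest.fixture(autouse=True)
def performance_layer(request):
    """Snapshot device stats before and after each test in the suite."""
    before = device_snapshot()
    yield
    after = device_snapshot()
    # Persist both snapshots next to the functional result so performance
    # trends can be compared build over build by the CI pipeline.
    for label, snap in (("before", before), ("after", after)):
        with open(f"perf_{request.node.name}_{label}.txt", "w") as fh:
            fh.write(snap)
```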

Joe [00:23:55] I love this idea. I used to work for a very large enterprise and it was very hard to know what was going on with other teams. And a lot of people were like, I'm just responsible for this piece. But it sounds like this type of solution helps give insight and visibility to everyone about performance. So it could maybe help the culture to be more performance-oriented I guess.

Sofía [00:24:15] That's exactly what we're trying to do. And we have a lot of clients that love that as well. Someone on the team is trying to push for more of this type of testing, and what we realized is that there is a lot of information that is not being captured, both by the manual testing team and by the automation team if any tests are being automated. We provide kind of one platform, with the two different solutions we have today, where you can see everything in a centralized space, so the QA manager or even a developer can access all the data being collected by Apptim in each area of the testing process. And that's where we want to go: enabling shift-left testing, both with automation and even without it. As soon as you have the first version of your app, you are not going to start automating tests, because it's probably going to change a lot, but you might be interested to know the user experience and the performance of your app, and that's where I think we can help early on. Then, if you start getting more tests automated and increasing your coverage, I think we can also give you feedback on how your performance is evolving over time.

Joe [00:25:27] Great. So I first heard about you from a session you did at a Neotys event that was on the user performance angle. So just curious, to improve user experience, with your tool or in general, are there certain areas people should be looking at to help with user experience?

Sofía [00:25:44] Oh yeah, there are different areas. When we think about testing for user experience, mobile performance is, I would say, one of the main factors that affect the user experience, so we want to identify when something is not working properly. Let's say we're using our app and it freezes at some point, or, as I mentioned before, it might be very slow, and there are different areas we can look at. One of the things we started measuring, besides resource usage in general, which I mentioned, is drawing times and rendering times. This is very important for games: any game app you have installed needs to be rendering levels to provide that experience to the user. Other apps are also turning to measuring how much time it takes to render screens and the frames per second of their app, and they're setting targets as performance KPIs. I think this is an area that not everyone is always aware of, and it's something you can incorporate, because these are things that really affect the user experience. High rendering times might disrupt the animations of your app, so things like glitches or screen freezes, as I mentioned, can be a consequence of this. So in a way, we are also going bottom-up: we show you the raw data underneath, and then usually you can correlate that to "this is where my screen froze," and then I can go and find the root cause, or at least know that there was probably a rendering issue and take that root cause to the developer. Tracking this information can help you directly identify issues that a user experiences. Of course, anything related to crashes and exceptions that happen in the app, whether they're caught or not, known by the team or not; everything that happens on the front end should also be something we're aware of. And last but not least, an area that we're really interested in, and where Apptim is going further, exploring and researching ways we can help teams capture this data, is response time from the user experience perspective, meaning I tap log in in an app I'm using, and then I want to measure exactly how much time it took for the next screen to be available and how long it took for the screen to be active for the user to interact with. This is not as easy as on the web, because on the web we have the DOM and there are ways to know when the page finished loading and when the content appeared to the user. I'm pretty sure you're familiar with a lot of tools that show you specifically when the content is available and things that you can optimize. For native mobile, it's trickier, and of course it's completely different for iOS and Android, because they are two different platforms and even the way the apps are built differs. Android has activities and you can measure activity times, but it's really hard sometimes if you want to measure something like a login time from a user's perspective in an automated fashion. You either have to try to do it with a script or, as we are researching more now, find ways to let our testers and users measure the specific transactions they want and then compare them. So let's say I don't want my login time to be higher than three seconds, which is typically about the most users will tolerate.
You want to be measuring that in every new iteration and whenever you release your app, and you don't want to do it manually. With the test automation that happens today, there's usually an overhead added by the tools running the automation, so it's not as true to the real user experience as it should be. So everything related to response times, measuring how much time it takes from point A to point B in your app, replicating that on different devices, and then seeing over time how that evolves, is something a lot of companies are looking for solutions to. And I think there might be a way to help there and provide something new that's not available today. The other specific metric for the user experience is how much time it takes for the app to start, which is the app startup time. That means when you tap on the app and open it, you can measure that first launch, and also when it goes to the background and stays open in the back stack in Android and then you bring it to the front again. There are specific technicalities and differences between those two, but you want to measure that as well. Opening the app is actually one of the first steps a user has to go through, so if it takes too much time to launch, they might just not use it at all. It's something that impacts the user directly, and you can start measuring it right away.
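For the app startup time mentioned here, one lightweight way to start measuring on Android is the sketch below, which uses adb's `am start -W` to report launch time in milliseconds. The package and activity names are hypothetical placeholders, and iOS would need its own approach (for example, via Instruments).

```python
# Minimal sketch: measure cold app startup time with "adb shell am start -W",
# which waits for the launch to finish and prints a TotalTime value in ms.
# Package/activity names are hypothetical; force-stop first so each run is a
# true cold start rather than a warm resume from the back stack.
import re
import subprocess

PACKAGE = "com.example.myapp"          # hypothetical package
ACTIVITY = f"{PACKAGE}/.MainActivity"  # hypothetical launcher activity

def cold_start_ms() -> int:
    subprocess.run(["adb", "shell", "am", "force-stop", PACKAGE], check=True)
    out = subprocess.run(
        ["adb", "shell", "am", "start", "-W", "-n", ACTIVITY],
        capture_output=True, text=True, check=True).stdout
    match = re.search(r"TotalTime:\s*(\d+)", out)
    if not match:
        raise RuntimeError(f"Could not parse launch output:\n{out}")
    return int(match.group(1))

if __name__ == "__main__":
    samples = [cold_start_ms() for _ in range(5)]
    print(f"Cold start samples (ms): {samples}; worst: {max(samples)}")
```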

Joe [00:30:55] So, Sofía, you speak with a lot of clients. Has anyone actually asked what the difference is, or whether they should even have a mobile app as opposed to a web app? Because a lot of times recently I've been hearing that mobile apps are dead, but I'm not sure that's the case. What are your thoughts?

Sofía [00:31:09] Yeah, a lot of users and people I talk to within the community usually ask me: should I have a mobile app or a web app, and what's better? I don't want to get into too much detail; I will say it's really different, and it really depends on your business and what the goal is. Usually native mobile apps provide a way to deliver content faster and a better experience to the user. And today there are also new technologies coming out that I think anyone can try when deciding what to develop. If what you need is a website that a huge number of people anywhere in the world can access, the web can deliver content fast and it has its benefits. But if you really want more personalization, a more intuitive interface, and the ability to work offline, as mobile apps can do, I think that's a better option. And I would think about performance in the early stages when deciding what to develop, knowing that for either option, even if it's a website, you need to start testing for mobile performance earlier. What I mean is: what is the performance of your website or app on the mobile device? There are tools for this, so it's really available today. And from our position, we're trying to help native app developers and testers get this information faster.

Joe [00:32:43] Okay Sofía, before we go, is there one piece of actionable advice you can give to someone to help them with their mobile performance testing efforts? And what's the best way to find or contact you, or learn more about Apptim?

Sofía [00:32:51] Oh, so the one piece of actionable advice is, I think I already said it, but start as soon as you have the first version of your app. You don't need to wait to go live and have users complaining or reporting bugs or issues. If you are developing for iOS, I think you have an easier life, because there are way fewer device models, and the version of the mobile operating system also changes a lot about how the app performs. So I would say start as soon as possible. There are the profiling tools I mentioned for developers, and Apptim is a new tool for testers, so you can get started for free. Just go to our website, download it, and familiarize yourself with everything we talked about today: the metrics and what things to look for. And in terms of how to contact me, you can go right to Twitter, where I'm sopalamarchuk, which is my surname, and also through email: it's sofia, my name, at apptim.com. And the easiest way, if you're already on our website, is actually to just go to the chat in the bottom right corner, and I'll actually be there as well.

 

Rate and Review TestGuild Performance Podcast

Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
