
Testing LeanFT with HPE LeanFT Test Team [PODCAST]

By Test Guild

Welcome to Episode 97 of TestTalks. In this episode we'll discuss how HPE actually tests their own functional test tool, LeanFT. This is a rare, behind-the-scenes interview with the actual LeanFT QA test team.


HPE LeanFT Test Talks

Have you ever wondered how test tool vendors go about testing their own products? What tools and techniques they use? And do they actually use their own testing product(s) to test their software? Find out in this episode, where we Test Talk with the HPE QA test team that performed the testing on their new functional test tool, LeanFT.

Listen to the Audio

In this episode, you'll discover:

  • What is LeanFT?
  • What is HPE’s internal testing strategy?
  • An approach that allows you to test multiple technologies and SDKs with one test
  • Tips to improve your LeanFT testing efforts
  • How a large company implements Agile testing.
  • Much, much more!

Join the Conversation

My favorite part of doing these podcasts is participating in the conversations they provoke. Each week, I pull out one question that I like to get your thoughts on.

This week, it is this:

Question: What language do you prefer to use to create your test automation scripts? Share your answer in the comments below.

Want to Test Talk?

If you have a question, comment, thought or concern, you can do so by clicking here. I'd love to hear from you.

How to Get Promoted on the Show and Increase your Karma

Subscribe to the show in iTunes and give us a rating and review. Make sure you put your real name and website in the text of the review itself. We will definitely mention you on this show.

We are also on Stitcher.com so if you prefer Stitcher, please subscribe there.

Read the Full Transcript

Joe: Hey Guys. Welcome to Test Talks.

 

LeanFT Team: Hi Joe.

 

Joe: It's awesome to have you on the show. I'm really excited about this interview because it's the first of its kind on Test Talks: I actually get to interview the team that tested LeanFT. What I'd really like to get into today is how you actually test the LeanFT product in house at HP. Before we get into it, though, can you tell us a little bit about how you approached testing — the planning stages of testing LeanFT when it was a brand-new product?

 

LeanFT Team: Right, so basically when we started planning the testing of LeanFT, we decided that most, if not all, of the testing would be done using LeanFT itself. We used LeanFT to identify objects and then activate all the methods and actions on those objects. We used this methodology to test the functionality and build credibility for the tool, so those were the first steps. Obviously we started with a small number of actions and test objects that were available to identify, but little by little it grew.

 

Joe: You brought up an interesting point, and I guess this is a question a lot of people would be interested in. Usually, for people who used QTP or UFT in the past, one of the biggest issues was object recognition. A lot of times they would blame the vendor, so how does the actual vendor work around object recognition issues within their own applications? How did you work with development to resolve that?

 

Aviad: The way we learn or identify objects has improved slowly over the years. We developed identification for more types of objects and in more technologies, and as part of the development process we tried to learn and recommend properties and attributes that would make the test, or the identification, more robust, so we wouldn't be dependent on the location or the hierarchy of the object in the application.

 

We used a lot of AUTs that we got from, let's call it outside — from customers — to verify development. Later on, which we'll get to later in this discussion, we were able to replicate or understand what the customer needs based on the way their applications are developed or written, and we were able to create these applications in house.

 

Hagay: I want to add to Aviad's statement that LeanFT added another tool named Spy. It's a great tool that helps the user identify the objects and the properties the user can base the identification of the object on. It's very helpful.

 

Joe: I think that's a great point. A lot of people, when they use Selenium, use something like Firebug to do object recognition, but with LeanFT there is now a great object recognition tool where you can use more than one property to identify elements, which I think is really useful. Was that something that was developed early on in the process to help you start testing? Or was it developed later, after you recognized there was a need for a more all-encompassing tool to help recognize objects?

 

Aviad: This was developed in the first stage of developing LeanFT. We understood that users need something that will help them identify objects beyond the browser's ability to give you the attributes of an object. We improved it alongside the development of LeanFT to meet our requirements as a QA team, so it would also fit the QA teams or the testers — whatever we decide to call them — on the customers' side as well.
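
As a concrete illustration of the multi-property identification discussed here, below is a minimal sketch using the LeanFT Java web SDK. The application URL and the property values are invented for the example, and SDK initialization and cleanup are assumed to be handled by the LeanFT project template's generated base class (omitted for brevity); exact class names may vary slightly between LeanFT versions.

    import org.junit.Test;

    import com.hp.lft.sdk.web.Browser;
    import com.hp.lft.sdk.web.BrowserFactory;
    import com.hp.lft.sdk.web.BrowserType;
    import com.hp.lft.sdk.web.Button;
    import com.hp.lft.sdk.web.ButtonDescription;

    public class MultiPropertyIdentificationTest {

        @Test
        public void identifyLoginButtonByMultipleProperties() throws Exception {
            // Launch a browser and open a hypothetical application under test.
            Browser browser = BrowserFactory.launch(BrowserType.CHROME);
            browser.navigate("http://myapp.example.com/login");

            // Describe the object by several properties (values typically taken from
            // the LeanFT Object Identification Center / Spy) rather than relying on a
            // single attribute or on the object's position in the page hierarchy.
            Button loginButton = browser.describe(Button.class,
                    new ButtonDescription.Builder()
                            .buttonType("submit")
                            .name("Login")
                            .build());

            loginButton.click();
            browser.close();
        }
    }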

 

Joe: You mentioned earlier that when you were testing LeanFT, you had a lot of customers to work with to find out what their needs were. How do you prioritize what you test based on customers? Is it based on how many requests you get, or is it based on knowing that something is going to be a high priority even though you only have one or two customers asking for it?

 

Aviad: We tried to survey the technologies the customers are using and the objects they are using in their specific niches. We obviously aimed for web applications first. Later on, we integrated with HP Mobile Center, as mobile is a growing technology, to focus our testing and our development on mobile. And obviously, drawing on the variety of customers we have from UFT and QTP, we looked at the applications and technologies they are testing and tried to work from their AUT technology.

 

Malcolm: I think it's also fair to say that when we do go ahead with the implementation of a feature, we try to implement it in as generic a fashion as possible so that we're not tied down to one very specific way. There's often more than one way of identifying an object in an application. We try to make sure that we're not tying the user down to one of them.

 

Aviad: I must say that we were lucky enough to have a big customer base — the UFT customers — that we were able to learn from about what they need, and then aim the product at their needs, or at least most of them.

 

Joe: Excellent. Now you also mentioned HP Mobile Center. The reason why I ask that is I work for a large company and a lot of times I have an application I need to test that integrates with another application, but my group doesn't necessarily test or develop that application. We always have issues saying “well, this … There's a bug in the other application and we need to wait to get that resolved before we can move on with ours,” so did you encounter any of those issues as you were trying to integrate with other HP products and how did you work around it?

 

Aviad: Obviously, when we have two teams working on different products that integrate with each other, or where one depends on the other, there may be bugs or incomplete features or anything like that. But the communication between the two teams — and the fact that both teams are simply in the same building, one floor apart, and the communication between the people — that's one of the most important things. The team leaders are involved in everything, which created an environment that allowed us to handle issues as soon as they were found, get resolutions, and focus either product on what we needed in order to complete the integration. And since we encourage customers who want LeanFT to buy Mobile Center as well, HP Mobile Center wants to improve in this area and give us the best integration they can. Obviously, that helps us.

 

Ayelet: It happens a lot at HP that we have integrations between one HP product and another. I think, as Aviad said, personal communication between teams — the collaboration between them — is the one thing that makes it happen very successfully here at HPE. It's not only between LeanFT and Mobile Center; it's the same with a lot of HP's products. The communication between the teams is very good — between teams, between pillars, between groups, even when they're not in the same business unit.

 

Joe: Awesome. That's a great point about collaboration. One of the most difficult things I struggle with is collaborating with multiple groups, across multiple areas, across the globe. I've been at a few HP Discover events lately, and it's almost like there's been a transformation at HPE where you're embracing open source technologies now. At HP Discover I've been hearing more about Agile and DevOps, so I'm curious to know, at a high level, how has the testing effort changed at HP from maybe when you were testing QTP to LeanFT? Did you use more Agile and DevOps practices, or did you still use a more waterfall-type approach?

 

Hagay: Development at HP has changed a lot, as you said, from waterfall to Agile, and the quality teams align with the same process. I think that almost all teams, all products at HP, have moved to Agile, working with [inaudible]. Both Dev and QA teams are aligned to the Agile process, and we use continuous testing and continuous integration in our processes.

 

Aviad: If I may add to Hagay's answer, I will say that some, if not most, of HP's products at the moment are aimed at groups that are doing Agile and are meant to help improve the Agile process, and therefore we must use them ourselves. We use our tools in house. We must, and we want, to run our R&D and development process as Agile as possible, to experience what our customers will experience using our tools. Eating our own dog food is really important for us.

 

Joe: Just curious to know: because you're developing LeanFT — you're the testers of LeanFT, the automators — and you have all that knowledge, how does another team leverage it? Say you have ALM: does the ALM team actually use QTP or LeanFT to test ALM, and if so, do you have a separate automation team that does that, or are testers integrated into each of your sprint teams?

 

Hagay: I can answer that. As a center of excellence team, we have visibility into all the groups, and of course we work on LeanFT with the QA team, but we also work with other teams, with other pillars related to LeanFT. We consume LeanFT as a product — we are really using our own product in order to test our own products. The latest example is TruClient. We are automating TruClient using LeanFT, and because of the complexity of TruClient, it uses a couple of technologies — web, WinForms and WPF — and it has a lot of running modes. LeanFT really helped us overcome those challenges. So to your question: yes, we are consuming and using our own products in different pillars, not only in the pillar where LeanFT is developed.

 

Malcolm: It is continuous feedback as well, because as we use LeanFT across the other parts of the organization, we discover areas within LeanFT that need to be improved. Bugs might even surface in LeanFT. That feedback goes back to the LeanFT team, which can go ahead and fix it for us.

 

Hagay: This is a good point, Malcolm. It's a win-win situation for both TruClient, for example, and LeanFT. We found bugs in TruClient using the automation, but we found bugs in LeanFT as well. It really helped improve stability. In addition, we're consuming the nightly builds of LeanFT. It's as agile as it can get.

 

Ayelet: I can add that in our Automation Center of Excellence we have two roles. One role is to test LeanFT as a product, to find bugs, to make it better; the other role is to take the LeanFT product and use it with other products. The feedback we get from these two roles is different, because when we test LeanFT using LeanFT, we find some bugs that are not really the bugs a real user will usually find. But when we act as an in-house customer of LeanFT, we find the bugs that our end users would also find. So the product becomes better, higher quality, and eventually, when it's released, it's released without the bugs that users would otherwise find.

 

Joe: I bet that could be really difficult — keeping track of your own product and how well it integrates with these other products. I can see how testing a functional test tool is probably a lot different from what most people are familiar with, so kudos to you. I don't know how you keep track of it all. Do you use ALM to keep track of all your defects and everything?

 

LeanFT Team: We use our tools.

 

LeanFT Team: Not ALM.

 

Aviad: Not specifically ALM. LeanFT uses a couple of other tools for managing our development process. One of them is AGM — Agile Manager, by HP — which I'm sure you're familiar with. We also have a couple more tools that are not yet released to customers that we test. Just as other groups use LeanFT to test their tools, we use other HP tools to manage our release. They're not yet released to customers, so I'm not sure I can say their names.

 

Ayelet: It's a new generation of ALM, basically.

 

Malcolm: We're always using the latest in-development versions of our products over here.

 

Joe: You also mentioned earlier that you use a continuous integration approach, so do you have a suite of LeanFT tests that you run at every code check-in, or is it once a night? What's your approach to continuous integration?

 

Ayelet: All right, we have several suites, one for each pipeline. For continuous integration, we run very short tests — end-to-end tests on one technology from the set of technologies that LeanFT supports. We have additional pipelines: the regression pipeline and the nightly pipeline. Within the nightly we make the suite a bit lighter than the full regression, obviously, but we run the end-to-end tests across all the technologies that LeanFT supports.

 

For regression testing, which we run on a weekly basis, we test not only end to end but full regression, meaning we take — actually, we already do — each one of the controls that is supported in LeanFT and test all the properties and actions that are exposed in the LeanFT SDK. This covers all of the functionality that LeanFT offers to customers. We have several pipelines in order to be more agile and give quick feedback to the developers and the QA team.
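
The transcript doesn't show how these check-in, nightly, and weekly tiers are wired up, but one common way to slice a single Java test code base into tiers like these is with JUnit 4 categories. The marker interfaces and pipeline names below are illustrative, not HPE's actual setup:

    import org.junit.Test;
    import org.junit.experimental.categories.Category;

    public class CheckoutFlowTest {

        // Marker interfaces used purely as JUnit category tags.
        public interface Sanity {}
        public interface Nightly {}
        public interface WeeklyRegression {}

        @Test
        @Category(Sanity.class)
        public void shortSmokeOnOneTechnology() {
            // Fast end-to-end check run on every check-in (CI pipeline).
        }

        @Test
        @Category(Nightly.class)
        public void endToEndAcrossAllTechnologies() {
            // Longer end-to-end scenario run once a night across every supported technology.
        }

        @Test
        @Category(WeeklyRegression.class)
        public void allPropertiesAndActionsOfEachControl() {
            // Exhaustive pass over every control, property and action exposed by the SDK,
            // run on the weekly regression pipeline.
        }
    }

The CI server can then select a category per pipeline, for example via Maven Surefire's groups parameter.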

 

Aviad: On top of that, I can add that it's not only the objects supported by LeanFT that are tested, either nightly or in the regression suite — we also test all the tools that LeanFT supports. Sorry, not supports — exposes. Some of those tools are applications in their own right, like the Spy, and obviously there's the LeanFT report that is generated at the end of every run. All of these tools are also tested as part of the regression cycle, and some of them are tested in the nightly process as well.

 

Joe: This might be a controversial question, I'm not sure, but I could never get a developer to use QTP or UFT as part of their development effort. I'm just curious to know — maybe it's different at HPE — but now that LeanFT is out there, have you seen any developers at HPE using LeanFT with C# or Java to create any sort of tests themselves?

 

Hagay: I can answer that. Basically it's really new — it's hot, from just the last few weeks. We are in contact with the LoadRunner development team; they are looking for a solution to automate LoadRunner. One of the people leading that feature team did a POC with the LeanFT team, and he was so excited that he just wanted to start the project without any further notice — to start automating LoadRunner using LeanFT — because it was very easy for him to understand, to install, and to start with. The same goes for mobile, and mobile is a complex product, as some of the listeners know — not an easy one to automate — but that mobile team has really enjoyed using LeanFT.

 

Ayelet: I think the good feedback and good experience we had automating TruClient made the decision very easy for the LoadRunner team to choose LeanFT as well. It's also a complex product here at HP, and I think the quick impact we had with TruClient convinced them to just choose it too.

 

Aviad: Joe, if I may say a bit more: those are the LoadRunner and TruClient examples, but the whole concept of LeanFT was to make the tests closer, more available, and easier for the developer. Until now, the tester who used HP tools — which meant UFT — had its own IDE and wrote the tests in VBScript. The whole concept now is to get into the ecosystem of the developer, into the IDEs developers use — Visual Studio, Eclipse, and IntelliJ — and have the tests written in C#, Java, and JavaScript, so the developers can actually debug their tests on their own machine. That made it much easier for developers to accept and use. I think LoadRunner, and obviously TruClient, and some other groups that are using LeanFT within HP are great examples of how this affected developers and made it easier for them to adopt LeanFT.

 

Joe: I'm an automation geek, and I just heard you talk about automating LoadRunner with LeanFT — I think that would be so cool to do, so that's awesome. I guess my other question is, once again I'm not sure how much to get into it, but I get a lot of questions from people about LeanFT asking, "Does it work with Selenium?" When you're testing LeanFT, did you ever test it using Selenium, or did you always just use the LeanFT web SDK piece of it?

 

Aviad: At the moment we use the LeanFT SDK in order to test LeanFT, but obviously we know that Selenium is a huge tool in the market, and we try not only to have people use just LeanFT but to let people bring LeanFT into their existing suites. We have a suite of tests that has both LeanFT and Selenium together. Basically they can live in the same environment, which means we can run one test that has both Selenium code and LeanFT code, and it works. We test and verify that as well.
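
Here is a rough sketch of what "Selenium code and LeanFT code in the same test" might look like in Java. The page, element IDs, and browser title are hypothetical, a local ChromeDriver setup and the LeanFT browser agent are assumed, and the exact attach/description API may differ by LeanFT version:

    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    import com.hp.lft.sdk.web.Browser;
    import com.hp.lft.sdk.web.BrowserDescription;
    import com.hp.lft.sdk.web.BrowserFactory;
    import com.hp.lft.sdk.web.Button;
    import com.hp.lft.sdk.web.ButtonDescription;

    public class MixedSeleniumLeanFtTest {

        @Test
        public void seleniumAndLeanFtInOneTest() throws Exception {
            // Drive the first part of the scenario with plain Selenium WebDriver.
            WebDriver driver = new ChromeDriver();
            driver.get("http://myapp.example.com/login");
            driver.findElement(By.id("username")).sendKeys("demo");
            driver.findElement(By.id("password")).sendKeys("secret");

            // Attach LeanFT to the browser Selenium just opened and continue with the LeanFT SDK.
            Browser browser = BrowserFactory.attach(
                    new BrowserDescription.Builder()
                            .title("My App - Login")
                            .build());

            Button login = browser.describe(Button.class,
                    new ButtonDescription.Builder().name("Login").build());
            login.click();

            driver.quit();
        }
    }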

 

Joe: I also noticed you mentioned JavaScript as one of the languages. I guess that's a newer service pack that came out that allows you to use LeanFT with JavaScript. Have you been seeing a bigger demand for that to be one of the supported languages since you released it?

 

Aviad: Definitely, and this is why we aimed for it — we see the market moving toward more script-based languages. JavaScript was obviously the first one we decided to support, which is why we added LeanFT support for that language. We plan, on our roadmap, to add more scripting languages that LeanFT will support, and we already have requests from customers wanting LeanFT to support other languages.

 

Joe: Could you just tell us about some of the testing challenges you may have testing LeanFT?

 

Hagay: LeanFT supports eight technologies and three SDKs. There is some reuse between technologies, but at the end of the day we need to test it all. We need to test all technologies, all SDKs, all controls, and all the properties and methods of each control in order to catch even a very small regression in LeanFT's functionality. Just to give you an idea of how many tests would have to be written to test one control in LeanFT: LeanFT supports eight technologies, through, say, three SDKs, and each control has around 60 properties and methods — that's on the order of 900 tests for each control, okay? And each technology has around 20 controls on average. That's a huge amount of code just to test LeanFT, just to verify there is no regression in any control on any platform — a huge challenge to automate.

 

Joe: Are there any tips you can give on how you overcame those challenges? I don't even know how you keep track of all the SDKs and all the methods. How do you know you have coverage for everything — a test for every piece of functionality that LeanFT covers?

 

Hagay: Okay, in order to tackle all of those challenges, we first have to build AUTs for each technology. There is no shortcut in this area. LeanFT works on an AUT — an application under test — in some technology, so we need to create AUTs that fulfill all of LeanFT's testing requirements; there's no shortcut. We created eight AUTs that do exactly the same thing. Depending on the context they may not look exactly the same — it can be a web AUT or a mobile AUT — but the functionality of the AUTs is the same.

 

Ayelet: It's the same story.

 

Hagay: It's the same story for all the AUTs. Once we had that, we wanted to reduce the amount of code. We wanted to remove one of the multipliers from the equation, so we wanted to write only one script to test all technologies. That way we reduce the technology multiplier and shrink the code base significantly. In addition, a lot of tests relate to common controls; we write those only once, in a base class, so we moved all those tests into base tests as well to reduce the amount of code further.

 

Ayelet: I will just add that in order to reduce the eight-technology multiplier, we use an abstraction layer to simulate one language for all technologies. We used a lot of reflection and dynamic proxies in order to investigate the SDK dynamically and have the methods and properties tested on the fly, rather than hard-coded for a specific technology. This abstraction layer reduced our code base significantly, and inheritance in the tests also makes it smaller, because we don't repeat ourselves and we get a lot of code reuse in our testing framework.
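
The reflection idea Ayelet describes can be sketched in plain Java: walk a test object's getters dynamically and compare them to expected values supplied per technology, instead of hand-writing a test per property. This is a simplified illustration, not the team's actual framework; the class and method names are hypothetical:

    import java.lang.reflect.Method;
    import java.util.Map;

    public class PropertyReflectionChecker {

        // Invokes every no-argument getter on an SDK test object (e.g. a LeanFT Button
        // or EditField) and compares the result to an expected value, if one was
        // supplied for that property. New properties added to the SDK are picked up
        // automatically instead of each requiring a hand-written test.
        public static void checkProperties(Object sdkObject, Map<String, Object> expected)
                throws Exception {
            for (Method method : sdkObject.getClass().getMethods()) {
                boolean isGetter = method.getName().startsWith("get")
                        && method.getParameterCount() == 0;
                if (!isGetter) {
                    continue;
                }
                String property = method.getName().substring(3);
                if (!expected.containsKey(property)) {
                    continue; // No expectation defined for this property in this AUT.
                }
                Object actual = method.invoke(sdkObject);
                if (!expected.get(property).equals(actual)) {
                    throw new AssertionError(property + ": expected "
                            + expected.get(property) + " but was " + actual);
                }
            }
        }
    }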

 

Hagay: In the end we reached the point where the same test works on a web AUT, a mobile AUT, a WPF AUT, whatever. Whatever the technology, it's the same code base, the same test — one script for all the technologies.

 

Aviad: One script to rule them all.

 

Ayelet: Obviously our customers won't have the same application on every technology, but if we take this concept to the real world, we already see in the market that many companies have the same application on web and mobile. We want them to adopt this concept of writing one script for both mobile and web. Ultimately the functionality is the same — clicking a button, whether it's in a mobile app or in a web application. The test can be written the same way if you just implement this abstraction layer that acts as a common language for both technologies.
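
One way to picture the abstraction layer the team describes is an adapter interface per action, with one implementation per technology, so the script itself never mentions web or mobile. Everything below — the interface, the login flow, the names — is an illustrative sketch rather than the LeanFT team's code:

    // Technology-agnostic view of the application under test.
    public interface AppAdapter {
        void open();
        void setField(String fieldName, String value);
        void clickButton(String buttonName);
        String readMessage();
        void close();
    }

    // The single "script": it has no idea whether it is driving web, mobile or WPF.
    public class LoginScenario {

        public void run(AppAdapter app) {
            app.open();
            app.setField("username", "demo");
            app.setField("password", "secret");
            app.clickButton("Login");
            if (!"Welcome, demo".equals(app.readMessage())) {
                throw new AssertionError("Login message mismatch");
            }
            app.close();
        }
    }

Each supported technology then contributes its own adapter — for example a web adapter built on the LeanFT web SDK and a mobile adapter built on the mobile SDK — and the same scenario runs against every adapter in the suite.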

 

Joe: Awesome. I think that's a great piece of advice — build layers when you're creating an automation framework, so if you want to change something that affects a bunch of tests behind the scenes, you can do it in just one location. It sounds like that's exactly what you're doing, so that's awesome. I had one other question as you were talking: I know LeanFT supports Behavior Driven Development using Cucumber. Have you used any Behavior Driven Development — have you used Cucumber — as part of your efforts to abstract things out, writing features like "as a customer I want…" and then creating layers that way?

 

Aviad: Yep, we do. We have projects and tests written in SpecFlow with C#, and in Cucumber and JBehave with Java; Jasmine is already a BDD testing framework. We use all of these as part of our regression suites, and it works quite smoothly. Once you've implemented Cucumber or another BDD framework, there's not much needed in order to make it work with LeanFT. It's quite easy. Simple.
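
As a rough illustration of how little glue is needed, a Cucumber-JVM step definition can simply call the LeanFT SDK directly. The feature wording, the URL, and the object descriptions here are invented for the example, and the annotation package is the current Cucumber-JVM one rather than whatever the team used at the time:

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.When;
    import io.cucumber.java.en.Then;

    import com.hp.lft.sdk.web.Browser;
    import com.hp.lft.sdk.web.BrowserFactory;
    import com.hp.lft.sdk.web.BrowserType;
    import com.hp.lft.sdk.web.Button;
    import com.hp.lft.sdk.web.ButtonDescription;

    public class LoginSteps {

        private Browser browser;

        @Given("the login page is open")
        public void theLoginPageIsOpen() throws Exception {
            browser = BrowserFactory.launch(BrowserType.CHROME);
            browser.navigate("http://myapp.example.com/login");
        }

        @When("the user submits the login form")
        public void theUserSubmitsTheLoginForm() throws Exception {
            Button login = browser.describe(Button.class,
                    new ButtonDescription.Builder().name("Login").build());
            login.click();
        }

        @Then("the welcome page is shown")
        public void theWelcomePageIsShown() throws Exception {
            // Plain assertion to keep the sketch framework-neutral.
            if (!browser.getTitle().contains("Welcome")) {
                throw new AssertionError("Expected the welcome page");
            }
            browser.close();
        }
    }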

 

Joe: Awesome. I think this goes back to the theme I've been seeing at HPE Discover: you really are embracing open-source technology, and Behavior Driven Development with Cucumber and SpecFlow is part of that. It's awesome that it really does fit into developers' ecosystems. It should be seamless if someone starts with LeanFT — the developer should be able to get involved fairly easily.

 

Before we go, is there one piece of actionable advice you can give someone to help them with their LeanFT testing efforts? Since you've been using LeanFT to test LeanFT, is there anything you've learned — one piece of actionable advice — you can share? And let us know the best way to find out more about LeanFT.

 

Ayelet: First of all, I can suggest that LeanFT users take a look at the LeanFT help center. It's online, a web application — you can Google "LeanFT help center" and find it easily. There is very detailed documentation there. In addition, we publish some posts on HPE's "All About the Apps" blog, where some of the spicier LeanFT functionality is described, such as object recognition and other goodies that LeanFT offers its customers.

 

We are constantly adding more posts to the "All About the Apps" blog and to the help center. The help center is the documentation for LeanFT, but it's dynamic. There are example code snippets there that you can actually copy and use against your own tools, your own products, in order to test them. It's very easy to navigate the help center, to use it, and to understand it. Even if you're not yet a LeanFT customer, you can find it very easily in Google and understand how LeanFT operates and what the benefits and advantages are, so it will be easier for you to use LeanFT even before you consume the product. Anything else, guys?

 

Aviad: On top of all the examples and information you can find in what HP delivers, there are also a lot of blogs and forums talking about LeanFT outside of HP. A lot of users share their experience, knowledge, and information on LeanFT, each one taking it in a different direction. Obviously these places on the internet, wherever they are, can help users improve their tests and their use of LeanFT. There's also the free trial of LeanFT, which means you don't have to start by buying the product. You can use it for the trial period — sometimes even longer, if you require an extension — and once you've decided it's good enough for your needs, you can decide if and how many licenses you need to buy.

 

On top of everything, I think one of the things LeanFT pushes for is simply to make it simple: basically, give the user all the tools required to automate their tests. That means we start the LeanFT project with all the references and the using or import statements the user needs. We give all the tools — the report that is generated, and the LeanFT Spy with its recommendations. Everything the user needs, they already have in their hands. All you need to do is just start coding.

 

Malcolm: In effect, just download the free trial version and start working with it.

 

Aviad: Exactly.

 

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}

Symbolic AI vs. Gen AI: The Dynamic Duo in Test Automation

Posted on 09/23/2024

You've probably been having conversations lately about whether to use AI for testing. ...

8 Special Ops Principles for Automation Testing

Posted on 08/01/2024

I recently had a conversation, with Alex “ZAP” Chernyak about his journey to ...

Top 8 Open Source DevOps Tools for Quality 2024

Posted on 07/30/2024

Having a robust Continuous Integration and Continuous Deployment (CI/CD) pipeline is crucial. Open ...

Sponsor The Industry-Standard E2E Automation Testing Annual Online Event (Limited Spots Left) - Reach Out Now >>