Unifying Manual and Automated Testing Efforts with Javier Re

By Test Guild

About This Episode:

Welcome to the TestGuild Automation Podcast! Today, we explore the synergy between manual and automated testing with expert Javier Re, CEO of Crowdar and founder of Lippia.io. We discuss understanding manual processes before automating them and ensuring both testing methods stay aligned for consistent functionality. Javier also introduces an open-source framework for writing Gherkin tests against APIs, emphasizing Gherkin's role in behavior-driven development (BDD).

We touch on the importance of keeping manual and automated changes in sync and dive into Lippia, a Gherkin-based test management platform. Javier shares the benefits of integrating manual and automated testing on one platform, using a unified Cucumber implementation.

Javier highlights the broader role of Gherkin, which involves not just testers but also product owners and developers. Tune in for insights on harmonizing your testing for better efficiency and quality. Listen up!

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 34,000 of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step toward transforming your future and our community's. Check out our done-for-you awareness and lead-generation packages, and let's explore the possibilities together.

About Javier Re

Javier Alejandro Re

CEO at Crowdar and Founder of Lippia.io.

He is dedicated to continuous growth in IT consulting, software quality, and test automation with high-quality standards.

He holds a Software Engineering degree, a PMP certification, and a Master's in Business Administration to enhance his business skills.

He has more than 15 years of experience in technology applied to business.

He co-created Crowdar in 2013 and, after a series of successful projects in Argentina, expanded its operations to Manchester, UK, developing the first integrated BDD automation test framework for the cloud.

Connect with Javier Re

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:04] Get ready to discover the most actionable end-to-end automation advice from some of the smartest testers on the planet. Hey, I'm Joe Colantonio, host of the Test Guild Automation Podcast, and my goal is to help you succeed with creating automation awesomeness.

[00:00:29] Joe Colantonio Today, we're going to explore the synergy between manual and automated testing with expert Javier, who is the CEO of Crowdar and founder of lippia.io. In this episode, we discuss understanding manual processes along with automation to ensure both testing methods stay aligned for consistent functionality. Javier also introduces an open-source framework using Gherkin to test APIs, emphasizing Gherkin's role in behavior-driven development. We also touch on the importance of syncing both manual and automated changes and dive into Lippia, which is a Gherkin-based test management platform. We also go over the benefits of integrating both manual and automated tests on one platform using a unified Cucumber implementation. And Javier highlights the broader role of Gherkin, involving not just testers but also product owners and developers. Tune in for insights on harmonizing your tests for better efficiency and quality. And a little more about Javier: he is dedicated to continuous growth in IT consulting, software quality, and test automation with high-quality standards, and has more than 15 years of experience in technology applied to business. He is also the co-creator of Crowdar, which he co-founded in 2013 after a series of successful projects in Argentina, and has expanded its operations to Manchester, UK, developing the first integrated BDD automation testing framework for the cloud. Really a lot of stuff in this episode. Jam-packed. You don't want to miss it. Check it out.

[00:01:59] Joe Colantonio Hey, Javier. Welcome to the Guild.

[00:02:04] Javier Alejandro Oh, hello, Joe. Nice to be here. Thank you for inviting me.

[00:02:08] Joe Colantonio Great to have you. Really excited to talk to you about this topic. Before we get into it, though, I'm always curious to know, when I talk to founders, why they got into test automation, or why they created a tool to help people with test automation or testing in general.

[00:02:20] Javier Alejandro Okay. Yeah. Well, briefly, I started in the QA world before, as a CTO at Thomson Reuters. I had a development team without any QAs, and I decided to bring in some QA people to make things work better in the team. That was a very good experience for me, and it made me think about the future of the product team, and over time we grew that team with more automation tools and different things. After my experience at Thomson Reuters, I decided to found Crowdar, and I chose QA because there was a need in the market for QA processes: many development teams, and even internal teams at our customers, need that process. We started working in that area, and it was funny, because at the beginning we didn't start as a normal QA services company; we started doing automation from the very beginning, which is kind of a strange thing, not the usual way a start-up begins. But that's, in general, how we entered the QA world.

[00:03:44] Joe Colantonio Nice. Like you mentioned, it's not normal to start with automation. So why did you start with automation, then? Was that one of the biggest markets you saw at the time?

[00:03:53] Javier Alejandro I think that's because when you just start, you pick the first project you can find, right? And that project was a massive one, so we needed to put in effort to complete the test suite. The development team had a commitment for a period of six months; they had reached an agreement to deliver a thousand automated test cases in six months. And the team they had was pretty small at that time. So we added effort to that team to hit the milestone they had committed to with the corporation.

[00:04:37] Joe Colantonio I would think if you start with a lot of automated tests, over time you'd find processes that couldn't be automated, and all of a sudden you have more manual processes alongside the automation. How do you bridge that gap? Did you find any gaps, like, okay, now we have to bring in information from our manual testing efforts as well?

[00:04:55] Javier Alejandro Yeah, I have two different topics on that. One is that when we started doing automation, we didn't know much about the manual process those customers follow. But then we started to do more and more automation projects, because customers asked us to just do the automation: they brought the manual process, and we executed the automation process. Sometimes we need to match the effort and correlate the activities to make sure that the same things the manual QA team executes in normal test cycles, we automate the same, or at least agree on which tests to automate, and make sure the functionality stays the same. One of the challenges we found is that, as we use open-source frameworks like Selenium, Cucumber, etc., we keep test cases in Gherkin format, and sometimes the customers use a regular tool with a step-by-step kind of format. The main challenge is to maintain the correlation between the scenarios kept in one format for the manual process and the automation process. That correlation was critical, because as soon as we change something, the manual test team has to do the same, and vice versa: as soon as the manual testers add a new scenario to their test cases, we need to change the automation. Obviously, with good communication that is possible, no problem. But we found it difficult to keep that correlation clean and smooth. So then we started to experiment with some things.

[00:06:53] Joe Colantonio Nice. Just in case, I know this is a really tester-focused show, but if anyone's listening who doesn't know what Gherkin even is: what is Gherkin?

[00:07:00] Javier Alejandro Well, Gherkin is a language that was invented to describe behaviors, basically. It's not for testing in particular; it supports a specific way of building software called BDD, behavior-driven development. The Gherkin format is basically a kind of natural language with specific keywords that you use to describe the behavior of your system. It has four to five keywords, like given, when, then, plus logical connectors like and, or, etc. But it's pretty straightforward: given a condition, when I do some interaction with the system, then I expect this behavior. With that format, you can describe the behavior of a system or a piece of software and then test it accordingly. For us, it's basically a good common language that the whole development team can use, not just the testers but also the product owners, the developers, the architects, etc. With that single description of the software, you can make everyone talk about the same thing. One of the issues I mentioned is that when you describe scenarios in a specific tool, you have to describe them in a certain way, or in whatever way the tester wants to describe them. That is a matter of interpretation. I think the Gherkin language reduces the level of interpretation; at least everyone writes things the same kind of way. Obviously, you can write Gherkin with more detail or less detail. It's not an exact science, it's not mathematical, but it's a good way to approach a common language for the whole team.
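As a concrete illustration of the format Javier describes, a login feature might look like this (a hypothetical example written for this page, not taken from Lippia):

```gherkin
Feature: User login

  Scenario: Successful login with valid credentials
    Given a registered user with username "maria"
    When she logs in with the correct password
    Then she should see her account dashboard

  Scenario: Failed login with wrong password
    Given a registered user with username "maria"
    When she logs in with an incorrect password
    Then she should see an "invalid credentials" error
```

The same file can guide a manual tester through the steps by hand or drive an automated run through a framework like Cucumber, which is the dual use discussed below.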

[00:08:56] Joe Colantonio Right. Is that what you mean when you unify or correlate activities? Everyone is using the same terminology, regardless of whether it's a manual test or an automated test, so you know what's going on with both efforts.

[00:09:12] Javier Alejandro Yeah, that's one thing. But also, using Gherkin, as I mentioned, includes the whole team: the developers, the product owners, etc. As soon as you start using Gherkin or BDD to capture your requirements as behavior, you can easily code that as a product, and then you can easily test it. And since we started the opposite way, as I mentioned, doing automation instead of manual testing, what we learned is that when our team needs to investigate a bug or a flaky test in an automation project with Gherkin scenarios, they use the same Gherkin scenario to test it manually. When they find a bug, for example, in a log-in scenario: okay, I found a bug here, let's replicate the behavior manually to understand where the bug is, and then report the bug or correct the flaky test in the code. When we realized that, we realized we could use the scenario descriptions we keep in Gherkin to do manual testing, which is not common practice in the industry. That's why we started to say, okay, let's write Gherkin from the very beginning of any testing process, no matter whether we will automate it later or not. With that mindset, we reduced a lot of the effort to maintain the synchronization I mentioned at the beginning. But we also had some barriers, right? Manual testers usually aren't accustomed to, or eager to, use a development environment or a code repository. That barrier made life not so easy for the manual testers who had to use Gherkin. That's why we started building a tool. Originally, the tool was for us, to do our delivery for customers, and we started with that approach: how can we make life easier for our testers, maintaining a single asset, the test scenarios written in Gherkin, used by the automation people or developers and by the manual testers as well?

[00:11:38] Joe Colantonio I know it's hard getting the testers to use the repository, but is it hard to get the developers and business analysts to actually contribute to the Gherkin effort? How does that work? Is it like you agree on the terms you're going to use, so if someone writes a manual test using those terms, it can also be used for automation? Or is it just a different, communication-focused way of using Gherkin?

[00:11:59] Javier Alejandro No, I think it's a good tool, and as soon as the development team and product owners start to use it, it makes life easier for them to specify requirements and to code against requirements, because you can use it as a test for your code very early in the process. Not just TDD at the class level: you can use BDD at the functional, journey level, so the developer can use it to test their development. In some cases, some teams start writing the tests in Gherkin format from the very beginning, before writing the code. It's a TDD approach, but based on BDD.

[00:12:44] Joe Colantonio Right. So you mentioned the repository, and you said you created a tool to help with that, so maybe you can expand on it. What does that mean? And for folks listening to this, definitely check out the show notes; we'll have the embedded video for you there as well.

[00:12:57] Javier Alejandro Yeah, basically, as soon as we started this process, what did we do? We started from the end of the story, as I mentioned. Instead of starting from a tool that lets you keep your test scenarios and then automate them in the tool you choose, we do the opposite. As soon as we have a code repository, we can start writing Gherkin from the very beginning, using the same repository the developers use. In this case, I'm showing a sample of Lippia, where I have different processes. The first thing I have to do to use Lippia Test Manager, which is the tool we're talking about, is connect a code repository, and that's very easy: I choose the management tool for my repository and put in my credentials. It's pretty straightforward, just choosing the settings for any repository. As soon as the credentials are in place, I can see different things on my screen. One is all the runs I can do with Gherkin files. I can build a new manual run and put some Gherkin files into it, which is the most difficult part. Usually, when you use a test management solution, you keep your test cases in the application, and you depend on that application to build a run, report your bugs, and report the progress of that run. In this case, what we do is connect to the code repository, collect all the feature files included in that repository, and with a single click, as I'm showing right now on my screen, I can see a Gherkin file as a test case. What we did is take the Gherkin files with their scenarios, use them as normal test cases, and use them as guidance for the testers to execute those actions against the application under test. Then you can simply report a pass or fail status for each test; you can set a different status for each one. And in the same way I did this, I can also include any automated test that I run from my automation code, using an API that is part of the solution. Any developer who does their tests using Gherkin can inject results into the same platform and combine, in a dashboard, in a very straightforward way, the results of automated runs and manual runs. I can combine both things in one solution, so both teams, whether the testing is divided between manual testers and automation testers, or I simply have, say, five testers who can code and automate, can see both results in a single platform using the same format. And the good thing is that it's not a lock-in approach for people who want to use Lippia, because they still keep their tests in their repositories. You can take any Gherkin file from the repositories without using the tool. But if you use the tool, you have the advantage of making life easier for the testers.
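The result-injection flow Javier describes could be sketched as follows. This is a hypothetical illustration in Python: the payload shape loosely mirrors the Cucumber JSON report that BDD frameworks emit, but the field names and the endpoint shown in the final comment are assumptions for illustration, not Lippia's actual API.

```python
import json

def build_run_payload(run_name, scenario_results):
    """Build a result payload for a test-management platform.

    scenario_results: list of (feature_file, scenario_name, status) tuples,
    where status is e.g. "passed" or "failed". The field names here are
    hypothetical; a real integration would follow the vendor's API docs.
    """
    return {
        "run": run_name,
        "results": [
            {"feature": feature, "scenario": scenario, "status": status}
            for (feature, scenario, status) in scenario_results
        ],
    }

# Collect results from an automated run (here, hard-coded for the example).
payload = build_run_payload(
    "nightly-regression",
    [
        ("login.feature", "Successful login with valid credentials", "passed"),
        ("login.feature", "Failed login with wrong password", "failed"),
    ],
)
print(json.dumps(payload, indent=2))

# Injecting it would then be a single HTTP call, e.g. (hypothetical endpoint):
# requests.post("https://tm.example.com/api/v1/runs", json=payload,
#               headers={"Authorization": "Bearer <token>"})
```

Because every framework adapter ultimately produces the same per-scenario pass/fail records, manual and automated runs can land in one dashboard, which is the unification discussed above.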

[00:16:40] Joe Colantonio Also, just to clarify: because it's Gherkin, is that why you can use it with any programming language? It's just a text file, so you don't have to worry about whether it's Python?

[00:16:50] Javier Alejandro Yes. You have many implementations, because we based it on Cucumber, which is the most widely used platform for Gherkin, and Cucumber has many implementations: Python, Java, JavaScript. Even very popular products like Cypress now support Gherkin files. Karate is another framework that uses BDD and Gherkin files. Most of them are standard; Karate adds some extra things to the Gherkin files for more specific features. But in general terms, any feature file, including Behat from PHP, which is wonderful for customers who use PHP extensively and have a lot of scenarios written in Behat, can be collected and edited directly in the tool.

[00:17:41] Joe Colantonio One other quick question I'm thinking of when I'm seeing this: I used to work for a huge corporation. We had divisions, and within a division, multiple teams working on different products, so you wouldn't necessarily know what Team B was doing if you were on Team A. So it sounds like, if one team is using Cypress, another team Java, another team JavaScript, as long as they each use a Cucumber implementation, a senior engineer or a senior manager could go in and see the effort across all those different programs, platforms, and tools within the business division, all in this one dashboard. Am I thinking about this correctly? Is that one of the benefits?

[00:18:23] Javier Alejandro Yeah, absolutely. You don't have to use the same automation architecture to use this platform. Any framework that supports BDD Gherkin can be connected to it. We are big fans of using Gherkin. Obviously it's a matter of taste, and some people don't like it much, but we think it's a good solution, and we built on that. And just to quickly recap: as soon as you connect a repository, you can work in the same way as a developer, with the difference that instead of touching real code, Java or Cypress or JavaScript, whatever, you just focus on the behavior, which is what we try to achieve with this solution.

[00:19:11] Joe Colantonio Because, once again, it's in a format everyone understands, developers and testers are able to communicate better as well. So if a developer creates a test, the tester knows what they did and what they tested, because it's readable.

[00:19:23] Javier Alejandro Yeah, yeah. In an ideal scenario, the developer can start writing the test, and then the tester can add different scenarios to the same basic feature. So it is a matter of collaboration. Writing and improving those scenarios is not just the testers' responsibility; it's something the whole team has to understand. We always say to our customers that quality is the whole team's responsibility, not just QA's. We try to encourage that.

[00:19:58] Joe Colantonio I think in the pre-show you mentioned that Lippia has been in development for two years. I'm curious, from when you started to where you are now, what are some insights into the journey of building a tool, lessons learned along the way?

[00:20:12] Javier Alejandro Yeah, well, an important thing is that, as a QA services company, we are not sitting all the time in the seats of the developers. Now we completely understand how developers feel, because we had to build the tool ourselves. At the very beginning, we had some of the same issues as the customers we help: we didn't have a good QA process at the start. We had to learn how to build software, because when you focus too much on one thing, you probably lose perspective on the others. We learned a lot from that. The other thing we learned, in the process of building and testing a tool that is for testers, is how important it is to improve the process of building the solution and to adopt modern practices. We are big fans of DevOps and agile practices, and we worked through the problems by implementing those practices. We learned how to improve our product development process, which is not what we normally did as a service provider. We had to learn how to build software products while being testers, basically.

[00:21:34] Joe Colantonio Absolutely. I assume one thing you probably learned is that it's tough when you start with a tool and you're vendor-locked into it, and it seems like that relates to a feature you mentioned briefly: if someone starts using the solution, they're not locked in. It's not vendor lock-in per se. If they created their tests in Gherkin, those Gherkin files can live on afterward if they have to move to another system. Did I understand that correctly?

[00:21:58] Javier Alejandro Yes, that's correct, because one of the things that happens when you're a small company is that sometimes, when you show customers the solution, they like it, but they're not so sure you can keep supporting the platform for many years. They see the risk of deciding to buy a solution like this and then depending on one company for a long time. So that design is on purpose; we did it on purpose. Obviously, some information remains in the solution, like the results of the runs, etc., but your main asset stays with you, because the runs are volatile: you make a run, a test cycle, and then you can repeat it as many times as you need. We say that the scenarios and the Gherkin files are assets, like code; they are part of your application. So you have to have ownership of them. We don't keep them in our solution.

[00:23:08] Joe Colantonio That's great. I know vendor lock-in is a stumbling block for a lot of people, so I really appreciate you building it that way; that's a great benefit for sure. It's been in development for years, but I know you also have customers. I'm curious, do you have any success stories or feedback from your customers on their experience?

[00:23:24] Javier Alejandro Well, yeah. We have a couple of customers using it right now, 5 to 6 customers who started using it early as beta testers. With that approach, we improved the features of the product, because they are real testers using the product. One of the main customers is a bank in Argentina that we have provided services to for about five years now. We started doing automation for them for Dynamics CRM from Microsoft, which is probably not the normal path. Usually you start by building a new application with microservices and then do the automation. In this case, they have a big installation with 3,000 users across the whole bank, and they want to automate because the process of setting up a path for the resolution of claims in the bank is very complex, and if they don't automate it, every time they make changes they break some rules. We started with that customer, and at the very beginning, when we didn't have the tool, we used to read the repository and do the process I described manually. As soon as we had the tool, we offered it to the bank, started using it, and showed the customer how it works; that process, and this part of the tool, is not as well known as more commercial products. They're very happy with it, and they plan to expand the usage of Lippia to the whole team. There are a hundred testers in the bank, not just the people who work with us: other vendors who provide people to the bank, and even internal employees of the bank who do testing. We're very happy with that, because it was a learning process. Obviously, they had the openness to try the solution before it was ready for the market, because they started using it about a year ago. And given the strict regulations that banks have to comply with, in that case we have an on-premises solution. The bank's installation is not a cloud installation, but it has the same architecture, based on Kubernetes and the same infrastructure, running internally in the bank's own cloud.

[00:25:48] Joe Colantonio Awesome. How would you explain what Lippia is right now? Like, is it a test management solution? Is it a test tool? Is it a collaboration platform? Like what is it?

[00:25:58] Javier Alejandro Well, Lippia, for me, or at least for us, is defined as a test management solution, but a different kind of test management solution: a test management solution based on Gherkin.

[00:26:10] Joe Colantonio Nice. I know it's somewhat new, but I know you're always working on new things. Are there any new features or enhancements you're currently working on that are going to make it even more powerful?

[00:26:21] Javier Alejandro Yeah, well, basically, we have a very big roadmap of improvements. Obviously, we want to make it better as time passes. One is usability. Usability is key, especially for a tool like this, where it has to be agile and easy to use. The concept will probably remain the same, but we're making it more accessible and easier to use for the testers. The other thing is that we also wrote an open-source framework that lets users write Gherkin tests for APIs without writing code. Basically, we add more terms, more keywords, so you can use Gherkin for API testing without writing Java and without writing the implementation of each step. With that tool, what we expect is to avoid much more code in the automation process and make life easier for the testers. That is another feature we're incorporating. And eventually, we're planning smooth integration between CI/CD tools and Lippia. Basically, we're building adapters for more frameworks to make it easy for developers to inject results. For now, we have Cucumber, we have Java, we have Karate, and we're adding Cypress adapters. The idea is that Lippia has an API, but to save people from dealing with the API directly, we're building adapters that make it easy for automation frameworks to ingest data.

[00:28:11] Joe Colantonio So it sounds like it's extensible. Can someone use that API themselves, or is it something where they say, hey, we need this integration, can you do it for us? How does that work?

[00:28:20] Javier Alejandro No, the API is open. The adapters are basically there to save people from dealing with the API directly.

[00:28:27] Joe Colantonio Exactly.

[00:28:28] Javier Alejandro We build the adapters, so you can download an adapter from GitHub and start using it. But obviously, new frameworks appear all the time. That's why the API is open: so you're not blocked waiting for an adapter.

[00:28:41] Joe Colantonio That's a good feature. It can grow as new tooling comes out; it's easy to-

[00:28:48] Javier Alejandro And also, we are open to having more people write adapters to inject results.

[00:28:55] Joe Colantonio Awesome. What was the name of the open-source tool? Is that available now, or is that something you're still working on for the API?

[00:29:02] Javier Alejandro Well, yes. As I mentioned, Lippia started as a framework, but in fact it's not a new framework; it's a combination of open-source frameworks. If you want to test web applications, it's a framework that includes Selenium and Cucumber, among other things. What we did with the API version of that framework, which is basically based on REST Assured, is overload the Gherkin file with some specific, standard steps that you can use to test APIs. It's available on GitHub under the Crowdar repository. So there are tools to do testing for web, API, and mobile, plus this low-code version for API testing.
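To give a feel for the low-code idea, predefined API-testing steps might read something like this. The step wording below is illustrative only, written for this page; the actual step catalog lives in Crowdar's GitHub repository.

```gherkin
Feature: Users API

  Scenario: Fetch an existing user
    Given the base url "https://api.example.com"
    When I send a GET request to "/users/42"
    Then the response status code should be 200
    And the response body should contain "username"
```

Because each of these steps already has an implementation shipped with the framework, a tester composes API checks entirely in Gherkin, with no Java step definitions to write.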

[00:29:53] Joe Colantonio If someone's listening to this and thinking, oh, this sounds great, how easy is it to get started? What do they need to do to check it out or give it a spin themselves?

[00:30:02] Javier Alejandro Basically, to use or try Lippia is very easy: just go to Lippia.io, and you can request a demo for a trial period. We can set up an instance very quickly.

[00:30:19] Joe Colantonio Awesome. Okay, Javier, before we go, is there one piece of actionable advice you can give someone to help them with their BDD automation testing efforts? And what's the best way to find you, contact you, or learn more about Lippia?

[00:30:30] Javier Alejandro Well, to learn about Lippia, there's the page I mentioned, Lippia.io, with two Ps: Lippia. My advice for BDD: I think sometimes it's not easy to start using, because it can seem like an overhead in terms of effort to adopt it as a platform, as a solution. But I think it's good to see the wider picture, not just as a solution for automation, but as a process for doing better development and describing behavior. With that in mind, it's a better approach to how your software will be documented, tested, and developed. So I think it's a good tool for the whole team, not just for the testers.

[00:31:20] Thanks again for your automation awesomeness. The links to everything we covered in this episode are at testguild.com/a471. And if the show has helped you in any way, shape, or form, why not rate and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:31:56] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at Testguild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider, and want to offer real-world value that can improve skills or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.

