From the Trenches to the Boardroom: Making QE Matter to the C-Suite with Parasar Saha

By Test Guild

About This Episode:

In this episode, Parasar Saha, founder and CEO of Digy4, shares insights from his journeys across multiple countries, meeting with global enterprises to uncover the real-world challenges teams face in achieving quality at scale.

The discussion dives into the importance of data visibility, the pitfalls of focusing on the sheer quantity of tests over their real value, and how testers can transform technical data into business outcomes that resonate with the C-suite.

Parasar reveals why most organizations struggle with fragmented tools, overwhelming data, and telling a compelling story about quality's business impact—even in the age of AI.

Whether you're looking to elevate the role of QE in your organization, make your testing results more meaningful, or simply want to know how to get a seat at the executive table, this conversation is packed with practical advice and thought-provoking strategies.

Tune in and discover how measuring the right things, and telling the right story, can take your automation efforts (and your career) to the next level!

Exclusive Sponsor

Discover TestGuild – a vibrant community of over 40k of the world's most innovative and dedicated Automation testers. This dynamic collective is at the forefront of the industry, curating and sharing the most effective tools, cutting-edge software, profound knowledge, and unparalleled services specifically for test automation.

We believe in collaboration and value the power of collective knowledge. If you're as passionate about automation testing as we are and have a solution, tool, or service that can enhance the skills of our members or address a critical problem, we want to hear from you.

Take the first step toward transforming your future and our community's. Check out our done-for-you awareness and lead-generation packages, and let's explore the possibilities together now: https://testguild.com/mediakit

About Parasar Saha


Parasar Saha is the Founder & CEO of Digy4, Canada’s leading AI-powered enterprise platform for quality engineering and real-time reporting. With 23+ years in QE leadership across airlines, finance, and telecom, he previously led major IT transformation initiatives in the Canadian airline industry.
Parasar is also a founding member of the Canadian Quality and Testing Association (CQTA).
He is passionate about making engineering data actionable and visible to business leaders.

About Conroy Allen

Conroy Allen is an accomplished technology and digital transformation leader with deep expertise in enterprise software development, quality engineering, and strategic delivery.
He formerly led digital applications and transformation initiatives at Air Canada, where he helped modernize customer experiences across web, mobile, and loyalty channels. Currently, he leads Customer Success at Digy4, partnering with Fortune 500 enterprises to align AI-powered quality engineering solutions with real-world delivery challenges. Conroy is passionate about driving measurable impact through governance, visibility, and continuous improvement across the software lifecycle.

Connect with Parasar Saha

Rate and Review TestGuild

Thanks again for listening to the show. If it has helped you in any way, shape, or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.

[00:00:35] Today, we'll be talking with Parasar all about From the Trenches to the Boardroom: Making QE Matter to the C-Suite. Parasar is the founder and CEO of Digy4, Canada's leading AI-powered enterprise platform for quality engineering and real-time reporting. He has over 23 years of experience in QE leadership across airlines, finance, and telecom, and he previously led major IT transformation initiatives in the Canadian airline industry. He really knows his stuff. It's his third time on the show, so you don't want to miss it. Check it out.

[00:01:05] Joe Colantonio Hey Parasar, welcome back to The Guild.

[00:01:09] Parasar Saha Thank you, Joe, for having me.

[00:01:11] Joe Colantonio Great to have you. It's been a while. What have you been up to since the last time we talked? I know you've launched a company and you're doing really well, and I don't know if you've expanded into anything else. What have you all been up to?

[00:01:21] Parasar Saha Great question. First of all, this is my third time on your podcast, and I really enjoy it. The way I see it, I started in 2020 talking about Universal Cloud. Then I talked about how I was starting my journey in the industry as an entrepreneur, and now I'm here. As an entrepreneur, we've recently scaled the company up, and we're working with several global enterprises. One of the things I do, coming from the industry, is understand what the problem is. What is the challenge that the leaders, the practitioners, the mid-management are facing? Then I take that problem statement back to my team and see how we can help. I have some numbers that really surprise people: in the last 30 days, I've been in six countries, 11 cities, and on 21 flights. The reason I did it is to meet people in person, understand their real challenges in quality across different segments and domains, and then take that back to my team and move the conversation forward toward a solution.

[00:02:39] Joe Colantonio Love it. All right. So what's the main theme you've been hearing? Does it depend on the vertical? Is it all the same?

[00:02:45] Parasar Saha We do find a pattern, and we are very focused in what we do. We are focused on data visibility and AI, and our AI is a very specific area of AI. AI covers a lot of things, and we can deep dive into that; we call ours Insight AI. We are really specific in what we do, and the conversations I had around the globe, across different domains, were about understanding how the area we work in, reporting and analytics, impacts quality, why it is important, and how it can elevate QE to be not just a group in the organization, but a presence in the boardroom, where people actually see and understand the value of quality and the work that we do. How do we elevate that conversation? There is a pattern: a lot of people are struggling in that same space, and that's where my focus is right now.

[00:03:49] Joe Colantonio I'm glad you're on the call. I just had a recent conversation. I don't know if this is normal, I think it is, where someone has a mandate that they need to start using and implementing AI. This came from the executive team down to their team. He was looking for ways to write tests quicker and things like that. We said, how about your requirements? He said, oh, we just write tests based on the code. And we said, you probably aren't going to get real value from it unless you have really good requirements. He said management isn't going to be into that, because they're going to have to be involved. Is that a common thing you see, where top management doesn't necessarily care beyond "just write more tests"? How do you build the knowledge up to let them know it's not just about building more tests, but building the right tests, and why they need to be involved from the beginning, and why there needs to be some startup investment?

[00:04:44] Parasar Saha What you said is something we see in the industry very often, because from a senior management perspective, if you look at the different personas, in most organizations the highest level in QE is director. VP is very uncommon; there are a few organizations with VPs, and SVPs are very uncommon in the quality world. So the people who are giving this direction are beyond QE, but QE rolls up to them. Their thought process is: if I get more tests, then it's better for us, I'm going to have more coverage. That's not the truth. With AI coming in, the common understanding among the people above QE is that volume will bring them quality. It's not that. AI can create volume, but what is more important is the impactful test. How do you figure out, when you're creating tests, which tests are more important? And when you are running those tests, which tests are more valuable for you, which are finding you more defects, or which are acting as guardrails? That's most important. Because with AI coming in, you're going to scale up from, let's say, a 500-test-suite organization to a 5,000-test-suite organization, since you can create any number of tests with AI. But if you keep running all of those, you're going to end up with at least a 10 to 20% failure rate, so now you have 500 failures to chase. What happens at the end of the day is you end up in a spiral of following up on things, and you're not actually making quality faster. The way you make quality faster is to find your impactful tests: which tests you should be focusing on, and which tests you'll run when a change happens in the ecosystem. That impact analysis is a very big part of it, and that information you can mostly gather from your data and reporting.
If you can measure, you can manage things well.
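As a rough illustration of the impact analysis described above, here is a minimal Python sketch that ranks tests by historical defect yield plus a bonus for guarding critical business flows. The fields, weights, and test names are all hypothetical assumptions for the sake of the example, not Digy4's actual method.

```python
# Hypothetical sketch: rank tests by how often they have caught real
# defects, then pick a small high-value subset to run on every change.
# The heuristic and its 0.5 critical-flow bonus are illustrative only.
from dataclasses import dataclass

@dataclass
class TestRecord:
    name: str
    runs: int                   # how many times the test has run
    defects_found: int          # failures traced to real product bugs
    covers_critical_flow: bool  # guards a key business transaction

def impact_score(t: TestRecord) -> float:
    """Defect yield plus a bonus for guarding business-critical flows."""
    yield_rate = t.defects_found / t.runs if t.runs else 0.0
    return yield_rate + (0.5 if t.covers_critical_flow else 0.0)

def select_impactful(tests: list, top_n: int) -> list:
    """Return the top_n highest-impact tests."""
    return sorted(tests, key=impact_score, reverse=True)[:top_n]

suite = [
    TestRecord("checkout_payment", 200, 18, True),
    TestRecord("footer_links", 200, 0, False),
    TestRecord("login", 150, 6, True),
    TestRecord("about_page_copy", 100, 1, False),
]

for t in select_impactful(suite, 2):
    print(t.name)  # the two highest-impact tests
```

In this toy suite, the two tests that guard business transactions and find real defects rank ahead of the high-volume but low-yield ones, which is the "value over volume" point being made.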

[00:07:05] Joe Colantonio How does someone in the C-suite receive that message, though? Especially when they hear that and say, I'm not in a regulated environment, I'm not in health care, I'm not an insurance company, big deal. What are the pain points? Is it that your brand could be impacted if this doesn't launch right, or that you can lose money? How do you approach the C-suite about why they need to be heavily invested in this?

[00:07:31] Parasar Saha Yeah. I'll talk about two things here. I've had several conversations with the people at the ground level doing the work, whether practitioners, mid-level senior managers, or directors, and they talk about operational data points: how many tests are automated, how many defects you have found. But when you are a C-level person, even me, when I am actually running a company myself, what do I care about? What keeps me awake at night? It's not the lower environments. What keeps me awake at night is my production. What is my customer seeing? How much disruption am I having in production? How does it impact my customers, customer retention, customer experience, sentiment? Those are the things I mostly care about at the C-suite level, because they're very much tied to my revenue and my customers. And industry is driven by customers, retention, and revenue, because investors are going to look at that. Now, as leaders of quality, yes, data like how much we have automated is good. But if you can funnel these KPIs that you are building within your organization and roll them up to OKRs, to what is helping the business and how you are helping the business, then your data becomes a gold mine for C-level people, because they can see: I invested here, and this helped me keep my production safe, this helped me build my brand value, my customers are not hitting failures on critical things, and I'm not losing a customer because one business transaction failed. If you can show that data and tell that story, that's a far more powerful story for C-level people than just how many automated tests you ran. You have to tie those things together.
And I can tell you, talking to the C-level, where they also get challenged sometimes is with the data they're getting: 56% of C-level executives feel that the data they receive is out of date and they can't do anything with it, because there is so much volume that by the time you actually present the data to a C-level person, that data has become stale. You have to give them real-time, meaningful data that can drive more revenue and more customer retention for them. You have to tell that story.

[00:10:27] Joe Colantonio How does a tester tell that story?

[00:10:29] Parasar Saha Yes.

[00:10:29] Joe Colantonio Well, again, say I just like writing tests. I have thousands of tests, and maybe they're not doing the right thing. Maybe they think they're doing the right thing, but, like you said, they're not testing the right things: thousands of tests, but not the right tests. How do they tie it together, knowing what KPIs matter to QE, making sure their tests are tied that way, and reporting it up in a timely manner?

[00:10:52] Parasar Saha It's a very complex thing, and I'll tell you why. 90% of the organizations in the world have gone through digital transformation. The shift digital transformation brought is that people moved from a single-digit number of software tools in their software engineering cycle to double digits. There are too many tools, and these tools don't talk to each other seamlessly. And do you know how much data we are generating as of 2025? It's about 180 zettabytes of data, where one zettabyte is a trillion gigabytes, and that's roughly a 50% increase from 2023 alone. Imagine the volume of data that you are seeing and the sources it is coming from. When you talk about reporting, it's very complex, because you're doing data extraction, which is complex in itself, then data modeling, then data visualization. You are sitting on a stack of very detailed, technical work, and a lot of times that's challenging to do. What I would suggest is: lean into your larger organization's reporting platform and see how you can use it. If you see an opportunity to go with a buy solution, do that, because it's a full-scale solution. Get to a solution that can give you real-time reporting, and have a conversation with people who are not only in QE. Your main customers are the people outside of QE: engineering, business. What do they want to see? If you can show them that, and how you're helping, you're moving the needle. Don't just show them how much automation you ran or how many tests you ran. That doesn't give them anything. What they want to see is how you are saving them from having to say sorry to a customer or an investor.

[00:13:09] Joe Colantonio That's a good point. I guess with AI, tests are becoming even easier to write. And if you don't show this other value you're talking about, you could be let go, not because you're not needed, but because you didn't tell the story or bubble it up to the more important things the C-suite really wants.

[00:13:30] Parasar Saha Yeah, exactly. I've had discussions with people about the tools they are buying. The tools are not cheap these days. When you go to these AI tools, you're investing millions of dollars, and the organization holds people accountable for how you are actually using the million dollars that was spent. Unless you can prove it, you can't go for the next million that you want. So they're facing that struggle of how to get the funding as well, and the data is the only way you can tell that story to the management who are writing the checks for you. If you can tell that story, it helps you get more funding and more visibility, and you can be sitting in the boardroom, actually giving quality, and your team, the visibility and the significance they deserve. As a leader, you have to make it visible.

[00:14:37] Joe Colantonio There's no shortage right now of AI tools, with modern pipelines and everything. So why are people still struggling with data? Isn't this what machine learning was created for, analyzing large volumes of data? Why wasn't this the first thing that was solved? Why is it still an issue?

[00:14:53] Parasar Saha Yeah, so machine learning is there. Machine learning can process a huge volume of data, but getting everything integrated is the problem. Think about a large organization, say a bank or a telco. They have more than 20 or 25 agile teams, and they use more than 10 or 15 tools in their ecosystem. They may even be using multiple tools for the same purpose. This is fragmentation, fragmentation in terms of tools, and your data is coming from multiple sources. Now, if you show, hey, my Cypress data is this, my Selenium data is this, my API data is this, that's not what people want to see. What they want to see is how much coverage is done, how much of the system is protected, and how many defects you are finding from your testing. That fragmentation you have to keep behind the scenes. That part has to be engineering, and that integration engineering is not done much in the industry. Most of the time, organizations fall back on their in-house capabilities to build it, which takes them time, isn't scalable, and doesn't work. That's one reason. And then, because you are living in an agile world, every team has its own way of doing things. Many organizations don't dictate terms; they just let the teams do it their way. What happens is that even for the same purpose, their data is in different formats. You need a system that can actually bring it together and feed it forward. Then you can use AI to do pattern matching, see what the problem is, and give it to your management.
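To make the fragmentation point concrete, here is a minimal, hypothetical Python sketch of the adapter idea: results from different tools arrive in different shapes and are normalized into one common record format before any analytics or AI pattern matching happens. The field names shown for the Cypress-style and JUnit-style payloads are illustrative assumptions, not the tools' exact schemas.

```python
# Hypothetical sketch: per-tool adapters normalize test results into
# one shared record format so reporting never has to care which tool
# produced the data. Payload field names are assumptions.
def from_cypress(raw: dict) -> dict:
    """Map a Cypress-style result into the common record."""
    return {"test": raw["title"], "status": raw["state"],
            "duration_ms": raw["duration"], "source": "cypress"}

def from_junit(raw: dict) -> dict:
    """Map a JUnit-style result into the common record."""
    status = "failed" if raw.get("failure") else "passed"
    return {"test": raw["name"], "status": status,
            "duration_ms": int(float(raw["time"]) * 1000),
            "source": "junit"}

ADAPTERS = {"cypress": from_cypress, "junit": from_junit}

def normalize(tool: str, raw: dict) -> dict:
    """Route a raw result through the right adapter."""
    return ADAPTERS[tool](raw)

records = [
    normalize("cypress", {"title": "checkout", "state": "failed",
                          "duration": 950}),
    normalize("junit", {"name": "LoginTest", "time": "1.2"}),
]
print(records)
```

Once every tool's output lands in the same shape, coverage, defect counts, and trends can be computed in one place, which is the "keep the fragmentation behind the scenes" idea.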

[00:16:52] Joe Colantonio All right, how do they do that? You mentioned Insight AI. Is this something that could help with this type of fragmentation? Is that where we're heading?

[00:17:01] Parasar Saha On the first part, how do I do it: over the years, over five years of talking to the industry, we built out one of our flagship products, Digy Dashboard. With Digy Dashboard, what we do is, if you remember the conversation, I always say look at the broader ecosystem, not only QE. In the broader ecosystem, there's a concept called an integration hub, like Salesforce has, where you integrate everything. We built that integration hub: no matter what the tool is, we can integrate it into one, then a universal data pipeline that can talk to different types of tools irrespective of what type they are. Then you consolidate the data and feed it to the AI piece. We built a whole platform that can help with that. And again, there are other ways to do it; you can build it yourself with something like Power BI, but it will take you a lot of time. Using this data to tell the story is the most important part. You have to know what to ask for, because we find this gap a lot of the time: even C-level people don't know what to ask for. Now, this is where the community factor is also going to be important. If you look at DevOps, they have something called DORA metrics. But in QE, we never set the path for what the proper metrics should be. Every organization is trying to build its own metrics. It's time we came together and talked about what measurements we can give from QE. That will set the golden standard, make a lot of conversations easier, and make expectations much clearer along the way.

[00:18:56] Joe Colantonio What are they? Is that something that's built into the tool? Does it give you a common set of key metrics you've seen be helpful over the five years or so you've worked with all these different companies?

[00:19:04] Parasar Saha Yes, we did that with the companies we worked with, and we see a pattern. We see that they are looking for escape metrics. They're looking for release trends. They look at rework metrics. There is a set of metrics they usually look for. We worked it out because we are heavily focused in this area; we try to figure out what the industry wants and what pattern we are seeing, and we are carrying that conversation forward. But this conversation has to grow as a community conversation as well. That's why I also have the community forum called the CQTA, the Canadian Quality and Testing Association. As a whole community, we have to agree on how you measure QE. Once you set that standard, it's much easier for leaders and practitioners, and everybody is driving toward one goal. At the end of the day, organizations' goals are pretty similar, and quality goals are similar. We have to align and figure out what the DORA metrics are for us, for QE, so we can champion that.

[00:20:16] Joe Colantonio Love it. But say someone has the key metrics they're in agreement on, and a hub in place to correlate all this data. You mentioned one of the issues is having the data fresh and on time. Are there any blind spots in creating a dashboard or alerts? How do you make sure management is getting alerted on the right data in a timely fashion and not getting false data? I don't know if that makes sense. A lot of times they might be over-inundated. How do you know what the right level of information to give them is, and what the most timely manner is?

[00:20:46] Parasar Saha Yeah, it's a very good question, because through our journey we have seen this: data can sometimes be overwhelming, because you're getting too much data and too many metrics to manage. The way to build it is persona-based metrics. Every persona has a specific need. If you show revenue metrics to an engineer, that may not be meaningful to them; they are looking for engineering metrics. For different people, it's different. The way you stack it up is: for engineers you have engineering metrics, for mid-management or the project management level you have one set of metrics, and then another set of metrics for the executives. Your dashboard should grow with you, and whichever level you are at, you should have a separate set of views that speaks to that level's needs.
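As a rough sketch of the persona-based metrics idea, the same underlying data can be filtered per audience. The persona names and metric keys below are hypothetical assumptions, chosen only to illustrate the pattern.

```python
# Hypothetical sketch: one pool of metrics, filtered per persona so
# each audience sees only what speaks to its needs. Personas and
# metric names are illustrative assumptions.
PERSONA_VIEWS = {
    "engineer":  {"flaky_test_rate", "pipeline_duration", "coverage"},
    "manager":   {"release_trend", "rework_rate", "coverage"},
    "executive": {"escaped_defects", "customer_impact", "release_trend"},
}

def dashboard_for(persona: str, all_metrics: dict) -> dict:
    """Return only the metrics relevant to the given persona."""
    wanted = PERSONA_VIEWS[persona]
    return {k: v for k, v in all_metrics.items() if k in wanted}

metrics = {
    "flaky_test_rate": 0.07,
    "pipeline_duration": 14.2,
    "coverage": 0.81,
    "release_trend": "+3 releases/quarter",
    "rework_rate": 0.12,
    "escaped_defects": 4,
    "customer_impact": "2 incidents",
}

print(dashboard_for("executive", metrics))
```

The executive view surfaces escaped defects and customer impact rather than pipeline internals, which mirrors the "dashboard grows with the audience" point above.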

[00:21:45] Joe Colantonio Absolutely. How do you get a seat at the table, though, with your CFOs and CEOs, as part of the process, so you're not just an afterthought where you're testing software and they don't care? How do you become part of the whole process, where they know about you and you're reporting to them quarterly or weekly or whatever? How do you see that working?

[00:22:05] Parasar Saha Before I transitioned into being an entrepreneur, I worked with a lot of salespeople. I tried to understand them because, when you're building a business, sales is a very important part of it. I always observed how they influence the key business people, the leaders of the organization, to achieve what they want to achieve. And that is with real-time data. They present very real-time data with Salesforce and similar tools. On one side it's pulling your sales data; on another, your revenue. It's a very connected ecosystem. Imagine you are sitting in front of your SVP alongside a salesperson, and he opens up a fancy Salesforce dashboard while you open up an Excel sheet. Where do you stand? You got your data probably three weeks back, while that guy got his data in real time, fed in from every system. That's where I think it's very important to get every system connected, and you have to make sure that data is coming from the ground. If your data is handcrafted, the trust in the data is low: 45% of C-level people don't trust data that has been manually curated, and it goes in a different direction. So it has to be ground data, it has to be real-time data, and it needs to be simplified metrics that speak to what the people at the table need. It has to talk to your revenue, it has to talk to your customers, and to how you are helping retain or gain revenue. If you cannot tell that story, that boardroom is not going to listen to you.

[00:24:04] Joe Colantonio Love it. So if someone's listening to this and they're a QA or QE or a tester, how do they know where to start? Because, as you said, how do you know what you don't know within your organization? How do you make sure you are collecting the right data, especially the monetary data, so the big picture can be seen, and you're not missing segments or slices here and there that add up once you have them all in the proper place?

[00:24:28] Parasar Saha The process is simple. You've got to talk to people outside of QE as well. Ask them, how can I help? When you ask that question, the person will tell you how you can help. Then ask: how do you measure whether what I'm giving you is helpful? If you have that conversation, it will be helpful. If you only talk inside QA, you won't ever get the solution. That's what I always think about. If you look at my travel patterns, about 30% of my travel is to QE conferences, and the rest are non-QE conferences. I'm trying to understand how they are doing and how they value QE. I was at a conference yesterday, a CX conference, an AWS contact center conference, and people were talking about how much change is happening in contact centers with AI coming in, and how QE can help. When you sit in those conversations, you understand what they are looking for, and you can break into that. A lot of the time, that is what we are missing. We have been very inward-looking organizations. Now, how do we come out of our shell and open up to what people are looking for? I would even want to see, in our QE conferences, how we bring in C-level people and tell them what we are looking for: not just "these are my AI use cases," but what are the AI use cases they are looking for that we can help with. That conversation, that bridge, will help us reach a wider audience and make QE valuable to the organization.

[00:26:13] Joe Colantonio Great advice. OK, Parasar, before we go, is there one piece of actionable advice you can give to someone to help them with their QE automation efforts? And what's the best way to find or contact you?

[00:26:22] Parasar Saha Actually, you know what? I will quote what I heard yesterday from one of the leaders, and it's sticking with me even now: measure to manage. You should be able to measure yourself. If you cannot measure, you cannot manage yourself. Those were the exact words a leader said at the conference I was at yesterday, so I wanted to mention that. And if you want to reach me, just Google me and you will find me. I'm on LinkedIn; feel free to send me a DM and I will respond. And my email is parasar.saha.digy4.com

[00:27:03] And we'll have links to all these in the comments down below.

[00:27:06] Thanks again for your automation awesomeness. Links to everything we covered in this episode can be found at testguild.com/a553. And if the show has helped you in any way, why not rate it and review it in iTunes? Reviews really help in the rankings of the show, and I read each and every one of them. So that's it for this episode of the Test Guild Automation Podcast. I'm Joe, and my mission is to help you succeed with creating end-to-end, full-stack automation awesomeness. As always, test everything and keep the good. Cheers.

[00:27:40] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com where you become part of our elite circle driving innovation, software testing, and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.

[00:28:23] Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.
