About this DevOps Toolchain Episode:
Discover how to revolutionize software development by integrating AI into the DevOps cycle. In this episode of DevOps Toolchain, we host a riveting discussion with Chris Navrides of DevTool.AI and Jason Arbon of Testers.ai. They delve into how AI is increasing efficiency, aiding in coding, debugging, testing, and deployment while discussing the importance of humans in quality control. The conversation also covers the initial challenges of incorporating AI, the need for new developer skills, and the ethical considerations in AI-enabled software development. Tune in to explore the future of AI in DevOps.
TestGuild DevOps Toolchain Exclusive Sponsor
SmartBear believes it’s time to give developers and testers the bigger picture. Every team could use a shorter path to great software, so whatever stage of the testing process you’re at, they have a tool to give you the visibility and quality you need. Make sure each release is better than the last – go to smartbear.com to get a free trial instantly with any of their awesome test tools.
About Chris Navrides
Chris Navrides is the CEO of Dev Tools AI, working to help developers avoid having to fix broken and flaky UI tests. Prior to Dev Tools, he was the VP of Engineering at test.ai, which built the first AI-based mobile and web testing solution. Before that, Chris worked at Google on automation testing for Google Play and mobile ads, and at Dropbox on their mobile platform team. Chris received his Bachelor's and Master's from the Colorado School of Mines.
Connect with Chris Navrides
- Company: www.dev-tools.ai
- LinkedIn: www.linkedin.com/in/chris-navrides
- YouTube: UCuWA4yupjJ5YzMzEC-tUJfw
- Git: dev-tools-ai
Connect with Jason Arbon
Jason is the Founder of Testers.ai, co-author of the book How Google Tests Software, frequent speaker, and all-around AI and testing expert.
- Company: testersai.com
Rate and Review TestGuild DevOps Toolchain Podcast
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:01] Get ready to discover some of the most actionable DevOps techniques and tooling, including performance and reliability for some of the world's smartest engineers. Hey, I'm Joe Colantonio, host of the DevOps Toolchain Podcast and my goal is to help you create DevOps toolchain awesomeness.
[00:00:17] Joe Colantonio Hey, it's Joe. Welcome to another episode of the TestGuild DevOps Toolchain Podcast. Today we'll be talking with Chris and Jason all about rethinking the DevOps cycle with AI, and probably a whole bunch of other DevOps AI-type topics. If you don't know, Chris is the CEO of Dev Tools AI, working to help developers not have to fix broken and flaky UI tests. Prior to Dev Tools, he was the VP of Engineering at Test.ai, which built the first AI model-based mobile and web testing solution. Before that, Chris worked at Google as an automation engineer for Google Play and mobile ads, and at Dropbox on their mobile platform team. Also, Jason is a repeat guest here, and he's the founder of Testers.ai, author of the book How Google Tests Software, a frequent speaker, and an all-around AI-in-testing expert. He's one of the few people I actually really listen to when they talk about AI in testing, because there's a lot of misinformation out there, and I think we're going to set a lot of things straight. So you'll want to stay all the way to the end to hear all this awesomeness.
[00:01:17] This episode is brought to you by SmartBear. As businesses demand more and more from software, the jobs of development teams get hotter and hotter. They're expected to deliver fast and flawlessly, but too often they're missing the vital context to connect each step of the process. That's how SmartBear helps. Wherever you are in the testing lifecycle, they have a tool to give you a new level of visibility and automation, so you can unleash your development team's full awesomeness. They offer free trials for all their tools, no credit cards required, and even back it up with their responsive, award-winning support team. Shorten your path to great software. Learn more at SmartBear.com today.
[00:01:59] Joe Colantonio Hey, Chris and Jason welcome back to the Guild.
[00:02:06] Chris Navrides Hey, thanks for having us.
[00:02:07] Jason Arbon Thanks, Joe.
[00:02:08] Joe Colantonio Good to have you back. I always botch bios so is there anything I missed in your bio that you want the Guild to know more about?
[00:02:15] Jason Arbon Yeah, just the failure of my company. Other than that, no. No, really. We weren't able to raise a B round. That's another podcast, and I'm happy to share the details. But since we couldn't raise a B round, I got really creative and started a new company called Testers.AI.
[00:02:32] Joe Colantonio Oh! Testers.AI. Nice. Nice.
[00:02:34] Jason Arbon Yeah. So still focused on AI/ML and the same old stuff.
[00:02:39] Joe Colantonio Sweet.
[00:02:40] Jason Arbon Actually same old stuff all new technology.
[00:02:42] Joe Colantonio Oh, nice. Perfect. Even better. How about you, Chris?
[00:02:45] Chris Navrides Yeah. Still plugging away. And we have a new product as well that does AI-based code reviews. Now, with the advent of large language models, we're looking at where we can apply those within the DevOps toolchain, and what that means in terms of where and how we're going to be developing in the next two years or so.
[00:03:07] Joe Colantonio Nice. I would think code review fits nicely into the DevOps toolchain, helping developers. A lot of the time, when people think of AI in the DevOps toolchain, they think of functional automation, maybe because that's mainly where I'm from. So where do you see AI now impacting DevOps, or DevOps engineers?
[00:03:26] Chris Navrides Yeah, I think from a DevOps standpoint, AI is amazing. The large language models are able to look at your logs, synthesize them, and write up reports for you. Traditionally, you'd have to train a model specifically on your data. Now it's able to just suck in the data, read it like a human would, and then summarize and report on it. I think that's the biggest impact we'll see on the DevOps side with monitoring. You'll also see where it can be applied to coding and code review. It'll affect each link of the chain individually, and it'll be interesting to see over the next year or two where each of those gets disrupted. But I do believe all of those areas will be disrupted by large language models like OpenAI's.
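The "no per-company training, just feed in the raw logs" idea Chris describes can be sketched in a few lines. Everything here is illustrative: `call_llm` is a hypothetical stand-in for any chat-completion API, stubbed with a canned response so the sketch runs offline.

```python
# Sketch: summarizing raw logs with an LLM instead of a custom-trained model.
# call_llm is a hypothetical stand-in for a real chat-completion client,
# stubbed here so the example runs without a network call.

def call_llm(prompt: str) -> str:
    # Placeholder: a deployment would send `prompt` to a hosted LLM
    # and return its free-form text response.
    return "Summary: 2 ERROR lines, both payment-service timeouts."

def summarize_logs(log_lines: list[str]) -> str:
    # No per-company training step: the raw log text goes straight
    # into the prompt and the model reads it "human-like".
    prompt = (
        "Read these production logs and write a short incident report:\n"
        + "\n".join(log_lines)
    )
    return call_llm(prompt)

logs = [
    "2023-05-01 10:00:01 INFO  checkout ok",
    "2023-05-01 10:00:02 ERROR payment-service timeout after 30s",
    "2023-05-01 10:00:05 ERROR payment-service timeout after 30s",
]
report = summarize_logs(logs)
print(report)
```

The whole "monitoring report" feature reduces to prompt construction; the model replaces the bespoke classifier you would previously have trained.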
[00:04:11] Joe Colantonio Jason, what do you think? How it applies to DevOps?
[00:04:15] Jason Arbon Faster than we realize. I think we'll see a lot of the roles that people play being automated by stuff powered by LLMs, because the cool thing is that it can act like a person. It starts to sound like a person these days. It's not just a neural network that classifies cat, dog, or hot dog; it can actually reason like a person. If you look around, we're doing things like AutoGPT, where you can have high-level tasks broken down into subtasks, farmed out to other bots, and they can even roll up the results. That starts to sound like a DevOps team, an engineering team. And then you can start to imagine: the AI produces the code, reviews it, and says, oh, it's not that great, there are bugs in it. But guess what you can do? You can now run that code via that chat interface, find the breaks, pass those breaks back to the original LLM, and it can fix them. That again sounds like a DevOps test kind of loop. I think we'll see what is traditionally human handoff in Slack, and hardcoded APIs, handed off to more natural-language automation flows in the next 18 months.
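The run-find-breaks-feed-back loop Jason describes can be sketched as below. `propose_fix` is a hypothetical stand-in for prompting an LLM with the code plus its traceback; here it "knows" one fix so the loop terminates offline.

```python
# Sketch of the generate -> run -> feed-errors-back repair loop.
# propose_fix stands in for an LLM call; it is stubbed with one
# hardcoded repair so the example is self-contained.

import traceback

def propose_fix(code: str, error: str) -> str:
    # Placeholder: a real system would prompt an LLM with `code` and
    # `error` and ask for a corrected version.
    return code.replace("1 / 0", "1 / 1")

def run_until_green(code: str, max_rounds: int = 3) -> str:
    for _ in range(max_rounds):
        try:
            exec(code, {})              # run the candidate code
            return code                 # no exception: done
        except Exception:
            err = traceback.format_exc()
            code = propose_fix(code, err)   # hand the break back to the model
    raise RuntimeError("could not repair code")

fixed = run_until_green("result = 1 / 0")
print(fixed)  # result = 1 / 1
```

The interesting property is that the loop needs no hand-written error heuristics: the raw traceback is the feedback signal.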
[00:05:19] Joe Colantonio I'm just assuming when people hear that, they're kind of freaking out, like, what? It's going to replace DevOps? Do you see it replacing people, or is it just replacing titles with new titles?
[00:05:29] Chris Navrides Which one do you want for the clickbait?
[00:05:32] Joe Colantonio Clickbait.
[00:05:36] Jason Arbon I'll say both are true. There was a really good presentation I watched last week where someone talked about just automation in general, forget AI and all these other buzzwords. The funny thing is, when you automate things, everyone thinks it'll be a cost savings, because once you automate, no humans are needed. The reality, in almost every kind of general automation product, and I think there was an informal study related to this, is that once things become more efficient and automated, you do more of that thing. So imagine the next 12-18 months: it won't be a sudden mass loss of jobs. People will just be shipping five times faster, five times more often, and with five times more confidence. That's really what I think will happen in the near term. But ultimately, you can see where this is going. It's exponential, right? So I think anyone who predicts more than 18 to 24 months out is confused at best, or disingenuous at worst.
[00:06:28] Chris Navrides Yeah, I'd like to add on to that. It's a valid point, and I think very similarly, because ultimately these companies are profitable today. So if you can take one engineer and 5x them, and you're already profitable with five engineers at 1x, you're ahead. I don't think there's a real incentive for most companies, if they're in a competitive space and not just trying to cost-cut, to cut their engineers. And if their competitors are doing the same, it just amplifies the gap they would have to close. I don't think you can really fire an engineer right now and assume that AI will take over. But you can start to assume that these engineers are going to be far more efficient, and that's going to be, I think, the key value add.
[00:07:13] Joe Colantonio So I think that's key: being efficient. Maybe I'm over-optimistic, but I heard a presentation where they said that if an engineer doesn't take advantage of AI, they'll be replaced, while an engineer with AI is only going to be more productive. This is what I saw with automation in the early days: testers that didn't invest in automation and just did straight-up manual testing were eventually replaced by people who used automation tools. I see this as similar, but I don't know. It seems like AI is a different kind of beast. Am I being too optimistic?
[00:07:45] Chris Navrides No, I don't think so. There are still testers today, right? With all the automation tools and all the automation advances, there are still a lot of open positions for manual testers. And I think there are going to be some high-level hurdles that will have to be overcome that right now are not solved, and it's unclear how achievable they are, or at what cost. Things like being able to take in the whole context of a code base. Large enterprises have massive code bases that you have to understand before you can write a line of code, and LLMs don't have the ability to understand a specific code base at that scale. The other aspect is trust. There are articles like the one about Samsung saying their code leaked because their engineers were using ChatGPT, so there's a lack of trust in AI: where is my data going? These models are also going to have to be able to be siloed so that a company can trust them. So you have to overcome some of these hurdles. And to Jason's point, it's unclear right now how much context can be loaded in. Right now it's 8,000 tokens; it'll soon be 32,000 tokens, but the cost, I think, doubles just to do 32,000 tokens. And the number of tokens you need to make this work for a large code base is going to be millions. So what does that cost, and is it worth a human salary? Unclear. A lot of what you see today started with new projects. I haven't seen AutoGPT take an existing project and add on to it. I'm sure it'll happen soon, but there are obstacles to overcome to add that value. That, I think, is the real challenge we'll see. But when you see that, then it starts to become a real question.
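Chris's arithmetic about code bases versus context windows can be checked with a rough back-of-the-envelope calculation. A common rule of thumb is roughly four characters per token for English-like text; the real count depends on the tokenizer, so treat this as an order-of-magnitude sketch only.

```python
# Sketch: why whole-codebase context is the bottleneck. Uses the common
# ~4-characters-per-token rule of thumb; exact counts depend on the
# tokenizer, so this is only an order-of-magnitude estimate.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def fits_in_context(files: dict[str, str], context_limit: int) -> bool:
    total = sum(estimate_tokens(src) for src in files.values())
    print(f"~{total} tokens vs limit {context_limit}")
    return total <= context_limit

# A toy "enterprise codebase": 500 files of ~8 KB each works out to
# roughly a million tokens, far past the 8k/32k windows discussed above.
codebase = {f"file_{i}.py": "x = 1\n" * 1300 for i in range(500)}
print(fits_in_context(codebase, 8_000))    # False
print(fits_in_context(codebase, 32_000))   # False
```

Even this modest toy repository overruns both window sizes by more than an order of magnitude, which is the "millions of tokens" gap Chris is pointing at.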
[00:09:32] Jason Arbon Yeah, I wasn't asked, but I'll jump in: we'll see the reverse of what happened before. I tried to look at the data, and I'm pretty sure I verified this once: there are more bank tellers today than there were when ATMs were invented. Population growth and other factors play into that, but it wasn't devastating. And I believe there are also more horses today than when cars were invented. So new things get invented, but I think this is going to affect different types of people. It's going to eat the entry-to-middle-level engineers. It's hard to tell whether it's macroeconomic or not, but I think you'll see a lot of slowdown in hiring junior folks across the board, regardless of discipline, because the grunt work can now be handed off. You'd rather hand it to ChatGPT than to an intern fresh out of school, because you can control it. It works whenever you're working, and you can make it work in your style. For example, I'm writing probably twice as many blog posts as I used to with ChatGPT. And I also don't ask Chris for feedback on them.
[00:10:38] Chris Navrides The quality is going up.
[00:10:39] Jason Arbon With ChatGPT, it is going up. The other thing is, like Chris said, when you get to that larger-scale thinking, the design and architectural kind of thinking, you can't fit that many tokens into an LLM today. Things are getting better, though. If I ask it to analyze a giant chunk of code, I can give it a bunch of pieces and it'll add something to it, but it will truncate and abbreviate subsections so it can fit in the token box, and I have to go manually merge that. Maybe Chris will figure out how to do that for me soon. But there's still a lot of room, I think, to grow in terms of scaling up, not just with tokens but with methods and procedures to scale it up to a pretty good size in the near future.
[00:11:31] Chris Navrides Yeah, just to add on to Jason's point, I've already seen this. I've talked to friends at large tech companies, and they were saying the amount of time they're spending with new college hires and interns is dropping, because they just personally don't see the value. If they ask a new engineer, hey, can you go write this function, they spend time explaining and explaining, and eventually one person gave up and said, I'm just going to see if I can use ChatGPT. Within 30 minutes, they got the function they had been trying to describe and iterate on with this intern. As a junior engineer, you need that mentorship. And now with ChatGPT, if it's all about the functional output of code, how much code can I write, you're going to lose some of that human component for the junior engineers, while your senior engineers are probably going to get a lot more exposure and a lot more responsibility, because they can handle it now and execute at such a high level with ChatGPT on their side.
[00:12:30] Joe Colantonio Chris, you did write an article, I believe, on DevOps with AI. I think you called it stacking. Is this what you mean when you allow developers to give higher-level input by leveraging AI to write code and then do other things? Can you explain stacking, I guess, within the DevOps cycle?
[00:12:45] Chris Navrides I'm trying to recall the exact stacking aspect.
[00:12:48] Jason Arbon I can say one thing while Chris collects his thoughts. Everybody gets up-leveled by this, even if you feel secure in senior-level engineering. The weird thing is, once you get beyond the simple functional things Chris is describing, when the thing breaks or gives you wrong code, and it usually does give wrong code, code for an older version of some API or something, it can't fix itself. Understanding how to merge it into an existing code base is actually a real high-level skill. That's the hard part of engineering: making sure you're not introducing new bugs, and being able to review and understand what the generated code is doing. It's a higher-level function, so the work gets extremely up-leveled. There's far less room at the bottom rungs for junior engineers to add value. They're actually more dangerous, I think, these days.
[00:13:43] Chris Navrides Stacking is just a technical concept. I wasn't listening to what Jason was saying, necessarily. I assume it was very, very deep and insightful.
[00:13:52] Jason Arbon Orthogonal.
[00:13:53] Chris Navrides Orthogonal. Same, same, right. So stacking is really just a technique where you generate a memory. Traditionally with these GPT models, the memory is just whatever you've chatted into it in the past. But the memory can be these larger-scale code bases, for instance. People are starting to play in that space: could you bring in context from the code base, search the code base, find the relevant context that might be in there, and break the task down, componentize it, to that level? It's early days. We'll see. I have not yet seen this work in demos; maybe if someone has it working, they're keeping it a secret because they've offloaded their job. But I think it's going to be a concept where essentially you give it a high-level task, and then it can stack up the subtasks within it, use a reinforcement learning agent to figure out what it needs to do, take the memory, understand what's already in the code base or in this code file, pass that context to GPT, and keep the loop going until it completes.
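The stacking idea, external memory plus per-subtask retrieval plus a loop, can be sketched minimally. The `retrieve` function here is naive keyword matching (real systems typically use embeddings), and `call_llm` is a hypothetical stubbed model call, so the whole thing runs offline.

```python
# Sketch of "stacking": keep an external memory of codebase chunks,
# retrieve only the relevant pieces per subtask, and loop through the
# stack of subtasks. retrieve() is naive keyword overlap (real systems
# use embeddings); call_llm is a stubbed placeholder for a model call.

def retrieve(memory: dict[str, str], query: str, k: int = 2) -> list[str]:
    # Score each memory chunk by how many query words appear in it.
    scored = sorted(memory.items(),
                    key=lambda kv: -sum(w in kv[1] for w in query.split()))
    return [text for _, text in scored[:k]]

def call_llm(prompt: str) -> str:
    # Placeholder model response: echo the subtask it was given.
    return f"done: {prompt.splitlines()[0]}"

def run_task(subtasks: list[str], memory: dict[str, str]) -> list[str]:
    results = []
    for sub in subtasks:                         # the "stack" of subtasks
        context = "\n".join(retrieve(memory, sub))
        results.append(call_llm(f"{sub}\nRelevant code:\n{context}"))
    return results

memory = {
    "auth.py": "def login(user): ...",
    "cart.py": "def add_to_cart(item): ...",
    "db.py":   "def query(sql): ...",
}
out = run_task(["find the cart code", "write a test for add_to_cart"], memory)
print(out)
```

The point of the pattern is that only the retrieved slices ever enter the prompt, so the code base can be far larger than the model's context window.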
[00:15:02] Jason Arbon Okay, Chris is getting all worked up. We're also talking about what's happening right now; by the time this is pushed to the web, some of it may be irrelevant. There are already papers coming out in the last couple of weeks, even last week, on this. The complexity and CPU cost go up exponentially relative to the number of tokens, but I've seen some papers where they're actually making it sub-linear. Meaning you could maybe get a million tokens if you just waited, say, 10 seconds instead of 2: there's a huge trade-off available in the number of tokens the LLM can process. I'm a little optimistic on this. Model sizes are getting smaller too, so they'll fit on phones soon, and that also means less memory and less compute to execute them. We'll probably laugh at this, because where I say 12 to 18 months it's probably 6, but I think we'll be able to fit entire, generally decent-sized code bases into the tokens pretty soon. There's real work going on to make that happen, and that's the huge bottleneck people are focusing on right now. So we'll probably solve that problem in the next 12 months.
[00:16:17] Joe Colantonio In DevOps, then, we'll have even more frequent releases because of the amount of work that can be done for us automatically.
[00:16:23] Chris Navrides There is, I think, an entire world, and I don't think it's crazy at this point to talk like this, where you basically say you're optimizing for revenue, or for clicks on an article or podcast. Then you have a whole agent that's constantly generating new variants and testing them out in production without any human input, and the entire optimization is: how many clicks, how many people can I get to buy this thing? It can reconfigure the buy-now button on an Amazon page, try colors automatically, and think of different ways to say "add now" or "buy now". You can give it the flexibility: here's the entire code for the front end, tweak whatever you want, and optimize the back-end releases too. If it thinks the database lookup time is slow, spend time optimizing that. So yeah, I think releases are going to go faster. But there's potential for a world where the releases themselves are fully autonomous: you give it a goal and say optimize for X, and it's a continuous release that's constantly testing itself, no humans in the loop at all. As long as you give it parameters, the guardrails, like don't crash, optimize for this thing, it'll just go and find the best path. And I think that's starting to become more of a reality, or a probability.
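The optimize-for-X loop Chris imagines reduces to a generate-measure-keep cycle. In this sketch, `generate_variants` stands in for LLM-generated button copy and `measure_clicks` simulates a production A/B measurement; both names and the candidate labels are illustrative assumptions.

```python
# Sketch of the fully autonomous optimize-for-clicks loop: generate UI
# variants, "ship" them, measure the metric, keep the winner. Both the
# variant generator and the click measurement are offline stand-ins.

import random

random.seed(0)  # deterministic for the example

CANDIDATE_LABELS = ["Buy now", "Add to cart", "Get it today"]

def generate_variants() -> list[str]:
    # Placeholder for LLM-generated button copy/color variants.
    return [f"<button>{label}</button>" for label in CANDIDATE_LABELS]

def measure_clicks(variant: str) -> float:
    # Stand-in for a real production A/B test measurement.
    return random.random()

def optimize(rounds: int = 3) -> str:
    best, best_score = None, -1.0
    for _ in range(rounds):                     # continuous release loop
        for variant in generate_variants():     # ship variants, no human input
            score = measure_clicks(variant)
            if score > best_score:
                best, best_score = variant, score
    return best

winner = optimize()
print(winner)
```

Swapping the stubbed functions for a real variant generator and real telemetry is exactly the "no humans in the loop" system described above; the guardrails would live around `optimize`.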
[00:17:50] Joe Colantonio So are you saying the only goal of the DevOps engineer then at that point is just setting parameters?
[00:17:54] Chris Navrides Potentially, yeah. Setting up logging, setting up the core infrastructure to actually allow it to monitor those parameters. So if you integrate OpenTelemetry, you integrate all the API call times, and you give it access to the final metric of subscriptions or purchases or whatever, it could then handle a lot of this. It can monitor the logs and say, hey, I did this push, and now I'm seeing a bunch of crashes on the backend; let me roll that back. There's, I think, a world where if you have enough of these agents, they can self-optimize. That's my personal thought on the matter, anyway.
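The push-then-roll-back guardrail Chris describes can be sketched as a simple check: compare the post-deploy crash rate against the pre-deploy baseline and revert on a spike. The crash-marker string, threshold, and return values are illustrative assumptions, not any particular tool's API.

```python
# Sketch of the monitor-then-roll-back guardrail: after a deploy, compare
# the crash rate against the pre-deploy baseline and revert on a spike.
# The "CRASH" marker and the spike threshold are illustrative choices.

def crash_rate(logs: list[str]) -> float:
    crashes = sum(1 for line in logs if "CRASH" in line)
    return crashes / max(1, len(logs))

def guard_deploy(baseline_logs: list[str], post_logs: list[str],
                 spike_factor: float = 2.0) -> str:
    before = crash_rate(baseline_logs)
    after = crash_rate(post_logs)
    if after > spike_factor * max(before, 0.01):
        return "rollback"        # the agent would revert the push here
    return "keep"

baseline = ["ok"] * 98 + ["CRASH"] * 2     # 2% crash rate pre-deploy
after_push = ["ok"] * 80 + ["CRASH"] * 20  # 20% crash rate post-deploy
print(guard_deploy(baseline, after_push))  # rollback
```

In Chris's picture, an LLM agent would own both the threshold decision and the rollback action; the DevOps engineer's job becomes wiring up the telemetry this check consumes.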
[00:18:33] Joe Colantonio Jason?
[00:18:34] Jason Arbon There you can tell, you know me enough, Joe.
[00:18:34] Joe Colantonio Eyes one the bug you like.
[00:18:41] Jason Arbon Just that's a problem advance. The doctors were like it's okay. Good thing my kids it's not it's not much.
[00:18:55] Joe Colantonio That's right.
[00:18:55] Jason Arbon But I think two things are important here. One is that I think Chris is right about this kind of autonomous new layer being introduced. And it's not that ridiculous, because we used to have a bunch of DBAs, database administrators. Where are they? Higher-level tools came along and the work became standardized and commoditized. The things he's talking about are the things we used to worry about with databases; it's just more complex now, because it's compute, storage, and memory distribution. So I think it's very plausible, and companies are working on that right now. But the simplest, most tactical way to think about how LLMs impact things today: here's the crazy thing, and I think this little meme is worth at least $100. People are asking LLMs to do what they do today. They're trying to reproduce the same flows they have today, the same processes. They're used to having a function where they have to code the heuristics that look at the data, and they're trying to put those heuristics into ChatGPT or these LLMs, saying, hey, if this and this and this is bad, then analyze the data I give you. The funny thing is, the LLMs are actually far more capable of reasoning than people give them credit for. I'm still discovering it myself. You can just say, here's a bunch of my App Engine logs, or here's a bunch of MySQL logs: is there anything wrong? You can just ask it that. And guess what?
It comes back with intelligent analysis, even for a REST call. Just ask, is there anything wrong? It will say things like, it looks like there's a password in the clear. And you go, oh, it can tell you things you didn't even know to look for. I think we'll transition to that eventually. But we're in a mode right now where everybody's reproducing their functional code in the LLMs. Once they can skip a level, skip a step, and just say, hey, what's wrong with this stuff, it will give you very interesting information back. You don't need to hardcode everything.
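The contrast Jason draws, hand-coded heuristics versus one open question, can be sketched like this. `ask_llm` is a hypothetical stand-in for a model call, stubbed with a canned finding so the example runs offline; the log content is invented for illustration.

```python
# Sketch of the point above: instead of hand-coding "if this and this is
# bad" heuristics, hand the raw logs to the model with one open question.
# ask_llm is a stubbed placeholder for a real model call.

def ask_llm(prompt: str) -> str:
    # Placeholder: a real call would return the model's free-form analysis,
    # including issues you never thought to write a rule for.
    return "Warning: the request on line 2 appears to log a password in clear text."

def anything_wrong(raw_logs: str) -> str:
    # No hardcoded rules; the entire "spec" is one open-ended question.
    return ask_llm("Here are my server logs. Is there anything wrong?\n" + raw_logs)

logs = (
    "GET /health 200\n"
    "POST /login user=joe password=hunter2 200\n"
)
finding = anything_wrong(logs)
print(finding)
```

The difference from the earlier summarization sketch is the open-ended question: the model can surface issues, like the cleartext password, that no pre-written heuristic was looking for.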
[00:21:11] Joe Colantonio What do you think about the adoption of AI in DevOps, though? Because you're both so close to it, it seems like everyone's doing it. But even with automation, I know there are still companies that have no automation in place. So for AI, how do you see adoption in DevOps? Is it something you think the tool vendors will start incorporating, so that people who already use the tools will just get AI? Or is it going to require someone to take the initiative to incorporate AI into the DevOps pipeline? If that makes sense.
[00:21:40] Chris Navrides Yeah. Jason, you go this time. I'll be polite and gentlemanly.
[00:21:45] Jason Arbon Yes, and you can correct me after. I think it's the same story: the power of AI is that it's generalist, it generalizes. The problem with test automation was that everything was hardcoded to one product. The power of AI is that it can work on an arbitrary number of different products, and it has context across products; it can be reused. I think the big advantage goes to the vendors, the Datadogs and Splunks of the world: people just get the magic of AI because they have a subscription, and it will just appear in their tooling. Similarly to testing, I actually advise people not to try to build the AI themselves; the testing vendor world is another story for another podcast. It's very difficult to learn, very difficult to deploy, and very difficult and expensive to maintain. The reality is only the Googles and Dropboxes of the world can really deal with the complexity of maintaining an AI system in production, because not only is it complex and outside your core value add, but vendors can do it cheaper at scale. And you actually can't find these machine learning engineers; there aren't enough of them on the planet. Even Google has a hard time finding enough AI engineers for all their products; literally, they don't have enough. The odds that some bank in the middle of America is going to have access to great AI engineers to deal with all that complexity themselves are very low in the near future. Short answer: vendors, I think, are the major path to adoption.
[00:23:10] Chris Navrides Yeah, I think similarly. The vendors are going to be the better place for it to live. I can imagine a Datadog or a Splunk being able to add a ChatGPT that's trained on your custom data automatically, with the click of a button, already integrated in all of those components, versus a DevOps engineer being expected to learn AI, understand it, and rebuild all the stuff that, say, Splunk or Datadog has done, just to integrate the AI. There's low-hanging fruit today that people can use. Like Jason was saying, you can call ChatGPT with your App Engine logs and it can tell you, hey, your passwords are in the clear, there might be an issue. That's awesome. But then there's the entire productization around that. It's easy to get that MVP, but to make it scale, make it stable, make sure it can handle the volume, and do it at an efficient cost: those are the things where it starts to break down beyond that initial piece. And I think the vendors are going to spend the time to figure out how to make it scale. The interesting part to me, as I think about this, is not necessarily the world of today but, going back to what we were talking about earlier, the world of tomorrow. Everyone right now is thinking: how do AI and LLMs and OpenAI affect my products and services of today? But map out from where we are today to where we're going. If everyone were already using AI to build their products, and AI to maintain and scale their revenue, what would the DevOps world need to look like for that? What does that world look like from a coding perspective, from a product perspective?
What does it look like if, instead of a team launching ten features this month, they can launch 100 features, or a thousand? What does that do to the whole people side? Do you need 1,000 PMs, or will LLMs take over the PM job too? If you're 10x-ing or 100x-ing the number of features you can launch, does that mean you need 10x the number of DevOps engineers you have today, or do you make them 10x more efficient before you can even launch those 10x products? I feel like it's a broad front that's going to have to move forward together. If you project out, things are going to have to change. If everyone's launching 10x more stuff in the product world, you're going to need 10x more server capacity, or 10x more monitoring capability. Those, I think, are going to be the next limiting factors.
[00:25:51] Joe Colantonio You both mentioned how junior developers and junior DevOps engineers seem to be getting squeezed out. What skills should a DevOps engineer focus on, then? Jason mentioned not to even think about building the AI yourself because it's so complicated. So what should people be doing? And I know you said it's foolish to predict past 18 months, but are there any skills you see that will endure over the years as a DevOps engineer?
[00:26:15] Chris Navrides Podcast?
[00:26:15] Joe Colantonio Yeah, I hope so.
[00:26:19] Jason Arbon You're a genius, Joe. We laugh, but you'll be the guy that shuts the lights off.
[00:26:27] Joe Colantonio That's right.
[00:26:29] Jason Arbon Going on in the in the room. You're like.
[00:26:32] Chris Navrides Well, one human and probably 1000 bots.
[00:26:34] Joe Colantonio Right, right. Yeah, yeah. That's right. Yeah.
[00:27:51] Joe Colantonio Everyone's in the same boat. If for some reason AI replaces all DevOps jobs, it'll replace lawyer jobs and everything else too. So something would have to give. Maybe some sort of universal wage or something? I don't know.
[00:28:05] Jason Arbon Yeah, there'll be a lot of podcasts. No, I think that universal income stuff is somewhat rational. When I was working early on at test.ai (Chris, remember this, I'm nuts) the lawyers definitely didn't like it in the boardroom, but I wanted to give equity to the bots and also tax them, with the tax refunded back to the humans through bonuses. We need to actually figure out schemes like that. Sorry, this is way above my pay grade, but if you really look at the trend of history in technology, the majority of the US economy is services based, and it'll just be more services, so everyone will get their nails done, Joe, not just you. And we'll watch a lot more Netflix, auto-generated Netflix with whatever characters you want. I think it'll just gravitate toward a world of leisure. Maybe that's a little naive, but I believe it also trends toward surplus, net net. People will be more creative and have more fun. Rather than getting up in the morning and worrying about servers going down, you might think about new topologies and new compute paradigms. Or maybe you spend all day programming, but you're programming your entertainment, right? And making it personalized. Personalizing all these services may be a chore of the future, and it won't happen overnight, but I think it's already hitting industry today.
[00:29:29] Chris Navrides Yeah, I think somewhat similarly. One of the issues of today is that you have to have a job to make ends meet, and it has to be a high-enough-paying, specialized job. That might change, right? The more creative, artistic, passion-driven jobs might start to have real value. Things that help the community, and volunteer work, might become real options as well.
[00:29:54] Jason Arbon Less apocalyptic than my take. It might be a happier podcast.
[00:30:01] Chris Navrides I think the other thing to consider is that in the technology space, as things have become more democratized, it has allowed smaller entrants and more niche areas to sprout up. Before, say, AWS came out, to start a website you had to have a server and pay for an entire server rack and all that. Now you can have a website on a server running in the cloud, always on, for five bucks a month. It really lowers the barrier to entry. And this might do the same: if you have an idea for an app you want to build that does X, Y, or Z, you can now go build it with ChatGPT without having to spend months learning Android or iOS. To Jason's point, you don't have to learn these new languages. You can execute that vision without having to invest all the time you would otherwise spend learning. So I think, maybe in the short or medium term, you'll see an influx of creativity and an influx of different services and apps. The low-level, generic types of positions may become less interesting, but that will allow people to specialize in interesting and creative ways that we just don't know about yet. And so, to the initial question: if I were a junior engineer, I'd focus on understanding the technology, understanding the business, understanding users, and how to think through that process at the architecture level: what's the roadmap, and how do you break a big problem down into manageable chunks? That, I think, is still going to be key to success, at least in the short to medium term. Once you get a problem down to a small enough level, AI can handle it automatically.
So if you break it down to the method level and write a description, it can generate the code for you at that level. And I think it'll keep working up from there: next it'll be able to write a class for you, then microservices, and so on. But if you keep that mentality right now of: what is the core business need? What's interesting to me? What am I willing to spend time on and learn about? Those are the key things. And then this is a powerful tool that will let you accelerate taking that idea into a product.
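The method-level workflow Chris describes can be made concrete. The pair below is a hypothetical illustration, not output from any specific tool: the human writes only the one-line description (the docstring), and an assistant like ChatGPT produces a body of roughly this shape.

```python
# Hypothetical example of method-level generation: a person writes the
# docstring as the "description"; an LLM fills in the implementation.
def dedupe_preserving_order(items):
    """Return a new list with duplicates removed, keeping the first
    occurrence of each item in its original order."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result
```

The point is the division of labor: the description carries the intent, and the generated body is small enough to review and test in isolation.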
[00:32:25] Joe Colantonio Great. There's one other question I want to ask. Jason, you recently touched on this in your latest post. As AI becomes more and more involved in things like DevOps, how can companies make sure ethical considerations and human values are not compromised, that the models are not biased, that they're not doing bad things and are actually doing what they should be doing? It's the last question, but if you could sum it up, how can that be done?
[00:32:52] Jason Arbon My main point on that is that all models are biased. They just are, by nature. The only unbiased AI system is one that hasn't been trained yet; you bias it by training it. There are good and bad biases, and that's an opinionated thing. It's often business-specific, too, and there can be good business-specific biases. So it's more about understanding what biases you do and do not want in the system, and that evolves over time; you just have to be open about it. We've only got one minute, but I'd like to save the planet for AI and save DevOps from AI. The most important thing is reinforcement learning, in terms of guiding the AI to behave the way you want, and we'll see a bunch of this coming out in the next six months: you take the core models and you personalize them, or customize them, using reinforcement learning and other techniques to make them conform to your biases. But there are going to be biases. The question is part of that leveling up for the DevOps person. Today some of the smartest DevOps people just say: I don't like crashes, I like efficiency, I like failover, right? They need to up-level and ask: what cost-benefit ratio is there for crashes, or for bad connections to my infrastructure? What is the tolerance of this business at different times of the day, different times of the week, or during the Christmas shopping season? So it's going to be about applying those policies and making sure the AI conforms to them. Again, I think it's going to be a collaboration between the DevOps engineer and the infrastructure they have at hand, like their Datadog and all these other tools.
It's going to be a collaboration between them. Kevin Powell actually put this pretty well last week. He called it an AI partnership. It's not AI or a human, and it's not even just collaboration, because the AI doesn't really collaborate, you know. We're still in charge, but we're partnering with those AI systems and those vendors to impose our will on them. Right? We want all of our DevOps tools to act like Netflix: they know we don't like horror shows and they know we like cartoons. That was a little too much, but my kids use my Netflix, so it's awesome.
[00:35:13] Chris Navrides We got very personal there.
[00:35:15] Jason Arbon Bananas in Pajamas, by the way, is awesome. But you want it personalized. There's a lot of work in that, but it takes you out of the drudgery; it's a higher-level thing. You have to decide which biases you do want, which biases you want to eliminate, and help impose that will on the AI.
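The business-specific tolerance Jason describes, with different error budgets at different times of day or during the shopping season, can be written down explicitly so an AI-driven monitoring layer is held to a human-owned policy. A minimal sketch in Python; the thresholds and rules here are invented for illustration, not taken from any real system.

```python
from datetime import datetime

# Hypothetical error-budget policy: the acceptable error rate varies
# with business context instead of being a single fixed number.
def allowed_error_rate(now: datetime) -> float:
    # Tightest tolerance during the holiday shopping season (Nov-Dec).
    if now.month in (11, 12):
        return 0.001
    # Looser tolerance during low-traffic overnight hours.
    if now.hour < 6:
        return 0.02
    # Default daytime tolerance.
    return 0.01
```

The human sets and revises the policy; the automation only enforces it.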
[00:35:35] Chris Navrides The human intelligence layer, I think, will still need to be applied here, where you can look at the system and actually say: this is correct, this is incorrect. But there's a theoretical question I have, which is: is there such a thing as no bias? In the DevOps world, if you optimize the servers to support a new feature that demands a fast, low-latency connection, you're now biasing against people with slower connections, maybe in rural areas of the country where there isn't as good a connection, or someone on 3G while driving. You're optimizing for the user experience of someone on a Wi-Fi connection. So, similar to what Jason and Kevin are saying, I think you need a human in the loop to help guide it and figure out what the business needs and core competencies are.
[00:36:42] Jason Arbon Yeah. And if you don't need that human in the loop, you're commoditized, and a lot of stuff will be commoditized. At some point the AI will just know: these are the optimal preferences for a streaming service, or an email service, or whatever kind of business you are. The DevOps engineer's job will be to provide the biases that are non-obvious, nonstandard, and business-specific.
[00:37:05] Joe Colantonio Cool. Okay, guys, before we go, is there one quick actionable piece of advice you can give to someone to help them with their DevOps AI efforts, and what's the best way to find or contact you? Chris, we'll start with you.
[00:37:14] Chris Navrides I've found the fetal position, just rocking back and forth as you think about this, is helpful. But no, that's a good question. To me: don't get too freaked out about it. I talk to a lot of founders and people in the AI space, and someone will say, I'm really freaking out, I'm having a panic attack about this. And the joke is: oh, is that your first one? Your first panic attack over this? Because every time something big comes out, you think, oh, this is it. And then you calm down and realize it's just a tool like any other tool. It can really help accelerate your career; you don't have to view it negatively. It has a lot of real benefits for society, the world, and you personally, if you know how to use it and leverage it. If you have a side project you've always wanted to do, it can help you overcome the obstacle of not having the resources; it can be that resource. If you wanted to write a book, it can help you write a book, or a new app, or whatever. It has that capability. View it positively rather than fearing that your job is going to be disrupted, because it's going to be here. Accepting it and understanding how to use it is a much better approach than fearing it.
[00:38:35] Joe Colantonio Awesome, Jason?
[00:38:36] Jason Arbon You missed a book plug opportunity.
[00:38:40] Chris Navrides Yeah. Check out my new book coming on Amazon.
[00:38:43] Jason Arbon No, Joe's got a new book coming out.
[00:38:44] Chris Navrides Oh, yeah. Sorry.
[00:38:48] Jason Arbon I'm going to second what Chris said. But I'd say two things. One, on a very technical front: the thing people are not doing today, across DevOps in general, is this. People look for anecdotal evidence that AI is good or bad, as if it's boolean. If you're trying to measure the goodness or badness of any AI-based system at all, AI software or products built on top of it, and you're not doing sampling, you're very likely doing it wrong. So think about how to do that; it's really measurement at scale. If you're not doing it, you don't understand the nuances of the system you're measuring. And I would add: the best way to push AI and automation forward is to get an enterprise license for Testers.AI.
[00:39:38] And for links to everything of value we covered in this DevOps Toolchain show, head on over to TestGuild.com/p113, and while you're there, make sure to click on the SmartBear link and learn all about SmartBear's awesome solutions to give you the visibility you need to deliver great software. That's SmartBear.com. That's it for this episode of the DevOps Toolchain show. I'm Joe. My mission is to help you succeed in creating end-to-end, full-stack DevOps toolchain awesomeness. As always, test everything and keep the good. Cheers.
[00:40:10] Hey, thanks again for listening. If you're not already part of our awesome community of 27,000 of the smartest testers, DevOps, and automation professionals in the world, we'd love to have you join the fam at TestGuild.com. And if you're in the DevOps, automation, or software testing space, or you're a test tool provider who wants to offer real-world value that can improve skills or solve a problem for the Guild community, I'd love to hear from you. Head on over to testguild.info and let's make it happen.