About this DevOps Toolchain Episode:
Today, we have an exciting episode for you. We're joined by Naveen Krishnan, a Microsoft Solution Architect who's passionate about AI and cloud computing. Naveen's diverse career spans working with startups in India, consulting in the US, and bringing his expertise to Microsoft in 2021.
In today's episode, we'll explore how Naveen leverages AI tools like Microsoft Bing Copilot and Office 365 Copilot, the intricacies of generative AI, and the pivotal role AI plays in enhancing Azure features. We'll also dive into large vs. small language models, the functionalities of Microsoft Fabric, and how AI transforms the developer experience, from testing and security to code generation.
And if you're curious about running language models locally or maximizing AI in your workflow, Naveen has some invaluable insights to share. Listen up!
About Naveen Krishnan
Naveen Krishnan is renowned for his passion and innovation in technology. His expertise in AI is not merely a profession but a driving force for exploring and advancing this dynamic field. Naveen is known for designing systems that boost operational efficiency and open new avenues. His AI work is deeply rooted in a commitment to harnessing its capabilities to address real-world challenges, making technology accessible and advantageous for everyone.
Connect with Naveen Krishnan
- Blog: www.AIWithNaveenKrishnan
- LinkedIn: www.navintkr
Rate and Review TestGuild DevOps Toolchain Podcast
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page. Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
[00:00:01] Get ready to discover some of the most actionable DevOps techniques and tooling, including performance and reliability for some of the world's smartest engineers. Hey, I'm Joe Colantonio, host of the DevOps Toolchain Podcast and my goal is to help you create DevOps toolchain awesomeness.
[00:00:19] Hey, today, we'll be taking you on a grand tour of AI and DevOps with Naveen Kumar Krishnan. If you don't know, Naveen is a solution architect at Microsoft. He has a deep passion for artificial intelligence and cloud computing, as you'll find out in this interview. He also brings a wealth of knowledge and experience to the table that I think you can get a lot of value from. He's here to share his insights on the latest trends in AI, how it helps you with development, the transformative power of Azure, and how these technologies are shaping the future of development and all things AI related. You don't want to miss this episode. Check it out.
[00:00:56] Joe Colantonio Hey Naveen, welcome to The Guild.
[00:01:00] Naveen Kumar Krishnan Hey, Joe.
[00:01:01] Joe Colantonio Good to have you. I guess before we get into it. Just curious to know, how did you get into A.I.?
[00:01:05] Naveen Kumar Krishnan Yeah, it's a long journey, actually. My interest in math and computer science brought me here. I started my career 15 years back, and I worked for several startup companies back in India. In 2015, I came to the U.S. for a consulting company, and from 2015 to 2020 I was working with several enterprises, designing systems on-prem. Then 2021 is when I got into Microsoft and started working on cloud. Basically, for the last couple of years I've been into AI.
[00:01:36] Joe Colantonio Awesome. So at Microsoft, how often do you actually use A.I. to help you with your day to day job?
[00:01:42] Naveen Kumar Krishnan Well, at least once every half hour.
[00:01:44] Joe Colantonio Wow. Really? How?
[00:01:46] Naveen Kumar Krishnan Yeah. We use the Copilots available, Microsoft Bing Copilot as well as Office 365 Copilot. That's how it is.
[00:01:58] Joe Colantonio Nice. So normally when I think of AI, especially nowadays, it's more generative AI. I thought maybe you could just tell us what generative AI is before we dive deeper into all the other things that make it up?
[00:02:09] Naveen Kumar Krishnan Yeah, sure. Generative AI is basically simple, and yet it's so big. With generative AI, if you give it a question, it does the vectorization and maps all the related words so it can relate to what we're asking for. Then it runs algorithms behind the scenes and gets you the answer. There are like six or seven steps it has to go through before it can generate, transform, and produce the data you're asking for. That's basically what generative AI is. Apart from generative AI, there are a lot of other AI things, like AI for vision and AI for face recognition, which are pre-trained, basically, where we train the model for our use case and then it helps us with that specific use case. But generative AI is something that can generate answers for whatever you ask. There are billions and billions of parameters going into that model, and the model can understand what we're asking for. It can go scrape the web and get the content for you. Those are the things I see in generative AI.
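To make the vectorization idea Naveen mentions a bit more concrete, here is a minimal sketch of turning two pieces of text into embeddings and comparing them by cosine similarity. It assumes the `openai` Python SDK (v1+) against Azure OpenAI; the endpoint, key, and embeddings deployment name are placeholders, not anything from the episode.

```python
# Minimal sketch: vectorize two strings and compare them by cosine similarity.
# Assumes the `openai` Python SDK (v1+) and a hypothetical Azure OpenAI
# embeddings deployment named "text-embedding-3-small".
from openai import AzureOpenAI
import numpy as np

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder endpoint
    api_key="YOUR-API-KEY",                                    # placeholder key
    api_version="2024-02-01",
)

def embed(text: str) -> np.ndarray:
    """Turn a string into an embedding vector."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

a = embed("How do I reset my washer dryer?")
b = embed("Steps to restart the washing machine")

# Related questions land close together in the vector space.
similarity = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"cosine similarity: {similarity:.3f}")
```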
[00:03:16] Joe Colantonio Now, at Microsoft, are you actually baking AI into Azure? Is that part of your activity? Do you have anything to do with actually adding features to the cloud, or Azure, or Microsoft with AI?
[00:03:29] Naveen Kumar Krishnan No, it's a mixed responsibility, actually. I work with customers to identify use cases, then I come back, enhance things with the internal teams, and get some features rolled out. This is what I do.
[00:03:44] Joe Colantonio Nice. Just curious to know, as a developer, as an engineer, how do you test AI to make sure it's working as you expect, especially at a company like Microsoft? I don't know how they do it before they roll out to customers. Obviously you want to test it. Is that something you do or have any insights around?
[00:03:59] Naveen Kumar Krishnan No. So from a testing perspective, how do we test AI? There are a lot of capabilities that come with Azure OpenAI. It's not just that you pick the model or do the fine-tuning and your responsibility stops there, right? First, identify a model you want to use. For my use case, there are several types of models: an SLM, an LLM, or a multimodal model. First, try to fix on the model that best suits your use case. Once the model is decided, then you can go for the pattern, like what pattern you're going to use this model with. One is RAG, retrieval-augmented generation, or otherwise you go for function calling, or any other type of pattern depending on what we want the system to be. And if needed, you integrate with vision, or you integrate with text-to-speech capabilities, based on the use case again. Once that's all finalized, then you fine-tune your model, and once you fine-tune your model, that's when it works better for your use cases, because until you fine-tune, it's going to be a generalized model. Fine-tuning is the place where you give it your data and see how it should behave in those types of scenarios. And once the fine-tuning is complete, you run evaluations where you feed in model data, not the actual live data: the question you're going to ask and the actual response you're expecting. Those two combined, compared against the output you get from the model, tell you what level of accuracy it can achieve. With Azure OpenAI, what I have seen is that within a few clicks you should be able to get results on how your model is going to perform with all the data in hand.
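To illustrate the evaluation loop Naveen describes, here is a minimal sketch that sends held-out questions to a deployed model and compares the answers against the responses you expect. The client setup, the deployment name, the evaluation pairs, and the crude substring scoring are all assumptions made for the example, not the actual Azure OpenAI evaluation tooling.

```python
# Minimal sketch of an evaluation loop: ask the model held-out questions and
# compare its answers to the expected responses. Assumes the `openai` SDK (v1+)
# and a hypothetical Azure OpenAI chat deployment named "gpt-4o-finetuned".
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key="YOUR-API-KEY",                                    # placeholder
    api_version="2024-02-01",
)

# Held-out evaluation pairs: the question and the answer you expect (made up).
eval_set = [
    {"question": "What warranty does the washer dryer have?", "expected": "two years"},
    {"question": "How do I order a replacement door seal?", "expected": "support portal"},
]

hits = 0
for case in eval_set:
    resp = client.chat.completions.create(
        model="gpt-4o-finetuned",  # hypothetical deployment name
        messages=[{"role": "user", "content": case["question"]}],
    )
    answer = resp.choices[0].message.content.lower()
    # Crude scoring: does the answer mention the expected phrase?
    if case["expected"] in answer:
        hits += 1

print(f"accuracy: {hits}/{len(eval_set)}")
```

In practice the scoring rule would be richer (exact-match, similarity, or model-graded), but the shape of the loop is the same: expected response versus actual response over a fixed question set.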
[00:05:52] Joe Colantonio Nice. So you mentioned a few words, and I just want to make sure everyone understands what they are. One of them is multimodal models. I know this year that's kind of where generative AI went, more multimodal. Can you talk a little bit about what multimodal models are and maybe how they help developers more than what they had experience with a year or so ago?
[00:06:15] Naveen Kumar Krishnan Yeah, sure. A year or so ago, when ChatGPT started, you just asked a question and it gave you the data back. That's where we started. After three or four months, that evolved to vision, so you could upload an image and then ask it to do something with it. Giving it the image and your data all together is the multimodal model, and it has evolved a lot. What you see out there today is three things: it can see, it can read, and it can hear, all at the same time. That's what a multimodal model is: whatever image you upload, it can get data from it; whatever you type in, it grasps the data from it; and whatever you say, it's going to get that too. It puts them all together, and you can do all three simultaneously. That's the beauty of the latest release, GPT-4o, which has this capability. It can help health care and other domains solve a lot of problems, and retail especially needs this particular capability. That's how GPT-4o is going to solve this problem.
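Here is a minimal sketch of that image-plus-text pattern: one request that carries both a picture and a question. It assumes the `openai` SDK (v1+); the endpoint, key, deployment name, and file name are placeholders for illustration only.

```python
# Minimal sketch: send an image and a text question to a multimodal chat model
# in one request. Assumes the `openai` SDK (v1+) and a hypothetical Azure
# OpenAI deployment of a GPT-4o model named "gpt-4o".
import base64
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key="YOUR-API-KEY",                                    # placeholder
    api_version="2024-02-01",
)

# Encode a local image so it can travel inside the request.
with open("letter.jpg", "rb") as f:                            # placeholder file
    image_b64 = base64.b64encode(f.read()).decode()

resp = client.chat.completions.create(
    model="gpt-4o",  # hypothetical deployment name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What language is this letter written in? Translate it."},
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(resp.choices[0].message.content)
```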
[00:07:39] Joe Colantonio Now, it's pretty powerful. I mean, I use it for simple things. Someone had a letter and didn't know what language it was written in, so I took a picture of it, uploaded it to ChatGPT, and it translated it and told me what language it was. Pretty cool. Also, I'm terrible with names. I'll just say, how do I say this name? It has a voice that actually shows me how to say the name. How do you use it as a developer? I don't do hardcore development anymore, so just curious to know: sometimes people want to know, can you upload an image of your application and ask, how is this from a user's perspective? Any advice on UX? Can you do those types of things?
[00:08:13] Naveen Kumar Krishnan Yeah, sure, definitely. From the developer perspective, how it works is, say I have a design diagram. It was done around 1998 or 2000, and I have the diagram with the table schema and stuff; it's huge. I just upload that and ask, what is it? It's going to give you a clear picture of what technology was used, how it was designed, how it's running, and things like that. And if I add more details about the product itself, then it's going to tell me what the current product is doing, how it can evolve, and things like that. That's the beauty of it and how a developer can leverage those capabilities.
[00:08:55] Joe Colantonio Awesome. So in your development process and DevOps, how well is generative AI used? I think you mentioned it helps you with coding, but when you check in your code, do you have any AI running that can let you know how well the build is doing or anything like that?
[00:09:09] Naveen Kumar Krishnan Yeah, sure. A lot of the integrations are with GitHub, if you're using GitHub primarily, because I use a lot of GitHub things. First, it starts with security: it identifies whether there are any vulnerabilities, or whether any dependencies we have attached to the project have vulnerable pieces in them, so it's going to identify that. Then we have a lot of other bots and things that add more value, like code analysis. It does a code analysis: what percentage of this code is being reused, what percentage of this code is newly written, and where the code could be improved. In the early days, like two or three years back, we spent a lot of time doing code reviews and other stuff. With all this automated AI coming into the picture, it can recommend how the code needs to be, and if you correct it and provide the right feedback, next time it's going to learn from that. That's how I'm seeing it. That's just for GitHub, and there are a lot of other things apart from GitHub, like your artifacts. It does a scan and gives you a report of the artifact sizes, what external dependencies we're working with, how they can be cut down, and how our artifacts can be small, safe, and readily available. Those are the types of things I consider. For artifact repositories like JFrog and others, I have not seen much AI infused into those products yet, but I'm expecting a lot very soon.
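As one concrete flavor of the dependency-vulnerability signal Naveen describes, here is a minimal sketch that lists a repository's open Dependabot alerts through the GitHub REST API. The owner, repo, and token are placeholders, and this is just one of the many checks a GitHub-based pipeline can surface at check-in.

```python
# Minimal sketch: list open Dependabot alerts for a repository via the GitHub
# REST API. Owner, repo, and token are placeholders.
import json
import urllib.request

OWNER, REPO = "your-org", "your-repo"   # placeholders
TOKEN = "ghp_YOUR_TOKEN"                 # placeholder personal access token

req = urllib.request.Request(
    f"https://api.github.com/repos/{OWNER}/{REPO}/dependabot/alerts?state=open",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
)

with urllib.request.urlopen(req) as resp:
    alerts = json.load(resp)

# Print a one-line summary per open alert: severity, advisory, affected package.
for alert in alerts:
    advisory = alert["security_advisory"]
    package = alert["dependency"]["package"]["name"]
    print(f"{advisory['severity'].upper()}: {advisory['summary']} ({package})")
```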
[00:10:43] Joe Colantonio Nice. When I was an automation architect, I used to have a lot of code reviews for 8 to 10 sprint teams, and they always flagged me as a blocker because I had so much code to review. That's a great use case, using it for code reviews. That's really cool. Is there anything you see on how you can get the most out of using AI? Sometimes I see things on LinkedIn where a developer or tester says it doesn't work or it's hallucinating, but sometimes I think they're not really leveraging it as well as they could. Maybe I'm wrong, but any tips for how you get the most out of generative AI? Not just that it returns information, but that it returns really correct information or really useful information.
[00:11:21] Naveen Kumar Krishnan Yes. One thing is, it starts from the place where you actually develop, the IDE. It starts from the IDE, where you can integrate with GitHub Copilot, and it helps you with all the pieces of code. You don't have to write every piece yourself; it can create classes, functions, metrics, and everything for you. And it's not like ...; it's not going to do that, because once you give it the logic, it's intelligent enough to put the right names for variables, functions, and parameters. It also thinks of the reusable cases, like how the same class can be reused, and where and how. Those types of things have also been added to GitHub Copilot. So it's GitHub Copilot with your IDE, and you write the right code and then you submit the code. Once the code is reviewed, once it's scanned for all the checks, and once the code passes all the guardrails you've set on the GitHub side, it's taken to the next level, where you can run Sonar scans and other stuff on your code. I haven't checked Sonar's latest capabilities for infusing AI into the product itself, and I don't know their roadmap, but that product has a lot of use cases where you could infuse AI. Once Sonar does all its scans ... and other stuff, then you have the artifact and you publish it. Once you publish your artifacts into your environment and do the phased rollout, it can also help you there, to know how the system is working, how users are doing, and whether the features or capabilities you added are stable or have additional bugs, whether a bug fix needs to be created for what you deployed. It can also report to you how well it has gone and how many issues there were, things like that. It's very broad, it's very end-to-end, and I don't think thirty minutes is enough to discuss the complete specifics of each and every capability that's in it.
[00:13:38] Joe Colantonio Gotcha. All right. I also have in my notes large language models and small language models. I thought maybe you could enlighten us on the difference and when you choose one over the other. I guess, what is a large language model compared to a small language model?
[00:13:51] Naveen Kumar Krishnan Yeah, sure. It's a good question. I've gotten this question from several of my friends, colleagues, and others: how do I choose, and when should I choose an LLM and when should I choose an SLM? I'll tell you some use cases, right? For example, I'm creating a chatbot. I have my application and I'm getting around 15K support calls every day for it. Say I have a washer dryer, or I have a product running on the edge, and I get a lot of support cases. How do I automate that so that before a customer even comes in and asks me a question, we try to solve the problem before they even raise a case? And even after they raise the case, or they want to send any product details, where should they go and check? These types of things. My knowledge base is huge, and it's not simple. I also have a futuristic view: if somebody shows me the defective product, I should be able to say, you can go buy this part and then fix it yourself. If you have these types of views, this kind of broad roadmap for support, and your support spend is very big today and you want to cut it down by enabling these capabilities and making things more user friendly, in that case I would strongly suggest going for LLMs, because an LLM is trained with a lot of parameters, it can go into detail, and it can do a lot of creative stuff. It can also add more value. For example, if the knowledge base has very minimal information on how to just tie a knot, the LLM can, based on the details it has and the user's question, frame something that is meaningful for the end user. In that case, you can go for an LLM. But for an SLM, what I would say is: if you have a specific use case, for example a solution where I can talk in natural language. Let's take some reporting capability. I have a bunch of reports for my C-level team. For every enhancement my C-level or my marketing team wants to see on their report, they come to me and ask me, I understand their use case first, and then I build that report for them. From the time they ask to the time they see the report is going to be a one-and-a-half-month timeframe, which is too much time to spend doing this. Why can't I automate it? What if I can ask in natural language: how many products have been sold for this particular area, for this particular quarter, at this particular event? In that case, if I want to do natural language to SQL queries, I don't need an LLM, because my task is very specific. I have my table schema, and all the model has to do is identify which table is the right one and frame a single query out of it. In that case, I would say we don't need GPT-4 to solve this; Phi-3 or any other small language model can help you solve this problem. And one other thing is cost, right? When you run an LLM, a large language model, the cost is going to be very high. SLMs are cheaper than LLMs, so you don't need to burn energy you don't really need. Basically, I would say go for an SLM in that case.
And a lot of NGOs and other nonprofit organizations want to use AI for things like email tracking and content generation. They're leaning more toward SLMs these days to be cost effective.
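As a rough illustration of the natural-language-to-SQL case Naveen describes, here is a minimal sketch that hands a small model a table schema and asks it to frame a single query. The deployment name, schema, and question are made up for the example, and it assumes a chat-completions-compatible endpoint serving a small model such as one from the Phi-3 family.

```python
# Minimal sketch: natural language to SQL with a small language model.
# Assumes the `openai` SDK (v1+) and a hypothetical deployment of a small
# model named "phi-3-mini" behind an Azure OpenAI-compatible endpoint.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key="YOUR-API-KEY",                                    # placeholder
    api_version="2024-02-01",
)

# Made-up table schema the model needs in order to frame the query.
schema = "sales(product_id INT, region TEXT, quarter TEXT, event TEXT, units_sold INT)"

question = "How many products were sold in the West region in Q2 during the launch event?"

resp = client.chat.completions.create(
    model="phi-3-mini",  # hypothetical deployment name
    messages=[
        {"role": "system",
         "content": f"Given this schema:\n{schema}\nReturn a single SQL query only."},
        {"role": "user", "content": question},
    ],
)
# Expect something like: SELECT SUM(units_sold) FROM sales WHERE region='West' ...
print(resp.choices[0].message.content)
```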
[00:17:45] Joe Colantonio Nice. What else do people ask you about once you explain the difference? Do you ever get any follow-ups? Like, okay, I know the difference between the two, but are there any other common questions you get asked?
[00:17:56] Naveen Kumar Krishnan Yeah. One other thing I've seen is: how do I run these SLMs on my local laptop? An SLM is very small, so can I run it on my laptop? How do I do that? So far there isn't enough documentation available for those cases. So what I do is take whatever questions I get very commonly and try to blog about them. It's on medium.com, AI with Naveen Krishnan, where you'll see all my blogs. If you want to run a model like Phi-3 locally, how do you use the Ollama CLI tool to run Phi-3, how do you run GPT-4o from your local machine, those types of things are in my blog, and you can get some benefit out of them.
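For flavor, here is a minimal sketch of querying a Phi-3 model running locally, assuming Ollama is installed and the model has been pulled with `ollama pull phi3`; Ollama serves a REST API on localhost:11434 by default. This is only one way to run a small model locally, not the specific setup from Naveen's blog.

```python
# Minimal sketch: query a locally running Phi-3 model via Ollama's REST API.
# Assumes Ollama is installed and `ollama pull phi3` has already been run.
import json
import urllib.request

payload = json.dumps({
    "model": "phi3",
    "prompt": "Summarize what a small language model is in one sentence.",
    "stream": False,  # return one JSON object instead of a token stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])
```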
[00:18:38] Joe Colantonio And we'll have the link for your blog in the comments down below. Okay. Another question I have in my notes: Microsoft Fabric with AI. I actually haven't heard of this before, and since you work at Microsoft, hey, you're the perfect person to ask. What is Microsoft Fabric with AI?
[00:18:54] Naveen Kumar Krishnan Yeah, so Microsoft Fabric with AI. That's one thing where I was like you; I only learned Microsoft Fabric about six to eight months back. I had to present to one of my friend's teams and explain to them what Fabric is, so that's when I went in depth and started learning. Basically, Fabric is an analytics platform with data warehousing, OneLake, and a lot of other capabilities built in. One other key feature is that you don't have to move your data or assets. If it's stored on different storage systems, even outside of Azure, you can keep your data assets where they are and use connectors and other capabilities to get the data. You don't have to duplicate the data and pay for the same data twice. That's a major feature of Fabric, as I see it. And when talking about AI integration, one thing that impressed me a lot when I was trying it is that, as a developer, if you open a Jupyter notebook in Fabric, you don't have to write any code. You just tell it, this is what I want, and it can develop that for you, and you'll see 10 to 15 lines of code for that piece. For example, there's an Excel file: can you read that Excel and store it here, and I want to get to this? It can do that. That's the Copilot that Fabric has got, so you should definitely try it once.
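As an illustration of the kind of notebook code that gets generated for a request like "read that Excel and store it here," here is a minimal sketch that reads an Excel file and saves it as a lakehouse table. The file path and table name are made up, the exact code a copilot produces will differ, and it assumes pandas plus the Spark session that Fabric notebooks provide.

```python
# Minimal sketch of the kind of code a Fabric notebook copilot might generate:
# read an Excel file and save it as a lakehouse table. File path and table
# name are made up for the example; `spark` is the session a Fabric notebook
# (or any PySpark environment) provides.
import pandas as pd

# Read the uploaded Excel file into a pandas DataFrame.
df = pd.read_excel("/lakehouse/default/Files/sales_2024.xlsx")  # placeholder path

# Light cleanup before persisting: normalize the column names.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

# Convert to a Spark DataFrame and save it as a managed lakehouse table.
spark_df = spark.createDataFrame(df)
spark_df.write.mode("overwrite").saveAsTable("sales_2024")  # placeholder table name
```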
[00:20:22] Joe Colantonio All right. Some people hearing that might be a little scared. And I don't know if this is clickbait, I just saw something on LinkedIn, I can't find it now, where a company's audio was leaked saying it's going to replace its developers with AI next year. What do you think about that? Can developers be replaced by AI? Should people be afraid after hearing, hey, I just used Fabric and it created an app for me?
[00:20:46] Naveen Kumar Krishnan My personal view: it's just artificial intelligence, right? It's not AGI or something. This artificial intelligence is going to help you do more with whatever you've been doing. For example, for a feature to go out or go live, you've been spending a month and a half with the whole process: the DevOps process, your testing, automated testing, pen testing, and the other stages you have to cross, and it takes that long. With this capability, it's going to bring that time down, basically. That's what it's going to do. It's something you can get help from. It's not going to do your job, right? You just ask it for something, and it gives it to you. And when search and those things rolled out, they didn't take any jobs in the past. So it's the same.
[00:21:47] Joe Colantonio With AI, it's always moving, always changing. How do you keep up to date? If someone's listening and they're like, how do I even get started learning this? Obviously, you're always learning this. Do you have a process that you recommend? Do you start a small project? How do you stay up to date all the time with what's happening with AI?
[00:22:04] Naveen Kumar Krishnan The first thing is how to start getting into AI. I don't know how many of you have not tried Copilot or ChatGPT; most of us, like 99.9% of us, have tried them. After trying that, we all have some assumptions about how it works and what it's doing behind the scenes. First, clear those assumptions; that's the first step to getting into AI. Know what is happening behind the scenes. That's where I would start, basically. Once you get that understanding, then AI is very easy. Then do a quick certification on Azure OpenAI or Azure AI itself, which covers both the cognitive services and the OpenAI pieces of it. That kind of gives you a glimpse of what ML is, what AI is, and other stuff. Once you get that certification, you can go read about all the models: what GPT-3.5 is and what GPT-4o is. Know more about the models. That's how I see AI. As soon as a new model comes out, read thoroughly about all the capabilities of that model, which model it's going to replace, and how it's different from the previous model. And if you have time and you're a tech person, I would say do a hands-on once and see how the model behaves for all your asks. You may have a few questions; throw them at the model and see how it works. That's how I've learned so far.
[00:23:37] Joe Colantonio Awesome. Okay, Naveen, before we go, is there one piece of actionable advice you can give to a developer or someone to help them with their AI DevOps efforts? And what's the best way to find or contact you?
[00:23:48] Naveen Kumar Krishnan You can find me on LinkedIn; I'm available and active on LinkedIn. That's the main place where I collaborate with my friends and colleagues. So that's one thing. And one other small piece of advice I would give: AI is not complex, and AI is not costly. AI is very easy compared to other stuff around here. Learn AI. It's going to help you, for sure.
[00:24:13] For links to everything of value we covered in this DevOps Toolchain Show, head on over to Testguild.com/p167. So that's it for this episode of the DevOps Toolchain Show. I'm Joe, and my mission is to help you succeed in creating end-to-end, full-stack DevOps toolchain awesomeness. As always, test everything and keep the good. Cheers!
[00:24:36] Hey, thank you for tuning in. It's incredible to connect with close to 400,000 followers across all our platforms and over 40,000 email subscribers who are at the forefront of automation, testing, and DevOps. If you haven't yet, join our vibrant community at TestGuild.com where you become part of our elite circle driving innovation, software testing, and automation. And if you're a tool provider or have a service looking to empower our guild with solutions that elevate skills and tackle real world challenges, we're excited to collaborate. Visit TestGuild.info to explore how we can create transformative experiences together. Let's push the boundaries of what we can achieve.
[00:25:20] Oh, the Test Guild Automation Testing podcast. Oh, the Test Guild Automation Testing podcast. With lutes and lyres, the bards began their song. A tune of knowledge, a melody of code. Through the air it spread, like wildfire through the land. Guiding testers, showing them the secrets to behold.