# Avoiding the AI Pilot Graveyard

**Source:** [https://www.youtube.com/watch?v=BmajUTU3gGU](https://www.youtube.com/watch?v=BmajUTU3gGU)
**Duration:** 00:36:41

## Sections

- [00:00:00](https://www.youtube.com/watch?v=BmajUTU3gGU&t=0s) **Big Models and AI Pilot Failure** - The host advises starting with the largest model then compressing it, cites Harvard Business Review's claim that 80% of AI pilots flop, and introduces Roblox AI VP Anupam Singh, a two‑time big‑data founder, to discuss what makes AI pilots succeed or die.
- [00:03:06](https://www.youtube.com/watch?v=BmajUTU3gGU&t=186s) **Demo vs Production in AI** - The speaker explains how impressive AI demos—often called “party‑trick models”—can mislead stakeholders into thinking a finished product exists, causing irrational enthusiasm and confusion between prototype outputs and the much larger effort required for a usable, production‑ready solution.
- [00:06:14](https://www.youtube.com/watch?v=BmajUTU3gGU&t=374s) **Balancing Scale and Cost in AI** - The speaker highlights how massive user engagement can lead to unexpectedly high token and cloud expenses, urging teams to select suitably sized models rather than defaulting to the largest possible ones to keep AI projects financially sustainable.
- [00:09:19](https://www.youtube.com/watch?v=BmajUTU3gGU&t=559s) **Finding AI Talent: Data Focus** - The speaker stresses that scaling AI efforts hinges on securing engineers skilled in data preparation and vector databases—abilities that are rarer and more critical than traditional model‑training expertise.
- [00:12:27](https://www.youtube.com/watch?v=BmajUTU3gGU&t=747s) **Roblox AI Enhances Creation, Users, Safety** - The speaker explains how Roblox uses AI to streamline game development, auto‑generate high‑quality avatars, enforce safety through moderation, and introduce voice capabilities for both creators and players.
- [00:15:34](https://www.youtube.com/watch?v=BmajUTU3gGU&t=934s) **From Driverless Cars to Code‑Free Worlds** - The speaker marvels at autonomous vehicles navigating San Francisco’s streets and envisions AI’s next frontier—using natural language to instantly generate virtual Roblox worlds, eliminating the need for manual coding.
- [00:18:39](https://www.youtube.com/watch?v=BmajUTU3gGU&t=1119s) **Balancing Creative Freedom with AI Constraints** - The speakers discuss encouraging unbounded AI creativity while platform constraints ensure practicality, and a developer emphasizes impact‑driven projects such as a sign‑language detection model.
- [00:21:49](https://www.youtube.com/watch?v=BmajUTU3gGU&t=1309s) **Missteps in Pilot Engineering** - The speaker outlines IBM's Pilot Engineering method—highlighting the need to pinpoint a business opportunity, perform use‑case discovery, and quantify value—and shares a project failure that resulted from ignoring these crucial early steps.
- [00:24:57](https://www.youtube.com/watch?v=BmajUTU3gGU&t=1497s) **Managing Uncertainty in Pilot Projects** - The speaker explains how to design adaptable pilots and use agile, client‑collaborative methods to navigate known and unknown risks, prevent scope creep, and sustain motivation throughout a project.
- [00:27:59](https://www.youtube.com/watch?v=BmajUTU3gGU&t=1679s) **Enhancing Transparency of Patient Journeys** - A speaker outlines an idea to create a data‑driven solution that makes hospital patients and families aware of real‑time care steps, aiming to reduce stress, improve outcomes, and lower costs.
- [00:31:08](https://www.youtube.com/watch?v=BmajUTU3gGU&t=1868s) **Postmortem of a Misaligned Project** - The speaker reflects on months of wasted effort developing a hospital solution without proper stakeholder validation, recognizing the missed checks, cold shoulder, and lessons learned.
- [00:34:20](https://www.youtube.com/watch?v=BmajUTU3gGU&t=2060s) **Fail Fast, Prototype Quickly** - The speaker urges developers to rapidly build and test prototypes, using early feedback to validate ideas and treat setbacks as learning opportunities.

## Full Transcript
0:00 My advice would be to go with the biggest model that you can find and see if it solves the problem. Then, using techniques like distillation or fine-tuning, and there are many others, quantization, see if you can make the model smaller.

0:15 80% of all AI pilots fail. And that's according to the Harvard Business Review. I did my homework, y'all. Four out of five, just dead on arrival. And yet, everywhere we look, it's AI everything. So what is actually going on behind the scenes? How do pilots go from something promising to a pilot graveyard? And if it is working, when and where do you double down on the winners? I am beyond thrilled to ask these questions and many more to my guests today. We'll chat first with Anupam Singh, who is the VP of AI and Growth at Roblox. Anupam, thanks so much for coming on. I'm a huge fan of the game and the company.

0:54 Hello, how are you doing?

0:56 I'm really, really well. Now, Anupam, I'm curious: please tell us a bit about the journey you took to get to your current role.

1:04 I am a two-time founder of two big data companies. The second one went on to be acquired by Cloudera. So if you've ever heard about Hadoop, if you have ever played with open-source big data platforms, there's a lot you can blame me for, in what works and what does not work.

1:26 Well, can you also tell me about some of your experience with building AI pilots, both before Roblox and now at Roblox? I know you mentioned a little bit of it there, but I want to dig deeper.

1:35 Oh, yeah. So AI, you know, used to be called machine learning before everybody started calling it AI. So we've been doing machine learning for a long, long while. Take Roblox, for instance.
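Singh's opening advice is to start with the biggest model and then shrink it via distillation, fine-tuning, or quantization. As a concrete illustration of the last of those, here is a minimal sketch of symmetric 8-bit weight quantization; the weight values are invented for the example, not taken from any real model:

```python
# Minimal symmetric int8 weight quantization sketch: store one byte
# per weight instead of four, at the cost of bounded rounding error.
# The weight list is a hand-made example, not from a real model.

def quantize(weights: list[float]) -> tuple[list[int], float]:
    # Map the largest-magnitude weight onto +/-127 and scale the rest.
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

weights = [0.31, -0.82, 0.05, 1.27, -0.44]
q, scale = quantize(weights)
restored = dequantize(q, scale)

print(q)         # small integers, one byte each
print(restored)  # close to the originals, within rounding error
```

Real schemes (per-channel scales, activation quantization, calibration) are more involved, but the trade is the same one Singh describes: a smaller footprint in exchange for a little precision.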
1:46 We have had a text filter for our safety for years and years, where if you and I were texting each other and, let's say, you decided to use a colorful word, our text filter would hashtag it inline, live. Meaning it's bleeping, but for text. And that has always used machine learning. It is invoked around 4 billion times a day. Imagine every one of your texts going through this text filter and being filtered. That was our biggest machine learning service, before all of AI became the rage in the industry.

2:27 Out of those pilots, then, how many of them failed?

2:30 It's interesting: if you narrow down the problem the way I described it, you are not trying to boil the ocean. We're not trying to do AGI. You simplify the problem statement rather than trying to boil everything. And so some of the failures we have seen came when we tried to do too much in one go, because we got excited about the technology. You know, you must have heard of something called time to first token. This is all the rage in AI; it's the time it takes to get the first token. I have started calling something "time to first demo." In AI, building a demo is easy, but never confuse the demo with actual production.

3:14 Never confuse the demo with actual production. Can you break that down for me a little more?

3:18 So let's say you have video-building software, or you are trying to build photos. One of my AI friends calls them party-trick models. A party-trick model is: you open your phone and you say, "I want dolphins riding bicycles on Ocean Beach in San Francisco." It comes out and it's beautiful. But firstly, it's not useful for anything. It's a proposal. Nothing. Now, if you want to build a real video or a 3D world which has Ocean Beach, which has the fog of San Francisco, which has dolphins, it's a much bigger task.
3:56 So the demo sometimes confuses people. I have seen other companies where people have gone to the board and demoed something, and the board assumes that it is the full product. We try to avoid that. We try to avoid irrational exuberance around AI demos.

4:13 Well, then how widespread is this issue of irrational exuberance?

4:18 Honestly, because so many large models are now so easily available, I think it's become more common, because the model can almost fool you into thinking that you just solved a very complex business use case, when it really hasn't.

4:37 I'm curious about where this usually starts, though, within the company or the organization. Does it start in the C-suite, or more so at the developer level?

4:47 At some level, it's at the developer level. Earlier, if you did AI before the large models, you would have to start with the data. You'd have to figure out which neural network technique you were going to use, and then you'd have to figure out the output. But now, because large models are available off the shelf, and many companies are making them accessible, your first step is much easier. All of it portends well for the future. But just because it is easy to do the first step, people confuse it with the end, with the conclusion of the project.

5:28 Does that make it almost too easy to step into piloting?

5:31 In some ways, yes. It is very easy to articulate a business problem and have a demo ready, and in your mind you're not thinking about cost. You're not thinking about data quality, and you're not thinking about user experience. If you take these three away, the project becomes easy. But if your AI model is about to curse at you, or it is going to give you factually wrong answers, how do you assure that that's not happening?
5:58 And those last 10% of an AI project might be harder than the first 90%.

6:04 So what else are people missing as they start the piloting process?

6:07 The first one, I would say, is that because Roblox is at such a high scale, we have to think about cost. We like to give our features to all of our users and all of our creators, so from day one we have to think about scale. Publicly, we talk about 4 billion hours of engagement. That's 4 billion hours of people talking to each other, people playing with each other, people building content. And we need to think at that scale. So often, people will forget the number of tokens that are actually going to be used. And then, when you multiply it by the cost of running it on the cloud, suddenly your beautiful AI project could cost your company $100 million straight up. And the bills run up the way cloud bills do: they start very innocuously, and then suddenly it's a huge cash outlay for your company.

7:01 Okay, you've identified a couple of huge problems here, Anupam. So I need your help now in identifying some hacks for building something that has staying power. Number one, let's talk about cost.

7:16 On cost, consider whether you really need that large a model. There's a little bit of an ego battle. It's like, oh, you know, John is using a 100-billion-parameter model, but Arvind is using a 10-billion-parameter model; obviously 100 billion is better than 10 billion. Not true. Because if your business problem is constrained enough, if your user experience question is constrained enough, you can actually work very well with a 10-billion-parameter model. What happens next? A 10-billion-parameter model is sometimes 100 times cheaper than, let's say, a 100-billion-parameter model.
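Singh's cost warning, forgetting token volume and then multiplying by cloud prices, is easy to make concrete with back-of-envelope arithmetic. All the numbers below (request volume, tokens per request, per-token prices) are hypothetical placeholders, not Roblox figures or real vendor prices:

```python
# Back-of-envelope LLM serving cost. Every number here is a
# hypothetical placeholder, not an actual price or traffic figure.

def annual_token_cost(requests_per_day: int,
                      tokens_per_request: int,
                      usd_per_million_tokens: float) -> float:
    """Yearly spend for a given request volume and per-token price."""
    tokens_per_year = requests_per_day * tokens_per_request * 365
    return tokens_per_year / 1_000_000 * usd_per_million_tokens

# A feature invoked 4 billion times a day at 100 tokens per call,
# priced at a hypothetical $10 vs $0.10 per million tokens.
big = annual_token_cost(4_000_000_000, 100, usd_per_million_tokens=10.0)
small = annual_token_cost(4_000_000_000, 100, usd_per_million_tokens=0.1)

print(f"large model: ${big:,.0f}/year")
print(f"small model: ${small:,.0f}/year")
```

At this scale, the 100x price gap between a large and a small model is the difference between a line item and a nine-figure bill, which is exactly why Singh urges right-sizing the model before launch.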
7:58 So it's very important to constrain the problem, and then use that constraint to reduce the footprint of your machine learning model.

8:08 I see. Now you've got me in two different mindsets, though, because if I'm going to start small, how can I start to think about scalability?

8:15 When I say start small, I don't mean that you should always start with the smallest model possible. You can start with the biggest model, whether it's open source or your own. Take the biggest model and see if it solves your user's question. Let me give you an example. We started on something called Code Assistant. Millions of creators come to our platform and write code. Of course, having a code assistant is great, but we started with a very big model. We put it in what you would call a pilot and saw that our users, our developers, really liked it. Then, behind the scenes, because we had a gateway sitting in between, we changed the model to a smaller open-source model and tested it. So my advice would be: go with the biggest model that you can find and see if it solves the problem. Then, using techniques like distillation or fine-tuning, and there are many others, quantization, see if you can make the model smaller. That's sort of the first step.

9:22 Gotcha. Okay. So, see if I can make the model smaller. Now let's fast-forward some. I've got my pilot and I'm ready to grow, Anupam. I want to grow, so I need a team. How easy is it going to be for me to find people who have the right skills in the AI space, right now, today?

9:40 That's a brilliant question, and I'll tell you why. Whenever we think about AI, we think about models. Whenever we think about models, we start thinking about people who can train those models.
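The swap Singh describes, piloting with the biggest model and then quietly substituting a smaller one behind a gateway, hinges on callers depending only on a stable interface. A minimal sketch, with hypothetical model names and canned completions standing in for real inference:

```python
# Minimal model-gateway sketch: clients call the gateway, never a
# model directly, so the backing model can be swapped invisibly.
# Model names and canned outputs are hypothetical stand-ins.
from typing import Callable

def big_model(prompt: str) -> str:    # stand-in for a 100B-parameter model
    return f"[100B] completion for: {prompt}"

def small_model(prompt: str) -> str:  # stand-in for a distilled 10B model
    return f"[10B] completion for: {prompt}"

class ModelGateway:
    def __init__(self, backend: Callable[[str], str]):
        self._backend = backend

    def complete(self, prompt: str) -> str:
        return self._backend(prompt)

    def swap_backend(self, backend: Callable[[str], str]) -> None:
        # The swap happens behind the scenes; callers never change.
        self._backend = backend

gateway = ModelGateway(big_model)        # pilot with the biggest model
print(gateway.complete("write a loop"))  # [100B] completion for: write a loop
gateway.swap_backend(small_model)        # then test the smaller model
print(gateway.complete("write a loop"))  # [10B] completion for: write a loop
```

Because clients only ever see `gateway.complete`, the cost and quality of the backend can be tuned, or A/B tested, without any client-side change.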
9:50 But what we have learned here is that the magic starts a little before that, with data. Preparing data for AI is very different from preparing data for business intelligence. If you remember your business intelligence software, what does it do? It creates charts. It creates pie charts and swim lanes and whatnot. But preparing data for AI is different. You have to create vector embeddings. You have to put them into a vector database. We have learned that, right now, some of it is art and some of it is science. And finding the right mix in an engineer who can think about data is sort of the hardest skill in the market.

10:32 I want to give a shout-out, by the way, to everyone who's listening right now, especially the engineers that I'm sure are tuning in to this one. I want you to speak directly to them. Let them know: how can today's engineers start learning these skills?

10:46 If you are a database person, or if you are a data person, start thinking about vector databases. Start thinking about what an embedding looks like. It's not that bad. It's still a table. It's still data. Okay? Data stays data. So that's one way you can get in. The second, of course, is to learn a few model optimization techniques. Not all of us have to go and build GPT. Not all of us have to build these bigger models. If you want to get your feet wet in AI, go in and just learn techniques like RAG, fine-tuning, and quantization, and just play with a model. You don't have to do the training yourself. And the third kind of engineer: we didn't talk about performance as such. I emphasized cost, and I emphasized data quality. But in the end, when you are typing and your AI is taking too much time to respond, you as a user are not going to like it. So what is software about?
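The data preparation Singh highlights, creating vector embeddings and putting them in a vector database, reduces at its core to storing vectors and ranking them by similarity. A toy in-memory version, where the three-dimensional vectors are invented by hand rather than produced by a real embedding model:

```python
# Toy vector store: production systems embed text with a model and
# use an approximate-nearest-neighbor index over millions of rows;
# here the 3-d "embeddings" are invented by hand for illustration.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The "embedding table": still rows of data, as Singh says.
store = {
    "reset your password": [0.9, 0.1, 0.0],
    "refund a purchase":   [0.1, 0.9, 0.2],
    "ship a package":      [0.0, 0.2, 0.9],
}

def nearest(query_vec: list[float], k: int = 2) -> list[str]:
    # Rank stored texts by similarity to the query vector.
    ranked = sorted(store,
                    key=lambda t: cosine_similarity(query_vec, store[t]),
                    reverse=True)
    return ranked[:k]

# A query vector near the "password" region of the space.
print(nearest([0.95, 0.05, 0.0]))
```

The point of his "it's still a table, it's still data" reassurance holds here: the store is an ordinary mapping from text to numbers, and retrieval is just a sort.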
11:51 For many decades now, it's been about performance, and a lot of that performance is still distributed systems engineering. So if you are a distributed systems engineer, you still have a lot to contribute, because in AI inference, all of these older techniques are coming back.

12:09 Since you're here, we obviously have to talk about your work at Roblox. In your view, how's AI shaping the future of gaming?

12:18 Oh, I thought you would never ask. There are a few things that are all active simultaneously, so it's a very exciting time to be involved with Roblox and AI. The first one is, of course, creation itself. You want to create a game. You come to the Roblox platform; we'll help you with texturing, and we will help you with code assistance. Okay, your creation is ready, but now you have to send it out to millions and millions of users, in our case essentially every corner of the planet. We have about 24 data centers at the edge where your creation is going to go, and we have features which help you with rendering. As a creator, you don't want to spend too much time rendering. So that's the creator. Now, let's say you're a user. I don't know about you, but I have a really snazzy avatar on Roblox. It's got a beautiful white coat, and it's got a, you know, weapon and a very nice mohawk. Building your avatar used to be painful, but avatar auto setup on Roblox immediately gets your avatar looking amazing. So creators and users both get enabled. Behind the scenes, though, one of the biggest areas of investment, ever since Roblox was founded, has been safety. Almost half of our AI spend is on safety.
13:56 Whether you are texting a friend or chatting with a friend on Roblox, and whether you can be colorful or not, that's something our AI helps with. But the most exciting one, pertinent to our conversation right now, is voice. You and I are chatting. This is, you know, a professional setting. If I suddenly used a colorful word, I think we should beep it, correct?

14:20 Sure.

14:22 Now, do you want to beep it in post-production, or do you want to beep it right here? And so our voice safety effort is that, depending on the context, we will immediately identify whether a voice interaction is safe or not. That's been one of our most successful AI projects. To your earlier question about pilots, I remember the pilot. It seemed very expensive. The model seemed too big. The data quality seemed not high enough. In a one-year journey, we addressed each one of those things, and now the safety group is the biggest user and consumer of our AI platform.

15:07 I can only imagine the colorful language we would hear if we just stood outside the door during that training.

15:13 We had this great demo in which our vice president of engineering for safety would be talking to our founder and CEO in the town hall, and it would nudge them to be less colorful. So it was a pretty impressive demo.

15:27 Well then, what do you see coming next? As gen AI gets better and pilots do start improving, what's on the horizon?

15:34 Firstly, if you live in San Francisco, hopefully you have seen these driverless cars. It's amazing that, literally in real time, this car is able to reason about what it is seeing and what it should do. And because San Francisco has very narrow and steep streets, sometimes it has to back up to let a garbage truck go by.
15:56 So in a way, I feel excited that the future is already here in certain parts of the world. That's part one. Part two: I think all the AI excitement today is legitimately about text. You know, I type something in, it types something back. But the next horizon is image, which is already here; people are talking about image. Then we are thinking about audio. Then we are thinking about video. And then we might think about the world itself. For us, because we live at the intersection of the real world and the virtual world, building these worlds in the most efficient way possible is a big deal. It takes too much effort for our creators to take what is in their mind and translate it into working code. We want to sit in between and let you become a storyteller on Roblox. Today, you would have to write some code and learn how to build a Roblox world. Imagine you could just say it, and out pops a Roblox game, and you and I are playing it in five minutes. That's the future, for us at least.

17:09 So is there a world post-code, then? Is that what's on the horizon?

17:12 It's possible. The beauty of AI today, apart from what we started with, is that, yes, there are a lot of failures we will encounter, like any other innovation, but every three months an entire set of assumptions gets broken and we see something new. So it's an exciting time to be working in machine learning and AI, for sure.

17:36 Then my final question is this: with everything becoming so much more accessible, with creating pilots becoming more accessible, do you think that's going to lead to more pilot failure in the future, since people can feel so confident about it and just jump in, even without a ton of experience?
17:53 Or do you think that's going to improve the quality of the pilots that we get?

17:56 I'll tell you what we saw here. The sophistication will not be in the pilots. The sophistication will be in AI platforms becoming smart enough to figure out that this is not a good use case, or that the data quality is not good, or the inference is too slow, or it is too expensive. You know how, if you go back to the time when cloud was new, I could just ask my boss for their credit card and put it on the company tab, and that's it, that's my cloud pilot. Today, cloud has an entire set of disciplines, an entire set of processes to manage cloud spend and performance. I think the AI platforms will become sophisticated enough that they will catch these pilots. But I think we should let creativity be unbounded. We should encourage our engineers, our developers, our creators to do a lot of new things with AI, while the platform constrains what's going on.

18:58 Okay, Anupam, thank you so much for your time. Listening to you makes me feel like the future is going to be super duper bright. But today, people putting their blood, sweat, and tears into pilots that never see the light of day still just breaks my heart, as a creator myself. So I've called in my friend, IBM's Nick Renotte. He's going to help me understand his point of view as one of the brilliant minds behind AI builds. Nick, welcome back to the show, man. It's really good to see you.

19:25 Yeah, no, thanks. Thanks for having me.

19:27 Now, look, as someone who's really in the weeds developing individual AI projects every day, let's start here: does it begin like any other creative endeavor? Like just a fun idea?

19:38 The biggest thing that I always think about when I'm designing or coming up with ideas is impact.
19:45 So if I build this, how is it going to be used? Who's going to be able to use it? Where is it going to go? What's the potential? One of the most popular projects that I actually built on YouTube was the sign language detection model. There had been a ton of other object detection models that I'd seen floating around YouTube before, but I thought, hey, this one's actually got some real use. Maybe I won't push this to the absolute nth degree, but if I can give people a start, it'll actually help kick people off. And I had a father reach out to me yesterday who's taking the model that I helped start designing on YouTube, and he's using object detection and embedding it inside a set of glasses for his daughter, who is blind, to help identify obstacles. So I thought that was a really cool evolution of the project that I started off with.

20:43 Geez, how validating that must have been, to know that something you just dreamt up on a whim and started doing is actually being placed in a real-life application. Congratulations on that.

20:56 Yeah, thank you. And, I mean, the funny thing is that there have been so many different offshoots from that project. There have obviously been others, but this one, because it's had that for-the-greater-good feel to it, has really gone a lot further.

21:13 I know clearly that you have a ton of ideas; they must hit you at all different points in time. But when you're trying to narrow those ideas down, and when you really try to plan a pilot, how do you make sure that the pilots you do endeavor to create actually grow up, actually make it far enough to take root?
21:34 Yeah, I'm going to switch speeds here and focus a little bit more on how I do it at work, as opposed to personal projects, because a lot of my personal ones go by the wayside, kind of like the amazing footage that you mentioned at the start; they're always on the to-do list. Whereas when I'm at work, I'm a lot more driven to make sure that these do grow up. So I sit inside an organization at IBM called Client Engineering, and we created a framework called the Pilot Engineering method. One of the first steps when kicking off any pilot is looking for a specific business opportunity, and then doing use case discovery. Those two steps ensure that you're at least going down the right path to begin with. I think we're going to talk about one of my biggest failures a little later on, but the reason that pilot, the project I was working on, failed was because I didn't do that. I didn't identify a valid business opportunity. I didn't do proper use case discovery. And, most importantly, I didn't quantify the business value: how much is it worth to the organization to solve this problem? Because then they're actually going to want to solve that particular problem. I always refer to it as the bleeding-neck problem: if your neck is bleeding, you really want to solve that problem, right? Is what we're trying to solve a bleeding-neck problem? So when it comes to actually going through that process, I think those two steps are critical. And that all starts in the initial phases of the project.
23:07 That means actually doing workshops and working with the client; this stuff can't happen in isolation. It's absolutely critical, when you're looking for that business case and that use case, that you sit down with the people who are going to be using it, not just the project sponsor or the person funding it. Is this going to be valid to the person who's actually going to be using it in their day-to-day life? And from there, they can't just disconnect; they need to be critically engaged at every step of the project. So, once we decide on that, we go into what we call the co-creation phase. And keep in mind, "co" is the really important part: it's not just engineers building stuff off on the side on their machines, hacking away. It's building alongside our clients. Then, once we've gone through that process, along with a number of playbacks, it's really about transitioning this off into production. Because you've probably seen it before: there are a bunch of projects that start off really great and then never really make it out into the world. And that happens a lot. There are a lot of developers building a lot of stuff, and a lot of pilots being spun up. But if you haven't established that business value, looking specifically at a valid use case, then there's not going to be a bleeding-neck need to actually get it into production, or actually use it inside the organization. So we bake those steps, the workshops, the builds, the transition, into each one of our projects and pilots to make sure that they do grow up.
24:42 When you mentioned the "co" part, Nick, that to me really seems to sum up everything you shared there. Just listening, I'm like, wow, it would be so much fun to brainstorm and come up with different things. But at the same time, even though that is such a thrill, it also places so much responsibility on just you, as one person, to stay personally motivated and push it through. When you're working with a client, you can't just shrug it off because you've become bored of the project. So, since you can't plan for everything, how do you design your pilots so that they can adapt to what you can't know? At the same time, you can't know what you need to build into them, so I feel like we're stuck in a circle. How do you fix it?

25:29 There is that loop, right? There are the known knowns and the known unknowns, but then there are always going to be the unknown unknowns. This comes with every project. It's not just AI projects that have instances of scope creep, where something creeps in and it's like, oh, this is an absolutely critical thing that we didn't mention at the start and now need to handle. Part of good project management hedges against that. Agile project management makes sure that we're prioritizing building the stuff that is absolutely critical to ensuring this particular pilot succeeds. A lot of the risk associated with that can be hedged, or mitigated, by doing that use case discovery and working alongside the client.
26:13 Because really quickly, if the client's testing as you're building, they'll probably point out: oh, we kind of have that filter there, or I need that filter there, or we can't show that data because it's super sensitive and this person shouldn't have access to it. So co-creating alleviates a lot of that concern, and so does doing playbacks. Whenever I do pilots, I try to make sure that anybody who is going to have a potential touchpoint, or is going to be a decision maker for this potential solution going live, is in those playbacks. They need to be seeing what we're building, because you never know how one particular workflow might impact somebody else until that somebody else is there in the room. So making sure those happen proactively, not reactively, is critical as well.

27:01 Well, look, it's funny, because hearing you share all of that, as a listener I'm like, well, it sounds like Nick has it all down. He's got it all together. So how could anything possibly go wrong for you?

27:15 No, not at all. What is it, like Bear Grylls: adapt, overcome, something else...

27:18 Right, right. But I know that that's not true, because you teased us a little earlier by telling us that there was a failure. So I'm going to push on the wound a little bit here and ask if you can tell us some more about that failed pilot you worked on.

27:35 Yeah, yeah. Thankfully, this wasn't a pilot at IBM, so there's no brand damage likely to occur as a result of it. A while ago, I was just coming out of university. I was doing my master's, and as part of that we had a startup incubator. Our startup got picked to go into the incubator and build it out.
27:59 The whole premise was that we were going to build an application, a solution, to help increase the transparency of patient journeys through a hospital. One of the big factors when you're inside a hospital is that you don't really know what's happening, and you don't know what's going to happen next. That can really increase the stress and concern while you're getting treatment.

28:24 I thought this was a massive problem because my girlfriend had just been through hospital. She was going through surgery, and it was an absolute nightmare. We went into the hospital ward one time and they said, who are you? Why are you in this bed? And I thought, as if it's not enough that we don't know what's going to happen to us at that point in time, they don't even know who we are. Does anyone know what's happening here? So I thought, this is a great idea. We can help improve transparency in hospitals and improve patient experience, and that has a knock-on effect: there's a bunch of research showing that improving patient experience reduces the likelihood of readmissions, improves recovery rates, and reduces a whole ton of costs for the hospital in providing care for that patient. So I'm like, okay, let's build a solution for this. I can do data science, I can build machine learning models, this is awesome, we're going to bake it all in. So we build for about three months, and we just keep building. We actually got a hospital on board to help us out, or potentially to take on the solution.
29:33 We have a meeting where we say we're thinking about building this, and it's, yep, great, cool, awesome, build it. So heads down, we build, just coding. I'm working my day job Monday to Friday, then going to the incubator from about 6 to 9 p.m. Monday to Friday, and then working Saturdays and Sundays, building this up with my co-founder. Build it, build it, build it, for three months. Then I say, we really should be showing this back to the client, to the hospital: hey, what do you think about this?

30:05 At that time we had a startup advisor; he was the one who got us the contact with the hospital. And he says, at this point you've built more than an MVP. You really should be looking to get some sort of funding, or to get the client to start paying for licenses, even if they're just token licenses or lifetime licenses, because they're your foundation client. I'm like, okay, great. We've got a minimum viable product; it's enough to charge for. So we go to the client: hey, we've built it, we can hook in, we can do all of this. We went through a ton of regulation to make sure we could. We're going to give you unlimited licenses for the entire lifetime, and it's going to cost you X dollars.

30:51 And the client says, yeah, but... I don't know if we really need it now. And I'm like, what? What do you mean? I still didn't have much experience with selling, with building startups, with that whole process, and I just sat there in that room thinking, I don't even know how to interpret that. You don't need it, and you don't need it now?

31:17 Eventually I gathered myself and asked, okay, when do you think you would need it? And they're like, maybe you should go back to the drawing board and review it. Eventually I recognized I was getting the cold shoulder, and I thought, we just wasted so much time. Three, four months of our lives building this, on top of the work we did in the incubator before the hospital. It was exhausting. At that point I felt there was no value. There were opportunities and lessons learned, but in the moment... I can still see it. I was sitting in a demountable room, because the hospital was going through more construction, thinking, oh man, I can't believe we didn't validate this way more. I can't believe we didn't sit down with them and triple-check that this was what they wanted, and what they were going to be able to pay for. I could have mitigated so much of this well ahead of time, instead of waiting until the end to present it. But you live and you learn, right?

32:25 That hurts. And you just painted that room for us; you dragged us back into your trauma, Nick. Thanks for that. I'm assuming the reason you can remember that one so well is because there are quite a few pilots you've worked on that did succeed, right? So I know you've got a bunch of those.
32:42 What made those pilots different from this one? Because from where I'm sitting, it sounds as though you were able to identify a commitment up front with everything else you worked on. Was that one of the first things you did?

32:58 Yeah, definitely. When we spin up pilots now, there's a whole process. It's not just, hey, you want to do a pilot? Cool, let's go do a pilot and start building. We actually go through ways of working. We set up a document of understanding: what we're going to deliver, and what the commitment from the client is. What are they going to be able to do? Do they have sufficient time to test this? Because again, it's co-creation, not just IBM creating; we're trying to build things that help their organizations. Close engagement with the client is absolutely critical. If I look back on my startup, we got data from the hospital, but it was maybe 1% of the data we needed. So ensuring you have the right data available, the right data prepared, and the right data collected is absolutely critical. And I'd say a lot of the pilots we do now are really successful because of the people. We have the right people in the right room.
33:51 And that's not just IBMers; it's also the client bringing together the right people on their side who are going to push this forward and make it successful. Because ultimately, every pilot ends with some sort of presentation to an executive, a funder, or an executive sponsor. Having the right people in the room ensures that when we go and make that pitch, they know it's not just something some random group went off and built. It's been a collective effort: we've scoped this out properly, we've found a problem that needs to be solved, and we've made sure it will work and deliver the business value that's needed.

34:32 I hope there are a bunch of developers listening to you right now. Can you give them one word of advice as they think about whatever pilot they're dreaming of? At least one word?

34:46 My personal favorite is fail fast. Build a prototype as quickly as you humanly can and validate it. Go show it to the person you think is potentially going to use this or buy this, because that will really quickly validate or invalidate what you're building. That's probably a big one for startups. If we had, way earlier, instead of getting that far into the MVP, asked, hey, will you buy this right now? Is this exactly what your organization needs? We'd have had a qualifying or disqualifying answer super early on.
35:28 So failure is not really failure, right? It's the lesson learned.

35:32 Yeah. It sounds like you fell in love with your pilot, Nick.

35:35 Oh man, 100%, I was committed. I was like, this is so cool. It's going to improve people's lives. It's going to make hospitals way better. Not that hospitals aren't great anyway, but it's world changing. And look, I'm still kind of invested in that idea. I'd probably need to go and do a ton more research and approach it from a different angle. But there are so many other problems I can solve where I've got deep expertise. I'm not a doctor; I don't know the first thing about patient treatment. But I was invested in it. I wanted to solve it.

36:06 Look, you clearly have a ton of free time, so I'm sure you'll pick this one up in your spare time, Nick.

36:13 Yeah, I'll do it between sleeping and working, right?

36:18 Well, with all that said, thank you for squeezing us into your time. I really do appreciate you coming on and enlightening us; it's always a true joy to speak with you. And thank you again to Anupam. This has been a fantastic episode, and that's it for today's episode. We appreciate you all for listening, as always, and we will see you back here very, very soon, I promise.