
OpenAI Dev Day: Builder Era Begins

Key Points

  • OpenAI’s recent “Dev Day” rollout wasn’t about new consumer features but a suite of developer tools—including an Apps SDK and a nascent app‑store model—designed to make ChatGPT the core compute platform for third‑party services.
  • By rewarding “token‑heavy” users with plaques, OpenAI signaled its strategy to shift computing from bits‑and‑bytes to tokens, positioning itself as the future infrastructure provider for AI‑driven applications.
  • This launch marks the “builder stage” of AI, meaning 2025‑2026 is a prime window for developers to create and monetize AI products, while the broader market‑ready apps that will make AI feel “real” for end users are still emerging.
  • For AI leaders and adopters, the key takeaway is to evaluate partnerships based on who supplies the underlying token‑compute platform rather than just brand dominance, as the ecosystem’s success will hinge on the robustness of the developer‑first infrastructure.

Full Transcript

# OpenAI Dev Day: Builder Era Begins

**Source:** [https://www.youtube.com/watch?v=prODjJ9oQyM](https://www.youtube.com/watch?v=prODjJ9oQyM)
**Duration:** 00:21:01

## Sections

- [00:00:00](https://www.youtube.com/watch?v=prODjJ9oQyM&t=0s) **Untitled Section**
- [00:03:54](https://www.youtube.com/watch?v=prODjJ9oQyM&t=234s) **ChatGPT App Boom, Not a Lockdown** - The speaker contends that although the ChatGPT ecosystem creates massive builder opportunities akin to the iPhone era, fierce competition among multiple models and startups means no single firm has secured market dominance.
- [00:07:42](https://www.youtube.com/watch?v=prODjJ9oQyM&t=462s) **Navigating Multimodel Platform Lock-In** - The speaker debates whether developers should embrace a single-vendor AI builder like OpenAI's agent platform or pursue a flexible, multimodel strategy, highlighting Azure's open-model stance as a contrasting approach.
- [00:11:32](https://www.youtube.com/watch?v=prODjJ9oQyM&t=692s) **Google Gemini Undercuts OpenAI Pricing** - The speaker argues that Google's inexpensive, TPU-powered Gemini creates a price floor that stops OpenAI from charging premiums, prompting enterprises to cut token usage through cheaper models and prompt engineering to lower their AI cloud bills.
- [00:14:41](https://www.youtube.com/watch?v=prODjJ9oQyM&t=881s) **Future Scenarios for AI Model Platforms** - The speaker outlines three possible market outcomes: a dominant OpenAI layer above commoditized models, a fragmented ecosystem where developers directly choose among various models, and a hybrid enterprise-collaboration scenario.
- [00:18:07](https://www.youtube.com/watch?v=prODjJ9oQyM&t=1087s) **Betting on AI Futures & Builder Advice** - The speaker assigns roughly 45% to industry fragmentation, 25-30% to OpenAI becoming the next “AWS,” and the remaining 20% to integration scenarios, then encourages developers to aggressively build across all AI platforms to capitalize on the current boom.

## Full Transcript
OpenAI, the makers of ChatGPT, launched a ton of new products. And it wasn't ChatGPT. It was everything else: everything for developers to build the ecosystem of AI. I want to get into why we should all care about that, because the public narrative is really clear: OpenAI is dominating everything, they're building everything, and who else can compete with them? But it's a lot more complicated than that. I want to get into where you have an opportunity if you're building in the AI space. I want to get into what you should take away if you're in leadership and you're looking at which AI company to go with. And I want to get into what we, the people who are just interested in AI and trying to figure out how to use AI in our careers, should take away from this.

The first thing to think about is that this wasn't aimed at all of us. This was aimed at developers. And that gives you a clue as to where AI is today: AI is at the builder stage. I keep emphasizing this in my Substack. If you are a builder, this is your year. 2026 is also your year. This is a moment when the applications that will make AI feel real for everybody don't yet fully exist.

And I know what you're saying. You're saying, "Nate, the irony. We are talking about Dev Day, and look what OpenAI launched. This is the application of the future." I've seen those newspaper articles, because what they launched is effectively a play on Apple's App Store. Apple famously launched a hardware device that became a platform. Then they launched the App Store, and the App Store became how they monetize. It took a Supreme Court case to change that, funny enough. But OpenAI wants the same play.
They want to say: we have, as they announced, 800 million weekly active users. We have so many people. We are going to launch an app store within our platform, and everybody will use it. And that's what they announced. It's called the Apps SDK (software development kit). Third-party apps can now integrate directly into ChatGPT. Spotify, Calendly, think QuickBooks, think Zoom. All of these apps can now have a home inside ChatGPT.

So what does that mean? It means ChatGPT wants to be the computing layer for the future. That's their basic thesis. The idea is that for 70 years, or however long we've had computers, we have computed in bits and bytes, and now we compute in tokens. So if we compute in tokens, why not run all those tokens through OpenAI? I don't think it was a coincidence that they literally gave out highly visible awards to the people who spent the most tokens with them. I saw some of the plaques online: 10 billion tokens, a trillion tokens. They put the names up on the stage there with Sam Altman. They want to tell you two things. One, they have all of the chips to serve those tokens. And two, the future is about computing with tokens, and the future is with them.

But the thing is, it's not as clear as they made it sound from the stage. I'm going to give you a couple of examples, and I want you to start to wrestle with this. Whatever your role in the AI ecosystem, the questions underneath the PR are what matter the most. So, question number one: is the future as drag-and-drop as they are suggesting? Is the future as app-within-an-app as they are suggesting? Those were two of the major plays they made. I talked about the Apps SDK.
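To make the app-within-an-app idea concrete, here is a minimal, purely hypothetical sketch of the pattern: a host chat platform exposes a registry, and third-party "apps" plug handlers into it. Every name here (`AppRegistry`, `register`, `invoke`) is an illustrative assumption, not the actual Apps SDK API.

```python
# Hypothetical sketch of the "third-party app inside a chat host" pattern.
# None of these names come from OpenAI's Apps SDK; they only illustrate
# the idea of external apps registering capabilities with a platform.

from typing import Callable, Dict

class AppRegistry:
    """Host-side registry that third-party apps plug into."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        # A real SDK would also validate manifests, auth, and scopes.
        self._tools[name] = handler

    def invoke(self, name: str, query: str) -> str:
        # The host routes a user request to the installed app's handler.
        if name not in self._tools:
            raise KeyError(f"no app named {name!r} is installed")
        return self._tools[name](query)

# A "third-party app" is just a handler the host can call.
registry = AppRegistry()
registry.register("playlist", lambda q: f"playlist for: {q}")

print(registry.invoke("playlist", "rainy day jazz"))  # playlist for: rainy day jazz
```

The design point, which the rest of this section argues about, is that whoever owns the registry owns the distribution channel.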
This idea that you would operate everything in this little app window inside ChatGPT, or that the future is building with agents. I talked about that, where you can actually drag and drop and put agents into a linear flow. Now, don't get me wrong, there's a lot of functionality there. Just like in the iPhone moment, there are going to be builders who turn into millionaires overnight because they take advantage of the app store moment for ChatGPT, because they take advantage of being able to build agents with ChatGPT when everybody else is still trying to figure it out. So those opportunities are there. But that is different from saying that they have achieved a lock on the market, and I don't think they have.

The reason why is actually pretty simple: developers like the current world, where we have multiple models competing, often viciously, to offer cheaper and cheaper prices for tokens, competing to deliver better and better experiences. Cursor, Claude, Codex, competing back and forth. It is great for developers. Developers have never been so catered to. And if you're getting into building and you are using vibe coding to build, or "vibe engineering" as I heard it, great. You also have never been so catered to. We have billion-dollar startups competing for your attention and for your dollar. In that world, do you really want to lock in with OpenAI? That is the question, and that is why this is not the same as the iPhone moment.

Think back to 2007. Well, I have the gray hairs; I'm thinking back to 2007. The iPhone was the first truly interactive multimedia phone that had product-market fit. Before that, yes, there were phones with screens. Yes, there were some phones with apps. Yes, there was BlackBerry.
But none of them had an intuitive, clean platform that anybody could plug things into and just build. And that is what the iPhone did with the App Store. I remember one of the first successful apps was the beer leveler, where you would just tilt your iPhone screen. I swear this sounds like the dumbest thing now, but it was all the rage, because we didn't know you had the accelerometer, or whatever it was that determined the level. You would just sit there and play with it and watch the simulated pixels dash across the screen as the beer stayed level no matter how you turned the phone. This is what passed for entertainment in 2007.

But we don't have that world anymore. We have a world where Claude has the MCP server ecosystem and has open-sourced it, and everybody loves it, and Google is using it, and OpenAI is also using it. Anthropic is excellent at making Excel files, at making PowerPoints; they're going after work primitives, and they have Claude Code. Google is out there with really aggressive pricing per token, so they are going after anyone who cares about price sensitivity for intelligence, and they have an immense hardware stack and very strong expertise to back it up and continue to invest. There are other players too. We'll see where Meta goes. We'll see where Grok goes. It is not a world where there is just an iPhone. Imagine a world where there were five different iPhones. And that matters, because part of what made the App Store successful was that it was the only game in town. That's not true for OpenAI. It is not the only game in town.

Now, I will say, anyone at OpenAI is going to come back with a very reasonable response: nobody else has 800 million users.
Well, true, but Google has, I don't know, 400 and change with Gemini, and growing fast. And Anthropic arguably has the super-premium segment that the iPhone traditionally represented, because people are choosing to lean in and pay for it, which is part of what Anthropic was leaning into in their marketing this week. Put on your thinking caps: they had pop-up cafes in New York and San Francisco. People waited in line. They got a thinking cap. And the clear implication was, if you're a thinking person, you're picking Anthropic. So I think the book has not yet been written on where all of this is going to end up. But developers and builders have an unprecedented set of choices and are not super excited about a world where they lock in.

One of the things that n8n, the agent builder that is arguably most affected by OpenAI's launch of Agent Builder yesterday or the day before, can argue is that it offers pre-built connectors for a wide range of apps, and it's model-agnostic. You can bring in Claude, you can bring in OpenAI; they don't care. Well, OpenAI isn't model-agnostic, is it? You're bringing in OpenAI's models. That's what they want. That's why they built it. And so the question I have is: in a world that is this multimodel, are we really going to get excited about moving to a platform that is trying to lock us in as builders?

Now, let's say you're sitting in the C-suite. The question for you is increasingly: do you listen to the developer side of the house and leave room for your technical teams to experiment across a range of models and options, or do you get into a lock-in relationship with OpenAI because you trust the brand? I have literally sat and had conversations with leaders wrestling with that.
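The model-agnostic point is easy to sketch. Below is a minimal, hypothetical routing layer of the kind a model-agnostic tool implies: application code talks to one interface, and which vendor serves the call is pure configuration. The provider names and stub functions are assumptions for illustration, not any vendor's real API.

```python
# Minimal sketch of a model-agnostic routing layer: the app calls one
# interface; the concrete model behind it is a configuration choice.
# The stub "providers" stand in for real vendor SDK calls.

from typing import Callable, Dict

def _openai_stub(prompt: str) -> str:
    return f"[openai] {prompt}"

def _claude_stub(prompt: str) -> str:
    return f"[claude] {prompt}"

def _gemini_stub(prompt: str) -> str:
    return f"[gemini] {prompt}"

PROVIDERS: Dict[str, Callable[[str], str]] = {
    "openai": _openai_stub,
    "claude": _claude_stub,
    "gemini": _gemini_stub,
}

def complete(prompt: str, model: str = "openai") -> str:
    """Route a prompt to whichever provider is configured."""
    try:
        return PROVIDERS[model](prompt)
    except KeyError:
        raise ValueError(f"unknown model {model!r}") from None

# Swapping vendors is a one-word change, which is the whole
# anti-lock-in argument in miniature.
print(complete("summarize this doc", model="gemini"))  # [gemini] summarize this doc
```

A single-vendor platform collapses `PROVIDERS` to one entry, which is exactly the lock-in the speaker is questioning.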
And I think in light of that, it is super interesting to see Azure's strategy with multimodel, because Azure has very deliberately, even though they have an investment in OpenAI, a big one, decided to lean in on being multimodel. If you want Grok in Azure, you can get it. If you want Claude, you can get it. If you want Gemini, heck, you can get that too. Azure is not going to pick sides, and that's very much Satya's platform play. Satya Nadella, CEO of Microsoft, is going to sit there and say: as long as you're getting Azure cloud, I don't care. You can get whatever you want. And that feels much more freeing if you're a developer, because the value is in being able to flexibly choose.

Now, I'm going to come back and ask why. We could say that the choice itself is the reason, but there's another reason too. The pace of development in this ecosystem is so fast that it seems irrational to most players to lock themselves in to a single vendor, and that is what OpenAI is inviting. I think that may generate some brand problems for them, because I don't think developers want to be locked in. I don't think vibe coders want to be locked in. And if asked, I doubt consumers do either. I think ChatGPT is winning partly because it is the Kleenex of AI: it is a brand name that has become synonymous with generative AI, and that is part of how that long tail of free users is using ChatGPT and not another tool.

So, another question I want to get to. We've talked through the whole developer-builder thing, fine. The other question underneath this larger story is a question around liquidity of tokens.
So, for right now, there are three big games in town if you're developing or building. And OpenAI was out there praising folks who spent a lot of tokens. But what's interesting is that some people whose names were on that billboard, saying "I spent a trillion tokens" or 100 billion tokens or whatever it is, were saying publicly on X afterward: I don't want to be on that billboard. I want to spend fewer tokens. My job is actually to compute less, so I am more efficient, so I drive better margins for my business. I don't want to be on the board.

Well, that's kind of the difficulty of being in the model-maker position, isn't it? You are in a position where you're offering intelligence, but people know it's metered by the token. They know they're charged by the token. They don't want to pay more than they have to pay. And in that world, again, competition is helpful, because you compete on price. This is where I lean really into the Google Gemini side. Google having TPUs in their stack helps them compete aggressively on price. And when they can compete aggressively on price, they give everyone else in the business a price floor that they have to be honest about. So OpenAI cannot charge a premium for their models beyond a certain point, because people will just switch down to Gemini.

And I think that, in that sense, what we're really talking about here is a world where OpenAI wants you to be incentivized to burn tokens. Model makers in general want that. They celebrate that. That was part of Dev Day; they talk about their token burn rate. But any given individual in the game wants to spend fewer tokens. Every CTO I know, every CIO I've talked to, they don't want to spend more on tokens.
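The per-token billing math behind this behavior is simple. The sketch below estimates a monthly bill from token volume and a per-million-token price, and shows why the two levers the transcript mentions, a cheaper model and compressed calls, multiply. The prices and volumes are invented placeholders, not any vendor's actual rates.

```python
# Back-of-the-envelope token bill: cost = tokens / 1M * price-per-1M.
# The prices here are invented placeholders, not real vendor pricing.

def monthly_cost(tokens: int, price_per_million: float) -> float:
    """Estimated spend for a month of token usage, in dollars."""
    return tokens / 1_000_000 * price_per_million

baseline = monthly_cost(2_000_000_000, price_per_million=10.0)  # premium model
cheaper = monthly_cost(2_000_000_000, price_per_million=2.0)    # budget model

# Prompt compression cuts token volume itself, so the levers multiply:
# a 5x cheaper model times 2x fewer tokens is a 10x smaller bill.
compressed = monthly_cost(1_000_000_000, price_per_million=2.0)

print(f"baseline:   ${baseline:,.0f}")    # $20,000
print(f"cheaper:    ${cheaper:,.0f}")     # $4,000
print(f"compressed: ${compressed:,.0f}")  # $2,000
```

This is why, as the transcript notes, a buyer's incentive always points toward fewer tokens even while model makers celebrate burn.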
They want to spend less. It's like a cloud bill: you want to reduce it. And if you want to reduce that bill, you're going to go with the cheapest token per unit of intelligence available, and you're also going to see if you can compress your calls and do all the other prompt-engineering work to make it effective, cheap, and efficient. And that also undercuts OpenAI's play here, doesn't it? Because they want to send out the message that you're just going to be spending more and more tokens with them, because the future of intelligence is compute. But at the same time, people are desperately and publicly trying to cut how much they're spending. Now, am I here to say that I think that undercuts the demand story for AI? No, because there's so much business demand growing. I don't think that's what's going on. I think it's more about being efficient with your resources, and recognizing that a company that is publicly celebrating token burn is a company that may not always have incentives aligned with yours.

So, there are three scenarios for how this plays out. I've talked about some of the hidden stories here off of Dev Day, and some of what they released. I want to give you three ways this plays out.

Scenario one: OpenAI wins and becomes the AWS of AI. Developers effectively accept managed orchestration over API access, and they just work with OpenAI. This may be because OpenAI has a successful CTO play and they just get all these enterprise deals, and that's how it works. Or it may be that they release enough features and are price-competitive enough that they keep developer loyalty. We'll see.
In that world, stuff like the Apps SDK just becomes standard infrastructure. Developers find it easy to build with, and they build a lot with it. This in turn reinforces the habit loop with consumers: consumers start to spend more time there, the attention becomes more valuable, the spend from consumers becomes more valuable, and it becomes a virtuous feedback loop. This is what OpenAI wants. That's the future they're hoping for. OpenAI can then capture platform margins, not just the margins you get from inference. Because one of the underlying things here is that if you have to live in a world where models are cheaper and cheaper, you want to not just be living and dying on the cost of inference compute. You want to monetize the platform with an unfair economic advantage. And that's exactly the long-term play they're going for. I'm not saying that they have the competitive advantage to actually do it. I think it's a bet, and I don't know if it will work. So models are going to commoditize, but OpenAI basically stays a layer above the commoditization of models in that world: it doesn't matter, ultimately, if you are talking to multiple models, because your compute layer, your commodity layer, where the developers are, where the building is, would be OpenAI. That's scenario one. I don't know that it is particularly likely, given the things I've talked about.

Scenario two: fragmentation wins. Developers resist platform intermediation, so they go straight to the models and compute against them. They want their pick of models. Vibe coders and builders want their pick of options. AI enthusiasts want to be able to pick Claude as well as OpenAI.
And in that world, there will not be as much developer activity. From a consumer perspective, there will not be as many apps in ChatGPT, and OpenAI will remain perhaps the largest player in the game, but they may not achieve what we would call platform economics, where they can charge disproportionate cash for their competitive advantage, because it's not that dominant. And so the market looks sort of like the database market in that world: there are many winners, and there's not one dominant platform that owns everybody.

Scenario three is a little bit more creative. Scenario three is a world where there's some kind of enterprise team-up apart from OpenAI that leaves OpenAI with the consumer market but leaves them out of the very lucrative enterprise and developer market. That could look a few different ways. It could look like Anthropic and Amazon teaming up. It could look like Google and Amazon teaming up. There are some plays where Apple teams up with Anthropic. If you notice, everybody is trying to team up with Anthropic; that is a theme. So we'll have to see. I think this is the least likely scenario. If fragmentation is the most likely, I think this one requires some very complex mergers, acquisitions, and corporate alliances that would have to be delicately negotiated. It would be a world where you have Anthropic on somebody's cloud, along perhaps with Gemini, and enterprises get to choose best of breed. I think this is the world that Microsoft wants to create. Essentially, an integration world is a world where the platform economics remain with cloud providers, and Azure wants to be in place to pick up those dollars, and so does Google Cloud.
And so one of the things that comes to mind for me as I think this through is that Jassy at Amazon, the CEO of Amazon, was asked why Google Cloud and Azure are growing faster than AWS right now. And he let slip that he thinks it's because of AI. And of course, Amazon kind of took a bath in the markets as a result, but it's kind of true. Google Cloud and Azure are basically trying to run this integration play, where they can pull all the models together, enterprises can choose best of breed, and no one model maker gets to own the relationship directly with the enterprise. The cloud provider owns the relationship with the enterprise. So we will have to see. I think that one's somewhat less likely, because I think the power of these model makers continues to grow, shift, and evolve as the models become more and more capable, and model makers are becoming savvy enough that they're offering some things directly to enterprises through direct deals. So it's a complicated relationship, and I think the number of stars that have to align for an integration play is a little bit higher.

So if I had to be a betting man and handicap this, I would say fragmentation winning is maybe not a coin flip, but close to a coin flip; call it 45%. OpenAI becoming sort of the AWS of AI, their bet today or yesterday or this week winning, I handicap at 25 to 30%; maybe a third of a chance roughly, maybe slightly less. And then the remainder would go to that integration play; call it maybe 20% and change. Those are the three scenarios. I think that's what we're looking at.

Let's do takeaways to close this out.
If you are a builder, your takeaway is to build aggressively into all of the spaces you can. If you can build with OpenAI's new app store and it takes off, you're in a position to really do well. If you can build with agents, you're in a position to do really well. If you can build with Claude Code, and that's cheaper and more effective and you can get more done, you're in a position to do well. It's a builder's paradise right now.

I will then extend that to anyone who is in the AI-enthusiast category who's listening to this. If you're passionate about AI, which I think most of you probably are if you're listening to this, especially if you're listening to the end of this video, which is a long way down, now is your chance to differentiate yourself from the competition. And by the way, your competition is not some hotshot 22-year-old developer in Palo Alto. Your competition is the non-technical folks. So you may feel like you don't have the technical skills, but your ability to persist through and, say, build an agent with Agent Builder in OpenAI, or maybe build it with n8n if you don't want to get locked into OpenAI's standard, is already worlds better than most folks who are dealing with AI right now. I just did a video on AI fluency. Being able to do that puts you a whole lot further along than where most folks are. So this is also your chance to be a builder. Even if you think you're not as good as the coders, don't worry about it. You are doing great exactly where you are.

If you are in the C-suite, you should be very carefully thinking about the pros and cons of the model investments you make. You should be, I think, prudently planning for a multimodel world.
You should not assume, despite all of the noise of Dev Day, that Dev Day means you should bet on OpenAI. I think OpenAI is absolutely a player. I would argue it's probably the biggest player at the table. I send people to look at the new Responses API all the time; I think it's great. But I don't think it's the only game in town. And I think that any prudent CTO, any prudent CEO, should be leaving options open given the pace of this AI race.

So there you go. Most people are talking about Dev Day like it's a tour de force victory for OpenAI. I think there's more to the story, and I hope you got a little bit more of it here. If you want to dig in further, I have a whole lot more in the article, including a prompt for reading through it and thinking it through in your own way. Go have fun.