
AI-Powered Browsers with Ben Goodger

Key Points

  • Ben Goodger, a veteran of Netscape, Mozilla, and Google Chrome, now leads engineering for OpenAI’s AI‑powered Atlas browser.
  • Atlas is designed to look like a familiar traditional browser while embedding ChatGPT‑style assistance at its core, making the web experience more intuitive and intelligent.
  • Over the past 18 months, Ben and his team have clarified that the product should blend advanced AI capabilities with user‑friendly design, focusing on security and practical workflow improvements.
  • The recent Mac release of Atlas marks the first public step toward “AI‑first” browsing, showcasing how conversational AI can transform how people search, interact, and get work done online.
  • The conversation also touched on broader implications for the future of browsers, including new security considerations and the evolving nature of digital work in an AI‑enhanced environment.

Source: https://www.youtube.com/watch?v=8tfqsGDCCb4
Duration: 00:24:45

Sections

  • 00:00:00 AI-Powered Browsers with Ben Goodger. Nate interviews OpenAI's Ben Goodger, former Netscape and Chrome engineer, about the history of web browsers and the future of AI-driven browsing technology.
  • 00:04:43 Rapid Code Understanding and Prototyping. How tools like Codex and Atlas help engineers quickly grasp large codebases, prototype ideas to assess viability, and implement features directly, exemplified by a conversational review of a GitHub repository.
  • 00:08:08 Agentic Chromium Architecture for Speed. The approach to building a browser that blends a familiar UI with agent-driven features by running Chromium as an out-of-process service, for rapid startup and secure input synthesis.
  • 00:11:41 Surprising Post-Launch Agent Use Cases. Unexpected personal and work applications of the new agent, including automated online-shopping comparisons and rapid generation of Google Forms.
  • 00:15:08 From Boxes to Conversational Interfaces. How the early web turned software delivery from physical boxes into instant clicks, and how LLMs simplify interaction further by letting users state what they want in natural language.
  • 00:18:32 Role Blurring: Engineers as Product Designers. Using LLMs to synthesize information across tabs, and how the team's engineers now act as product engineers, owning feature design, user research, and feedback analysis.
  • 00:22:47 Atlas Launch: Voice and Chat Memory. Integrating voice interaction and persistent chat memory into Atlas to make browsing more conversational and personalized.

Full Transcript

This is a good one, guys. I got to sit down with OpenAI's Ben Goodger, the lead engineer building the Atlas browser for OpenAI. We had a really wide-ranging conversation. We talked about the history of browsers. Ben has been involved in building Chrome, and in building Netscape Navigator for those of you with gray hairs, and now he's taking the lead on building the AI-powered browsers of the future at OpenAI. I had a lot of fun. We talked about what the future looks like, about security implications, and about the way we get work done and how that's changing. I put a cut up here on YouTube, and if you want the full cut you can go over to Substack. Enjoy, and let's talk about AI-powered browsers.

Hey all, I'm Nate, and I have a special guest with me today. Ben, why don't you introduce yourself?

>> Hi, I'm Ben, and I'm the head of engineering for ChatGPT Atlas here at OpenAI.

>> And how did you get to be head of engineering for ChatGPT Atlas?

>> I've always been interested in the web and web browsers, going back to the mid-to-late 1990s when the web was first developing. I was a hobbyist web developer building sites just for fun. Early in my career I got involved with the Mozilla open source project, made contributions to it, and ended up getting hired by Netscape. I grew up in New Zealand and moved to Silicon Valley for a period of time. I got to see Netscape in its final days, which was an interesting experience, but I ended up moving on to Mozilla, where I helped build the first version of Firefox. Then I moved to Google, and I was at Google for nearly 20 years.
I helped build the Chrome browser there. Then, almost 18 months ago, I came over to OpenAI. I was very interested in exploring what the web would look like if you see it through the eyes of having an assistant like ChatGPT really at the core of the browsing experience. Since then we've been trying to build that, and we shipped our Atlas product for Mac a couple of weeks ago, which is really exciting.

>> I'm really curious to hear more. You've shared a bit about what made it compelling for you to come to OpenAI, the piece about having intelligence on the web. What is it that you learned, or that crystallized for you, along that journey over the last 18 months that gave you a sense of clarity about what you wanted Atlas to be and where you wanted to take it?

>> I think that the Atlas product, at face value, kind of resembles a traditional browser. That's important, because everyone knows what a browser is; everyone uses a browser on a pretty frequent basis. So there was an aspect of: we need to build something that people can understand, even as we try to bring really advanced capabilities into it. The other thing I've learned is that the pace of development of the tech happening here at OpenAI is just so incredibly fast that even limitations you see one month won't be there the next. As we've built features like agent, we've seen it come together, get much faster, and get much more accurate at clicking on things and doing stuff for you on the web.
So I've learned to be even more optimistic. I'm an optimistic person by nature, but I'm even more optimistic now about where this tech is going and what types of product experiences it will enable.

>> What is it about your working style, or the team's working style, that has shifted over the last 18 months as you've been building this browser?

>> When I joined, I was the team manager, but as the first person here working on this I was also writing code and doing prototypes. Early on I used ChatGPT extensively to help me learn new programming languages and really get up and running again. Then more and more engineers joined the team, and we also launched products like Codex. We've gone through the evolution of Codex, especially the Codex CLI; in the past couple of months Codex has really changed the way in which we work. What we see is that people who are using Codex are just so much more productive. There's an aspect of Codex that allows people who don't code much to do a little bit of coding, but it also helps very experienced engineers get way more done, and that's what I'm really excited about, because a really experienced engineer can steer Codex and be just monstrously productive.

>> Is that a situation where you see engineering productivity patterns starting to shift toward multi-threadedness?

>> One use is to understand how existing software works. Another is to prototype a new idea to see if the juice is worth the squeeze, if you like.
And then lastly, it's to actually get the work done: implementing the new feature. We work in these large codebases, and usually the documentation isn't where it needs to be, or it's out of date, that type of thing. Codex, or other similar tools, can read much more quickly than you can and give you a pretty good answer very fast. If I have an idea for the product and I think, well, maybe this is super interesting and I could spend a bunch of time on it, I can throw together a prototype in Codex and decide, hey, this doesn't quite work the way I want it to, so maybe I'll choose something else to focus on. And then some of our most experienced, most productive engineers are just using it to build features, either refactoring code or building the front-end code itself, or really any aspect of the feature.

>> That gets at a use case I actually had for Atlas today, which was really interesting to me. I was looking at a GitHub repo, and I wanted to get a sense of what's in it really quickly. What if I just have Atlas look at it? I put the assistant up on the side and had a conversation with Atlas about the repo. What I found was that there was this sort of magic that happened when I was in Atlas. There's a magic to code comprehension, I'm finding, in the way ChatGPT touches and plays with code.
It was able to click through, take control of the screen, look at all of the different files in the repo, and it came up with some really thoughtful questions that enabled me to get much more fingertippy with the code much more quickly. It was one of those magic moments for me.

>> So with the browser, it's so simple it almost sounds silly, but reducing the friction for a user to access some of this magic can feel like making the magic available in the first place. You've always been able to take a web page, maybe print it as a PDF, or copy and paste it into ChatGPT, but that's a few extra steps. When you can just bring this up in situ and ask the question directly, suddenly it's there. One of my favorite use cases has been shopping. I do a lot of online shopping when I'm not working. Sometimes I'll be looking at a product and I can ask the sidebar: is this the best price I can find on this thing? When paired with our search agent, it will go off, browse the web, and find out whether that really is the best price or there's a better deal somewhere. I had one case where it hit really well: I was looking at a pair of shoes, and it found them available from a different site for about $60 less.

>> Wow.

>> And that was one of those really kind of wow moments.

>> Yeah. There's something around habit formation, where you get that dopamine hit of: wow, this is really easy, this is something I didn't realize I could do. What were some of the things that your team really had to wrestle with, around trade-offs and decisions, to make that browser come to life?

>> One is the product design itself, and the other is the technical infrastructure and all the magic that went into it. With the product, we wanted to design something that was really useful but also felt familiar. Like I said, everyone kind of knows what a browser is. We had lots of debates about how we should design features, even basic aspects of the browser. There was a lot of room for innovation in those areas, but it's also a bit of a double-edged sword. So we've tried to find a balance between something that feels familiar and something that still has improvements for folks looking for more efficient ways to get stuff done. Then on the technology side, this is where we spent a lot of time, both on the more traditional browser infrastructure, building on Chromium, and on how we built some of the more cutting-edge features like agent. We wanted to build a product that feels very fluid and fast, with a very cutting-edge user experience, and the standard way in which a lot of Chromium browsers are built just doesn't make that super easy. So we built a unique way of hosting Chromium; it's almost agentic in form. We run Chromium as an out-of-process service, so that when you start Atlas you're not actually blocked on Chromium starting up. The browser can start very, very fast, and Chromium takes however long it takes to start. And then when you run a feature like agent, it is doing things like synthesizing input events to click on things.
We're able to do that in a very robust and secure way. So there's a bunch of stuff like that that we've done. We did a technical blog post recently that covered this in more detail, and we're pretty excited about that.

>> Yeah, I recommend it: if folks listening have not read the technical blog post and you're at all technically minded, it's a super interesting read. I'm curious, given your experience and all the different browsers you've worked on: to what extent does Atlas feel like a fully solved problem versus a partially solved problem? Are there pieces of it that you're really excited to dig your teeth into next?

>> Yeah, I think this is really the first step in a long journey. When I talk to people about this in terms of browser history, I'd say this is the Netscape 1.0 moment for this new era of agentic browsers. We're excited to get the agent feature out there, but it's also very much a research preview. We are discovering use cases for it, we're really enthusiastic to hear how other people are using it, and we expect to make a lot of improvements, both to the experience and to accuracy and speed. Most folks haven't seen this type of functionality in a browser before, so we think it's really important to bring people along with this thing. You're very used to clicking on things yourself, so having a tool that can do that on your behalf is both exciting and sometimes maybe a little intimidating. So we want to be very clear about how the product works.
So the first time you use it, we give you a disclosure that tells you what its capabilities are, and we give you some options too. You can choose, for example, to have the agent run with your logged-in sites, just as if it really were browsing as you. Or you can choose to have it run logged out, and then you have to be very explicit about which sites you want it to log into, and you go and log into those yourself. There are choices like that to help people apply whatever their comfort level is to their use of the product.

>> Yeah, that makes a lot of sense. Is there a use case you heard from the wild post-launch that was most surprising or interesting to you?

>> Well, there's stuff that I think is super cool. Like I said, I'm an online shopping fiend, and I'm like, wow: I used to have to go to a different tab, search for those things, and then go try them back one by one. The fact that you could just set this and forget it, and it would try them all and pick the best one, was pretty cool. And then, not a personal use case but more of a work use case: I had obviously been diving through a lot of feedback from our launch the other week, so I wanted to put together a quick survey. I had a discussion with ChatGPT about some good questions, and then I asked the agent to go off and make a Google Form so I could run the survey. I've used Google Forms a bunch.
It tends to be a bit finicky with the different question formats and other stuff, but agent just figured it all out for me. A few minutes later I came back, I had this form, and I could publish it right away.

>> I can think of a lot of government websites that are very frustrating to use, that feel like they were built in the 1990s, that for me fall into that category. So, if we switch modes a little bit to the security side of things: how do you think about security with LLMs in the browser specifically? There are divergent opinions, ranging from "this is an impossible solve" to "this is tractable" to "we can make progress in this area." How do you think about it as a problem space?

>> Yeah. I think this is interesting for agent, which is a net new capability, and the technology is evolving; we expect to do a lot of work on it over the next while. I talked before about the onboarding and some of the choices you have about what sort of site access it has, whether it's authenticated or not. There are a few other mitigations we have in place. If it's going to do something sensitive on your behalf, like being in your email, it's going to want you to watch it. The analogy I give for this is that I have a car with an auto-drive feature. I'll be on the highway, I can turn on the cruise control, and it will even take the wheel and steer a little for me. But in return, what it wants is that I keep my eyes on the road, and it has this little camera in the dashboard somewhere that will shut it off if I'm not paying attention.
And so, similarly, in Atlas, if you're having the agent do work in one of these sensitive contexts, like your email, it wants you to be on that tab paying attention. If you switch away, it's going to stop. Another thing we have: if you've ever been in a machine shop and worked with a big lathe or some other machine, there's usually a big red button somewhere, and if the machine starts doing something you don't want, it's very clear: you hit the red button and it stops. The agent has that too. If you see it doing something you don't want it to do, you just hit stop. These are good tools to have, and I think they'll help people gain confidence that you're still the one in control of how it works.

>> Yeah, I think that makes a lot of sense. This is a moment where it's new again, and we have the opportunity to revisit a lot of these foundational primitives. That brings me to an interesting question. It feels to me like there's an opportunity to shift the browser experience further. I'm curious: if you turn on the high beams, to use the car metaphor, what does that look like for you?

>> One of the things I thought was special about the early web, if you go back to the 1990s and think about how people got software, was this: you would drive to a store, buy a box of a product with some shrink wrap, take the discs out, and install it. Then you think about the web, where you just click on stuff and it comes up on your screen. That was pretty magic. And then there's that aspect of using the web where you just go from site to site.
What actually resonated with me was that it felt kind of like how my mind worked. Now, if you fast-forward to LLMs today, it's even more accessible: I can just talk to this thing, and it will figure out what to do. That's maybe an idea of what the future looks like, where instead of having to dig through a bunch of settings menus, you can just tell the system what you want from it, and it will figure out how to do it. If I extrapolate from there, I actually think a lot of what people struggle with in their day-to-day lives is ambiguity. There's this thing that I want, but I don't quite know how to do it. When I first started using ChatGPT, it was kind of like a friend; it would say, "Oh, if you want to do this, you should think about taking one of these three steps." For something that's about the betterment of yourself, maybe you should go do those things yourself. But for a lot of things that are more tactical, it would be great if your agent could just go off and do that thing for you and report back on the status of it. So I think maybe there's some version of the future like that. But even though I say that, I also think people will continue to browse the web themselves, because there's stuff that, as humans, we want to do. We want to be entertained. We want to create things.

>> Yeah, that's a really rich area to dive into. It felt like there were two big buckets that browser work falls into.
There's the delight bucket, where you're trying to learn, you're curious, you want to be surprised. And then there's the "oh gosh, I don't want to do this, I'd prefer to avoid it, could someone please take care of that part for me" bucket.

>> Mhm.

>> And as I was reflecting, it's pretty easy to make the case, from an agent perspective, that the not-fun part is something you'd ideally want the agent to just go and take care of for you. But the fun part to imagine is how an agent could also enrich that delightful side of things.

>> Yeah, totally. Trying to reduce toil is definitely something we want to support. Browsers have been evolving for 25 to 30 years. What does it look like to take the next step in that process, and what is the direction, the trajectory, that is changing? I find that super fascinating, because I think we're at an inflection moment. When I think about the early web, a lot of the focus back then was just helping people understand the links that are out there, clicking on links, and going from place to place. You could go and find collections of links that people had an opinion about, whether they were good or not. As the web scaled, that stopped working, and then you got search, and search was transformative because it helped you find the little piece of information or the website you wanted to go to. Then people started building these rich apps, and, as you point out, we've hit an inflection point and now we're in this new stage.
We've made the way in which we can interact with this technology radically more human, and we can scale up the capabilities of the platform, and the platform will be able to do things on your behalf. I think this is really the third phase of the web.

>> Yeah, it's super exciting.

>> There's some UX we're exploring around it in a few different ways, but there are some compelling use cases. I ran into this before launch, when I was trying to synthesize a single document from a bunch of different sources that were open in tabs, and it wouldn't work. I don't think there's even a fixed number of tabs I wanted; it's basically subject to context window size.

>> We talked about ways of working and how the team is using Codex a little bit. One of the things I've been hearing really consistently, from small companies, large companies, and individuals, is that it feels like roles are blurring as we lean into LLMs more and more. I'm curious how the roles on your team are evolving as you work.

>> Yeah. So every engineer on our team is a product engineer, basically. This is someone who really thinks about the breadth of the experience. Every engineer on the team is empowered to own the design and development of a feature. So all of our engineers are talking to users, reading feedback, and figuring out how to integrate it into a future update. We have a bunch of tools.
I'm not super familiar with all of the tools we use, but I know that we use a bunch of LLMs, basically, to help us sift through feedback and pull out common themes, that sort of stuff. That helps us understand, at scale, the top pain points people are talking about through our user support forums and so on.

>> Yeah, that makes sense. If we pivot a little bit back to the future: "I would like to see this breakthrough happen," or "I would like to see this technical challenge solved, and then we will unlock this new experience." What pops out to you as significant milestones in the next, call it, 18 months to two years, where you're really excited to see something unlock as we reach a particular capability?

>> I think people will get more accustomed to this functionality, and so they'll seek to do more with it. That's the customer side. On the product side, we'll make that sort of magic more real and help you figure out the opportunities to leverage it. The capabilities will continue to increase at a breathtaking pace. Where I want to end up is a place where you can give this tool fairly ambiguous, complex tasks, and it will break them down and figure out how to make progress on your behalf. I used the word toil before: how do we get rid of some of that annoyance and make it more reliable, simple, and trustworthy? These are things we want to do.

>> One of the things we haven't talked about that I'd be curious for your take on: so much of our browsing happens on the phone.
Are you guys thinking in terms of mobile?

>> Yeah, that's another request that's coming in a lot. We're trying to figure out the best way to bring this functionality to mobile. One of the observations we have is that the way people interact with the web is a bit different on mobile. On the desktop platform, the browser is like an embedded operating system: all of your favorite apps, for the most part, are in the browser. Whereas on mobile, the mobile operating system itself is kind of like the browser, and people tend to have relationships directly with specific apps. But then, for the web, there are a couple of different use cases. One is: I want to go and read a specific website that I don't have an app for, and then the browser form factor makes a lot of sense. So we're at a stage where we're figuring out how we want to make browsing work on the mobile device, given these different use cases.

>> The way mobile works implies dramatically different usage patterns: what it looks like, from a small-screen perspective, to have the chat assistant there alongside the browsing experience. Super interesting challenges to get into.

>> Yeah, and there's a bunch of other interesting stuff. I think voice is a very compelling modality, where if you load a page you can ask follow-up questions and have all of that work. I think we just need to figure out the right way to build that.
>> And I would be remiss if I didn't dive into one of the more interesting features you've launched with Atlas, which is that you bring in the ChatGPT memory from previous ChatGPT conversations as part of the browse experience. I'd be curious about the product decisioning there, and how you think about that as an asset to the browsing experience and what that looks like.

>> Well, the chat memory feature of ChatGPT is an incredibly powerful one. It means that as you move from chat to chat, you don't always have to start from zero; in some sense, it makes it feel like it knows you a bit better. And that's a really interesting way in which the browser becomes more useful the more you use it.

>> One thing I always love to do as we bring this conversation to a close: if you could say one thing that you feel like people didn't quite get about the launch, and you'd love to say it again and emphasize it, what would that be for you, for Atlas?

>> So for Atlas, I think that this is a familiar tool, but with an amazing new set of capabilities. I encourage you to go and try it out for a whole bunch of different things.
I would say even challenge yourself to ask more questions of it, even things where you were thinking, oh, I don't need to ask that: just try it out and see what it does. This is the beginning of a journey for us in building this type of app. We push a new build every week, and as we hear more feedback from you, as you try it out and do a lot of different interesting things, we will make it better and better.

>> Well, thank you, Ben. Thank you for coming and chatting.

>> Yeah, it's been great.

>> Awesome. I appreciate it.
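The out-of-process design Ben describes, where the UI starts without waiting for the engine, and the agent drives the page by synthesizing input events across a process boundary, can be sketched roughly as follows. This is a minimal illustration, not Atlas's actual implementation (the real architecture is covered in the team's technical blog post); the subprocess here is a stub standing in for the Chromium service, and all names are made up:

```python
import json
import subprocess
import sys
import threading

# Stub for the heavyweight engine process (the Chromium service in Ben's
# description): it reads input-event messages on stdin and acknowledges
# each one on stdout.
ENGINE_SRC = r"""
import json, sys
for line in sys.stdin:
    event = json.loads(line)
    sys.stdout.write(json.dumps({"ack": event["type"],
                                 "x": event["x"], "y": event["y"]}) + "\n")
    sys.stdout.flush()
"""


class EngineService:
    """UI-side handle to an out-of-process engine.

    The subprocess is launched on a background thread, so the caller
    (the browser UI) becomes responsive immediately instead of blocking
    on engine startup; requests wait for readiness only when first used.
    """

    def __init__(self):
        self._proc = None
        self._ready = threading.Event()
        threading.Thread(target=self._launch, daemon=True).start()

    def _launch(self):
        self._proc = subprocess.Popen(
            [sys.executable, "-c", ENGINE_SRC],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
        self._ready.set()

    def synthesize_click(self, x, y):
        # Agent-style input synthesis: send a structured event over the
        # process boundary rather than letting automation touch the UI
        # process directly.
        self._ready.wait()
        self._proc.stdin.write(json.dumps({"type": "click", "x": x, "y": y}) + "\n")
        self._proc.stdin.flush()
        return json.loads(self._proc.stdout.readline())


engine = EngineService()  # returns immediately; engine warms up behind it
ack = engine.synthesize_click(120, 40)
```

Chromium's real service model is far richer (Mojo IPC, sandboxing, a process per site), but the shape is the same: the UI stays responsive while the engine comes up, and automation reaches the page only through explicit, inspectable messages.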