
Goldman Sachs Report, AI Coding Tools, Music AI Lawsuit

Key Points

  • Goldman Sachs released a stark report questioning the near‑term value of generative AI, contrasting its earlier optimistic claim of a 7% GDP boost with a now‑skeptical outlook that has sparked debate among the panelists.
  • Developer Pietro Schirano launched “Claude Engineer 2.0,” adding a code editor and execution agents to a command‑line tool, highlighting the next evolution of AI‑assisted coding and prompting discussion about who leads the Anthropic vs. OpenAI race.
  • Panelists praised Claude Engineer’s goal‑oriented, agentic design as a glimpse of the industry’s future direction toward autonomous, task‑driven AI agents.
  • The RIAA sued generative‑AI music services Suno and Udio for alleged mass copyright infringement, raising questions about how copyright law will shape AI training data and the broader creative‑AI ecosystem.


# Goldman Sachs Report, AI Coding Tools, Music AI Lawsuit

**Source:** [https://www.youtube.com/watch?v=Wlf6id2FH-Q](https://www.youtube.com/watch?v=Wlf6id2FH-Q)
**Duration:** 00:31:14

## Sections

- [00:00:00](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=0s) **AI Futures: Finance, Coding, Copyright** - The episode previews three hot AI stories: a Goldman Sachs report questioning generative AI's value, Pietro Schirano's Claude Engineer 2.0 coding assistant, and the RIAA's lawsuit against AI music startups over copyright infringement.
- [00:03:17](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=197s) **Debating AI Risk Estimates** - The speakers reflect on the recent generative-AI hype, reference Acemoglu's claim that only 5% of tasks are truly at risk, and debate whether that estimate under- or overstates the technology's broader economic impact.
- [00:06:21](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=381s) **Rapid Evolution of AI Landscape** - The speaker reflects on the swift turnover of AI models, from Llama 2 and Claude to Falcon and agents, criticizes the notion of an AI bubble or a missing "killer" app, and emphasizes that AI already permeates countless applications.
- [00:09:30](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=570s) **AI Energy Demand vs Supply Constraints** - The speakers compare the large power consumption of AI queries to traditional searches, debate whether energy availability will limit future AI deployment, and argue that algorithmic and hardware efficiencies will eventually ease these concerns while the real challenge becomes extracting value from the technology.
- [00:12:37](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=757s) **Claude Engineer Opens New Coding Horizons** - A discussion of Pietro Schirano's open-source Claude Engineer tool, which lets developers access Claude 3.5 Sonnet from the command line and adds agent-driven features that could expand AI coding assistance beyond simple autocompletion toward on-demand, Stack-Exchange-style help.
- [00:15:42](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=942s) **Assistive vs Agentic Coding Tools** - The speakers contrast Copilot's line-by-line code suggestions with Claude Engineer's agentic ability to scaffold entire applications and workflows.
- [00:18:47](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=1127s) **Ground-Level Innovation Drives AI Competition** - The speakers argue that real breakthroughs arise from on-the-ground engineers, noting a perceived shift toward Anthropic gaining an edge over OpenAI as developers build third-party products on top of foundation models.
- [00:21:49](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=1309s) **AI-Generated Music Copyright Lawsuit** - The speaker discusses the RIAA's lawsuit over AI models training on copyrighted music, highlighting its significance as the first major case in the music arena and comparing it to evolving norms around ebook piracy.
- [00:24:54](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=1494s) **Focusing on Musical Output Infringement** - The speakers explain how plaintiffs will target specific song elements such as chords and progressions, citing prior cases (e.g., Ed Sheeran, The Verve) to argue infringement, and contend that a fair-use defense is unlikely to succeed.
- [00:28:01](https://www.youtube.com/watch?v=Wlf6id2FH-Q&t=1681s) **Synthetic Music and Copyright Workarounds** - The speakers debate using AI-generated music and similarity metrics to create legally distinct works, arguing that embedding spaces and fair-use reasoning could sidestep infringement claims despite industry turmoil.

## Full Transcript
0:00Tim Hwang: Hello and happy Friday. 0:01You're listening to Mixture of Experts. 0:03I'm your host, Tim Hwang. 0:05Each week, Mixture of Experts brings together a wide range of specialists to 0:08separate the AI signal from the AI noise. 0:11We tackle the biggest stories of the week and distill them down 0:13to just what you need to know. 0:16This week on the show, three top headlines. 0:18First, the banks weigh in. 0:20Goldman Sachs is out with a harsh report on the future of generative 0:23AI, claiming the space still has a long way to go to prove its value. 0:26Are the bankers out of touch? 0:28Or do we think they've got some good points? 0:29Brent Smolinski: Feels like they went from one extreme to 0:32the other a little bit, right? 0:33Like, I almost feel like there's something in the middle. 0:36Tim Hwang: Second, AI developer Pietro Schirano is out with Claude Engineer 2.0, 0:40which adds a code editor and code execution agents to an already 0:43powerful command line interface tool. 0:46What does the next stage of coding assistance look like? 0:48And who's currently winning in the Anthropic-OpenAI matchup? 0:51Chris Hay: The interesting thing about Claude Engineer, it's really 0:53embraced the agent methodology. 0:55It's agentic, it is goal oriented. 0:57And I think that's where we really are going to be going as an industry. 1:01Tim Hwang: Third, the Recording 1:02Industry Association of America, or RIAA, has launched a 1:05lawsuit against generative AI music companies Suno and Udio, claiming 1:10mass copyright infringement. 1:11How might copyright shape the generative AI space, and what does it 1:14mean for the future of training data? 1:16Marina Danilevsky: They're not going to get their way, and at 1:17some point in time, they're going to have to learn to live with it. 1:27Tim Hwang: As always, I'm joined by an incredible group of panelists 1:29that will navigate what has been another action packed week in AI. 
1:32Today, we've got Chris Hay, Distinguished Engineer, CTO, Customer Transformation, 1:37Marina Danilevsky, Senior Research Scientist, and joining us for the very 1:40first time, Brent Smolinski, Global Head of Tech, Data and AI Strategy. 1:44Brent Smolinski: Thanks for having me. 1:49Tim Hwang: So first up, I want to change a little bit of what we do typically, 1:52and I want to ask you all a yes or no question, and then we're going to 1:55actually dive into the story of the week, which is the Goldman Sachs report. 1:59And the question is this: is AI a bubble? 2:02Chris? 2:04Chris Hay: No, definitely no. 2:05Tim Hwang: Marina. 2:06Marina, what do you think? 2:07Marina Danilevsky: Yeah, no. Generative AI, a little bit. 2:09Tim Hwang: And Brent, what do you think? 2:11Brent Smolinski: Yes, sort of. 2:14Tim Hwang: Well, with those extremely definitive answers, let's move 2:16on to our story for the week. 2:17Brent Smolinski: Listen, I think if you look back, you know, a little over a year 2:22ago, when Goldman Sachs first published their, uh, article on kind of the impact 2:27that generative AI would have on the market, they predicted something 2:30like 7 percent GDP lift, uh, which is, just as context, that's the size of 2:35the North America healthcare market. 2:37That's a massive impact. 2:39I think now what we're starting to see is, uh, in their last publication, 2:42they're beginning to backpedal on that. 2:44Tim Hwang: Yeah, and I think it's a great intro. 2:45I think that's exactly what I wanted to talk about. 2:47I mean, just last week, or I think just a few weeks ago now, Goldman 2:51Sachs released its update, Brent, that you're kind of referring to, 2:54and they kind of fessed up, right? 2:56I think the end conclusion of that report is that the current state 2:59of generative AI is, quote, too much spend for too little benefit. 
3:04Um, and this follows on the heels of some other cautious statements 3:06coming out of Sequoia, obviously a prominent VC fund, and McKinsey, which 3:10works with a huge number of companies and has been kind of like at the 3:12forefront of, I think, pushing sort of generative AI as an enterprise use case. 3:17And that's kind of where I want to start today is just to kind of like go into this 3:22kind of moment of, you know, hesitation. 3:23I mean, I look at the last 24 months, which have been crazy growth in 3:26generative AI and crazy excitement. 3:29Um, but I think now the industry is kind of like almost thinking a little bit 3:32about like, okay, so what happens next? 3:35And I guess Marina, maybe I'll throw it to you next. 3:37I mean, one of the numbers that was most striking to me that was 3:40kind of cited in the Goldman Sachs report was that, you know, they, they 3:43talked to Daron Acemoglu, who's this kind of prominent MIT economist who 3:47estimates only about 5 percent of tasks are really genuinely at risk 3:52from what's been happening in generative AI. 3:54I guess, Marina, do you buy that as an estimate as someone who kind of works 3:57on the technical side of all this? 3:58Like, do you see capabilities, you know, really becoming much broader over time 4:02or, or really do you think this estimate is kind of inaccurate of like, you 4:05know, the kinds of tasks that are really going to be at risk in the economy? 4:09Marina Danilevsky: I think the state at which the tech is right now, I'm actually 4:12not that far off from what, uh, what he says as well, what Daron says, um, 4:18there's still a lot to be thought of as far as things that we could do with 4:21this technology, but it's very clear that we haven't quite thought of it yet. 4:25So when it comes to, you know, is it a bubble right now? 4:28Yeah, a little bit as far as the hype versus what the capabilities actually 4:31are, what the reliability actually is. 4:34I think we need to continue to think. 
4:36Something that's always really interesting about core technological research is 4:39you don't know what the applications are sometimes until sometime later. 4:43So it's always very interesting to push those boundaries, but yeah, there's 4:47gotta be a difference between hype and actual usability, especially when 4:51it comes to things that are reliable. 4:53At the moment, it's good as an accelerant. 4:56It's good to speed up 4:58people and tasks that they're kind of doing right now. 5:01All right, but that's not enough. 5:04Tim Hwang: Does it, like, justify the valuation of NVIDIA as the most, you know, the 5:08most valuable company, um, in the world? 5:10I mean, I guess Chris, you know, I think I recall the last few times 5:13you've been on the show you've always been our hard-bitten cynic. 5:16Um, I don't know if you, uh, agree with Brent and Marina here, or 5:20if you're more of a contrarian, like you actually feel like you're more optimistic 5:23than what these bankers are saying, because what do bankers know anyways? 5:26Chris Hay: I love the hype. 5:27We wouldn't have this podcast if there wasn't any 5:29Tim Hwang: hype. 5:30Chris Hay: So no, I enjoy this every few weeks, the hype has to stay. 5:38I read the report though, and I think, I can't remember who said 5:41it, but one of the guys said, uh, nothing's going to happen for the next 5:4410 years. 5:45You know, this generative AI is a complete waste of time. 5:48And, and I was just thinking about like back to the 60s when someone said, you 5:51know, there's only a need for maybe five mainframes in the entire world. 5:56And I'm just like, oh my goodness, I wouldn't want to 5:58be writing that on that report. 5:59I, I wouldn't want to be quoted in 10 years' time being the guy that said 6:03generative AI was a waste of time. 6:05So, uh, now I think it is early. 6:09Obviously, um, but the technology is progressing so fast. 
6:13I mean, I, I was talking to a customer yesterday and I brought up the models that 6:18were popular this time last year, right? 6:21And if you think about this time last year, right, Llama 2 just came out. 6:25There was no such model from Mistral. 6:28Right. 6:28The, the Granite models were just out at this point. 6:32Claude 2, it just came out, never mind Claude 3.5 Sonnet, right? 6:36There were no Turbos in OpenAI. 6:38So, and we were talking about the Falcon models. 6:42We were talking about Vicuna. 6:44Nobody talks about these anymore. 6:46The, the, everything has moved so fast. 6:48And then this year we're like agents, agents, agents. 6:52So. 6:53If I look at the kind of time frame they're talking about, next three 6:56to five years, ten years, this industry is moving so fast and the 7:01capabilities are getting so much better. 7:03I, I, I'm happy for them to say it's a bubble because that's going 7:07to create more space for people to get on and do the work, so. 7:10Like more opportunity. 7:11Yeah, exactly, but this is not going away. 7:14That's for sure. 7:15Yeah, 7:15Brent Smolinski: I mean it feels like they went from one extreme 7:17to the other a little bit, right? 7:19Like, I almost feel like there's something in the middle. 7:22I think one of the analysts said that there's no killer AI, application of AI. 7:27I mean, that to me seems like an odd statement, right? 7:30Because, I mean, first of all, AI permeates. 7:34ChatGPT. 7:34Yeah, I mean, well, I think that's what he meant, right? 7:37Like, I think what they're getting at is, is, is really, when they say AI, 7:40they meant large language models, right? 7:42But the reality is, is AI permeates, like, so much of our, uh, applications today. 7:48Uh, I, I mean, it's just, uh, I mean, even in this, uh, this session that 7:52we have right now, AI is being used to do signal processing, clean up 7:56the videos, and so on and so forth. 
7:58So, I mean, AI permeates just about everything we, we interact with, all 8:02applications we interact with today. 8:04So, so I don't quite get that statement. 8:06I think he, what he meant was, uh, large language models. 8:09Marina Danilevsky: You know, I agree with you a lot, actually. 8:11AI has been around for a very long time and does a lot of 8:13interesting and good things. 8:15AI just means, hey, the computer's doing something useful. 8:17There's some kind of, you know, statistics processing happening. 8:20And generative AI is actually a relatively small part of that. 8:23Yeah. 8:24So it's not that fair to take AI and say, okay, this is the 8:27only AI that matters anymore. 8:28Yeah, it's the one we're paying attention to, but it's actually 8:31built on the shoulders of giants, 8:32in some way. 8:33There's been so much work going on for so many decades. 8:35Brent Smolinski: So the relentless march of AI progress, right? 8:38It's just, you know, the technology continues to evolve and continues to 8:42permeate our application landscape in ways people don't even realize. 8:48Tim Hwang: Yeah. 8:48And I think that is something that we like do forget quite a bit is 8:51that, you know, for a long time it was like, eh, what's happening in NLP? 8:54Everything's about computer vision. 8:55That's the really exciting thing. 8:56Or like, everything's about reinforcement learning. 8:58That's really how we're going to get to, you know, next generation systems. 9:01And then kind of just like, everything sort of like flipped 9:03in a very unexpected way. 9:05It reminds me of this tweet that I saw that I thought was amazing. 9:07So there's this adage in financial markets, which is like, the 9:09market can be irrational longer than you can stay solvent. 9:12And the person's tweet was basically that like, um, you know, 9:15a sigmoid can stay exponential for longer than you can stay solvent. 9:18So like, you know, which I think is just, you know, it's 9:21like, beautiful in some ways. 
9:23I think one thing I did want to touch on in the report is that it does 9:25focus on some interesting potential constraints on growth, which I 9:29think are really genuine, right? 9:30Like, I think we debate about how far the tech can go in the economy. 9:34But I think one of the most interesting stats they cited was the idea that, 9:38you know, per query, the power draw for something like OpenAI is 9:42like 10 times the power draw for a query on something like Google. 9:46And, you know, it is true that, like, energy is becoming kind of a 9:49constraint on these things, right? 9:50Like, if you want to run mega, mega, mega clusters, it actually just 9:54turns out that, like, in the United States, there's actually, like, only 9:56a few places that have the physical plant that's necessary to do this. 10:00And I guess I'm kind of curious if you all sort of buy that, is that we, we may, 10:05you know, I kind of buy the argument that, like, well, are we demand constrained? 10:08It's kind of anyone's guess, but we may very well become supply constrained. 10:11Brent Smolinski: Yeah, I mean, listen, uh, these same kind of arguments were applied 10:15with cloud computing like 10 years ago. 10:17People were worried about power consumption, yet we're able to build out 10:21the infrastructure, solve these 10:23power problems. 10:24I would argue even the algorithms underlying a lot of these models 10:28are improving and becoming much more efficient, which translates 10:30into computational efficiency, which translates into energy efficiency. 10:34So I, I think these problems will get solved, right? 10:37I think in my mind, the biggest problem to figure out is how do 10:42I get value out of this, right? 
10:44Once we kind of begin cracking the kind of the AI, the value problem with 10:49a, you know, applying some of these, these large kind of transformer-based 10:53architectures to real world business problems, I think that's going to 10:56unlock a floodgate of demand, right? 10:59Um, and then at that point, at that point, we can begin talking 11:02about the supply constraint. 11:03But right now, I think it's a second order problem to think about. 11:07And I'm 11:08very confident this problem will get solved. 11:10Tim Hwang: Marina, I'm curious if I could turn to you as just a kind of a 11:13question on the last thought on the story. 11:15I mean, is it right to say, maybe the right way of thinking about this, and 11:17I don't know if you agree with the statement, is that, you know, there 11:20may very well be a bubble in something like language models, but I think we 11:24should doubt whether or not there is a bubble in sort of AI writ large. 11:27I don't know if you'd agree with that as kind of a way of sort of framing up, 11:30you know, what's going on here? 11:31Marina Danilevsky: I don't think there's a bubble in AI. 11:32I think there are Ecclesiastes seasons. 11:35We have winters, we have summers, and it goes, and it goes. 11:37So right now, there's a lot of attention. 11:39But also, I'm like, all right, I was doing NLP before it was cool. 11:42I'm gonna be doing NLP after it stops being cool. 11:45Like, those of us that are on the ground are just going to continue to push and 11:48that's where these things come from. 11:50Sometimes it becomes of interest to people, sometimes it doesn't. 11:53Um, to the thing that you had said before, you know, does it make 11:55sense to throw a large language model at every single query? 11:59Maybe not, but I think right now because the technology is early, everybody's 12:02just seeing, let's see what it can do. 
12:03Let's test it as much as we can and it will eventually settle into, it's no 12:07longer a hammer in search of a nail, it'll settle into something that we're just 12:10as comfortable with as with search when it first, uh, started being a big thing 12:15and everybody's like, oh, we're done. 12:16No more information organization necessary. 12:18We've solved it. 12:19No, but it's very, very useful nonetheless. 12:22So yeah, I think, I think we're in a season. 12:24The season will pass. 12:29Tim Hwang: So for our next segment, I think the thing I really wanted to 12:32focus on was that there was a cool little thing that was being passed 12:35around, uh, Twitter fairly recently. 12:37So Pietro Schirano, who is this AI engineer and kind of a serial 12:40entrepreneur based out in the Bay Area, um, just updated a project that 12:44he maintains, an open source project called Claude Engineer, um, and it's 12:48basically an open source project that allows coders to use Claude 3.5 Sonnet, 12:52um, from the command line. 12:54And, you know, what I love about this project is there's a bunch of kind of 12:57creative features running under the hood. 12:59You know, he's playing around with agents and he's playing around with 13:02just like a bunch of these kind of like little quality of life improvements. 13:05And you know, again, this is not a big release from an Anthropic or an 13:08OpenAI, the kinds of things that we've talked about in the past. 13:11But I do really think it's kind of interesting because, you know, I think 13:15we've been so locked into like Copilot as like thinking about how coding 13:19assistance works with generative AI. 13:22Right. Yeah. 13:22And I think what Claude Engineer is playing around with is to say, well, actually, in 13:25the future, we might want to do more than just like predictive code, like kind of 13:28like Stack Exchange on demand, basically. 13:31Um, and so, you know, I guess, Chris, I see you nodding. 
13:34Maybe I'll go to you first is, you know, as, as I'm wondering if you could 13:37explain to people who are listening, for maybe non-experts, not coding day in, day 13:41out, like what is the kind of promise? 13:44Do you see anything sort of interesting in what's happening with Claude Engineer? 13:47Like, what does the future of kind of coding assistance with AI look like? 13:51Um, and if there's like particular things you think are cool in the Claude Engineer 13:53product or project or, or otherwise, just be kind of curious about how you think 13:57this kind of whole interface evolves. 14:00Chris Hay: Yeah, I think it's really interesting what he's 14:02done with Claude Engineer. 14:03It is so simple. 14:04It is literally just a command line application. 14:07You run it in the terminal in VS Code, so no extensions or anything like that. 14:11You put in your Claude key, and then it uses all of the tools 14:16that you would normally have with agents running in the background. 14:18So you give it a task, a goal, and then it can create folders on your machine. 14:24It can go and create entire files, and then it can stitch that all together 14:28to help you build entire applications. 14:30And when I think about this for a second, Copilot is very typically a 14:35kind of prescriptive model in the same way as we chat with our interfaces. 14:40The interesting thing about Claude Engineer, it's really embraced 14:42the agent methodology, it's agentic, it is goal oriented, and 14:46I think that's where we really are going to be going as an industry. 14:50So rather than me sitting there typing in a couple of letters, you know, waiting 14:54for Copilot to come back with a response and then it gives me a bit of a code 14:58segment, I don't like it, I delete it, and then I sit and pause again, you know. 
15:02The Copilot pause is going to go away and we're going to give these 15:05agents goals and tasks and they're going to come back and help us build 15:09entire applications and, and, and really sort of start to orchestrate 15:13and, and build workflows there. 15:15And what's really going to happen, I love Claude Engineer, but I suspect Copilot 15:20is just going to steal all of that and build it into their extension anyway. 15:24Tim Hwang: Yeah, that's right. 15:25I mean, I think, yeah, that is one really interesting element of this is 15:28like how much projects like this can survive going forwards because they just 15:31get absorbed directly into the product. 15:33I guess, Chris, maybe if I can kind of turn the screw one more time, I think 15:36there's one sort of comment that you just had there about basically like Copilot 15:39being very prescriptive in nature. 15:42Um, and do you want to talk a little bit more about that? 15:44I guess what you're kind of saying is that like when you use Copilot, it literally 15:47recommends the code that you should be using as an engineer versus, I guess, 15:52where you're contrasting here with, uh, Schirano's project is more just that, 15:55like, you're specifying more of an objective, and it's kind of like assisting 15:59you in getting to that objective. 16:00Is that, is that the distinction you're 16:02Chris Hay: Assistive as opposed to agentic. 16:04So when I'm in Copilot, I will type a comment or the first couple of 16:09letters, and then I kind of wait. 16:11So, uh, it's not giving me an end to end goal. 16:14It's not building me an entire application. 16:17It's, it's really just a smart, um, uh, IntelliPrompt to, to be honest, right? 16:24So it's then just going to complete, uh, the piece of code that I'm writing. 16:27So maybe deal with that at a function level, it might deal with it at a line 16:30level, whereas in an agentic approach and with Claude Engineer, it's, yeah. 
16:35Starting to scaffold entire applications and entire workflows and orchestration. 16:40And that's a completely different mindset from what Copilot has today. 16:44And I think that's the big shift that's happening. 16:47Um, again, it's really simple. 16:49It just runs on a command line. 16:50It's beautiful. 16:51Um, But I think a lot of people are going to riff off of that and we're going 16:54to get tons of tools and I'm excited. 16:56Tim Hwang: Marina, I'm curious if you have any thoughts kind of on where some 16:58of this goes and in particular I was sort of interested because you know what's 17:02cool about it is I think what Chris is kind of coming back to over and over 17:05again which is sort of like it's just in the command line right like it's 17:08almost like unfancy it's like and we don't have to make a big deal about the 17:11AI being part of your coding experience it's just like in the command line. 17:15But I'm kind of curious about like What else you think might be coming down the 17:18pike with this kind of project, uh, in particular, you know, I think one of 17:21the reasons I think we were excited to have you on the panel for this episode 17:25is like, you know, starting to combine stuff like, okay, well, we have agents, 17:28and then we've also got rag, and then we've got, you know, there's a bunch of 17:31things that you think can start to connect together, um, and, uh, and just kind of 17:36curious about how you think, like, you know, these types of patterns go going 17:39forwards for, for coding assistants. 17:41Marina Danilevsky: Two directions. 17:42One is, uh, by engineers for engineers, which this is a much more of a 17:46by engineers for engineers thing. 17:47Like why do we have IDEs? 17:49Why do we have more than one? 17:50A lot of these things really got created by people because they 17:53say, look, I know my workflow better than you know my workflow. 17:56I'm going to create tools that work for me. 
17:58Other people are then going to be able to make use of it 18:00and say, yeah, that's great. 18:01People used to have the, you know, Emacs versus Vim fight. 18:04Now we have, you know, Eclipse versus VS Code versus whatever. 18:08But it really, most of those features do come from people saying, 18:12this is something that's helpful to me and I'm going to do it. 18:14So that's where this project sort of falls for me. 18:17On the flip side, when you start to be able to combine things, 18:19we might finally have something interesting going on in the low 18:21code, no code space, which up to now has been like, isn't it great? 18:25We can, you know, arrange some visual blocks and you stick it together. 18:29And that's like programming. 18:30Tim Hwang: And you're like programming. 18:31Marina Danilevsky: You're programming now. 18:32No, you're not. 18:33Um, so we might actually be finally seeing something kind of interesting there. 18:37Although again, the persona is different, so you do have to design different things. 18:40But this goes to the fact that most of these things really come 18:42from, I think, individuals, even if Microsoft adopts it later. 18:47It's still the people on the ground that come up with the idea and go, 18:50okay, this is what actually works, guys. 18:51Here, do it this way. 18:53That's where, you know, that kind of innovation comes from, in my opinion. 18:57Tim Hwang: Yeah, for sure. 18:57Yeah, I'm looking forward to this gen creating, like, a new generation of 19:00endless nerd fights that are like Vim versus Emacs, but, dating myself, yeah. 19:07Yeah. 19:08Um, Brent, I guess I'm curious if you want to zoom up for us a little bit and 19:11kind of talk about this in the context of the broader competition, right? 19:15So I see this as kind of like, you know, at least for me, I think the 19:18vibe shift has been Anthropic is now ahead of OpenAI a little bit, right? 19:22Like they're, they're the cool cats. 
19:24They're doing the really interesting things, but I think a big part 19:26of the battle is like, what we're seeing here with Claude Engineer right 19:30is like our third party engineers being like, this is so cool. 19:34I'm going to design my own third party product on top of like the services 19:38these foundation models are providing. 19:41And yeah, I'm kind of curious about your take here about like this evolving 19:44competition between OpenAI and Anthropic, I guess ultimately for the 19:48hearts and minds of like engineers that are producing, you know, code out 19:51there in the world, and, and if you've got a feeling on like who's winning, 19:54who's advantaged, who's up, who's down. 19:56Brent Smolinski: Well, it certainly feels like things are shifting 19:58towards Anthropic and, and Claude. 20:00That's for sure. 20:01I think a big part of it, I mean, is the economics, the cost effectiveness 20:06of these Claude models is, uh, they're, they're significantly more 20:10cost effective than, than OpenAI. 20:12And so with many of my clients, they're actually, um, moving 20:16away from, uh, OpenAI towards, towards Claude for that very reason. 20:20Tim Hwang: That's really interesting. 20:21Brent Smolinski: All I do have to say is, is we recently did an, an engagement with, um, the 20:26senior executive team, uh, at one of our clients, and we developed, uh, the 20:30team developed this amazing prototype. 20:32It was an RFP, uh, generator, and they were able to develop this in, 20:37like, three weeks or, I mean, some incredibly short period of time. 20:42It was very powerful, almost, I mean, you can almost use it 20:44to generate these, these RFPs. 20:46There's a few tweaks you'd have to make at the edges. 20:48And I think everybody was, was blown away, um, uh, by how quickly they 20:53were able to pull this application together and again a lot of it. 
They built this on Claude, using a lot of these code-generation tools as well.

Tim Hwang: Well, I'm going to move us to the final segment of today, and I apologize: as a person who trained as an attorney, I'm always watching the legal side of all this. And so I was very curious to see, and get the opinions of the panel on, a story that just happened a few weeks back. Um, the Recording Industry Association of America, or RIAA, is basically the music industry's representative, lobbyist, and advocate in the United States. And they launched a high-profile lawsuit against two companies, Suno and Udio, which are in the generative music space. So the idea is, if you've played around with a product like Suno, you download the app and basically say, I want a song that matches the following characteristics, and it just generates the song. And it's actually quite good. Um, and it presages this kind of really strange world where you're just like, you know, you like Taylor Swift? Cool. You can just get a hundred hours, a thousand hours of Taylor Swift-sounding noise basically going forwards. Um, and the RIAA sued both of these companies essentially claiming copyright infringement, right? And a big part of their claim leans on the fact that these companies are training on music that is ostensibly owned by rights holders. Um, and so we're about to see this big showdown. You know, similar versions of this lawsuit have popped up around OpenAI and Anthropic and other companies, but I think this is the first time we've seen a really high-profile one happen around music, which I think is very interesting. Um, and I think the other thing that's very interesting to me is how it's going to evolve, right?
Like, you know, for example, in the book space, right? For like Kindle, I feel like there was a period of time where piracy didn't really take off, and so we have certain norms around ebooks that we don't have around, say, music, right? Um, and I think we're starting to see that evolution happen around different generative AI applications. Um, and I guess, Marina, I kind of want to toss it to you: as someone who's a researcher in the space, training models in the space, I think the big question for me is, how do you think about these kinds of lawsuits? Right? Because there's one point of view which is, well, look, if the RIAA gets its way, there's kind of no way to build these products, just because of the sheer number of music files you need to put into these kinds of models to get them to have high performance. Um, do you think that's the case? Or am I kind of overstating the risks here?

Marina Danilevsky: Um, I mean, I think this, as usual, is going to revolve around discussions of fair use. And if you train on the music, just as if you train on text, and then you throw it away and you just keep the features and the weights for the model, what if that counts, what if it doesn't? Um, but first of all, again, the cat's out of the bag; people are going to do it anyway, so you've got to figure out a way to deal with it. Um, second, this reminds me of, you know, discussions of, well, what if somebody posts something on an internet platform? I'm going to misremember what the legal thing is. It's the DM-something-or-other of, like...

Tim Hwang: DMCA, yeah, the copyright...

Marina Danilevsky: Yes, of the, like, you can't sue me just because somebody put something bad on my platform.
I don't know. It just reminds me of the same kind of thing, where people are going to continue to develop the technology, and you're going to have to find a way around it. The RIAA is going to push for what they're going to push for. They're not going to get their way, and at some point they're going to have to learn to live with it, because again, you can't stop people from doing it.

Tim Hwang: Yeah, I mean, it does remind me a little bit of the early 2000s, right, where Napster came up, and file sharing became a thing, and the RIAA did the same thing, which is, like, high-profile lawsuits against file sharers. And then, I guess, Marina, to your point, it didn't really stop file sharing. But it also didn't break the music industry, right? Uh huh. That's right.

Chris Hay: So I think the RIAA, or whatever their acronym is, is going to wipe out Suno and Udio. They are going to win this. So I spent this morning reading the complaint, and they went after the angle that I thought they would go after. Of course they were going with the inputs, but if you look at the actual complaint, they focus in on the outputs, the individual songs. So they brought up, um, I think one of them was Chuck Berry's "Johnny B. Goode," and then they brought up some other one in one of the other complaints, and they brought up the musical chords. And then they were like, this has this style and this has this style, these notes are identical, and this is why they're going to win. And, and...
The reason they're going to win is there's prior case law, uh, and Tim, you can speak about this a lot more, where you've seen, like, the Ed Sheeran case, where he had to prove that he didn't steal from this one song. And then there was that one where, um, I think the Verve had stolen some lick from, like, the 1960s, and they couldn't play their song for ever so long. And so there is precedent on not being able to use outputs that have similar chords, similar musical progressions, yeah. And they're going to hit them with that, and they're going to win, and there's literally nothing they're going to be able to do about it. So even if you make the fair use argument that you make with books, that's not going to hold true on the outputs, because you're just going to point to prior case law and say, well, actually, this was a copyright infringement, this was a copyright infringement. And then you're going to have to pay for all of those outputs. So what will probably happen with the generative AI there is they're going to have to start to check the outputs of songs to see they don't infringe existing songs. So I think it's going to get super messy, but they're going to win, big style.

Tim Hwang: Yeah, and I think the messiness is really interesting, because, you know, I used to do a bunch of work, and still do, around trust and safety on AI, right? And there you're trying to say, like, well, we're going to use RLHF and we're going to create all these mechanisms to try to constrain the behavior of the model, right? And a lot of what you learn is that anything you try to do to prevent model behavior, or block the impermissible behavior, there are lots and lots of ways of subverting it, right? Particularly against a user that's adversarial.
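The output-checking Chris describes could, in the simplest case, work like content moderation over audio embeddings. This isn't anything the panel specifies; it's a minimal sketch assuming generated and catalog tracks have already been embedded as fixed-length vectors, with a made-up threshold and hypothetical track names:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def flag_possible_infringement(output_embedding, catalog, threshold=0.95):
    """Return catalog track IDs whose embeddings exceed the threshold.

    `catalog` maps track IDs to embedding vectors. The 0.95 threshold is
    an arbitrary placeholder, not a legal standard for infringement.
    """
    return [
        track_id
        for track_id, emb in catalog.items()
        if cosine_similarity(output_embedding, emb) >= threshold
    ]

# Hypothetical two-dimensional embeddings for illustration only.
catalog = {"johnny_b_goode": [1.0, 0.0], "unrelated_track": [0.0, 1.0]}
flag_possible_infringement([0.99, 0.1], catalog)  # flags "johnny_b_goode"
```

In practice the hard part is exactly Tim's objection: picking an embedding and threshold at which "too close" reliably tracks what a court would call infringement.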
Um, and I think part of the worry that I have here is, sure, you're setting up a world where it's like, look, your model can output stuff that sounds almost exactly like this copyrighted training data, but then basically you're saying, okay, company, now you're responsible for preventing that. And I guess I would ask the question of, like, is that actually possible? From a technical standpoint, we don't have a whole lot of examples of being able to really categorically block this. Or, at the very least, it raises the question: when is an output so close to the training data that it really should be a copyright violation? I think that's kind of an open question.

Chris Hay: I think it's a hard thing, right, because there's so much music. As far as the record companies are concerned, it doesn't matter for them. Anytime they find an infringement, they're just going to sue the company, right? And it's going to be so difficult that it's not going to be worth anyone's while. Um, so, I said I think it's going to be interesting and hard. I think this case is different. Maybe it turns out not to be the case, and maybe we'll generate so much music, and maybe synthetic data will actually be the solution to this, because you just therefore invent a completely new style of music that isn't based on the past. And then the outputs are not going to infringe. Maybe that's the solution. But, uh, it's definitely going to get messy. I just don't see these companies surviving it.

Marina Danilevsky: Maybe not these two, but I think the actual technology is going to survive.
They might kill these two, but I agree with what you said at the end, actually, Chris, which is: okay, so you find some way to figure out what is a dissimilar-enough distance between pieces of music that it's okay. And you just do that by looking at all the music that's out there and saying, well, these two are, you know, this far apart, so you can't say anything. There have already been years and years and years of study on this. We've got Pandora and Spotify, and how do we do radio, and how do we do recommendations, and all the rest of it. We've got an embedding space to work with. We've got things to do there. So they'll just keep pushing to the point where it's absurd to have the RIAA complain about a really specific thing. And that's where I'm going to be...

Chris Hay: Bang on, Marina, right? That is the way around this, because then you get to make the fair use argument again, because you're saying, well, I'm sampling these different cases, I'm not infringing anybody's copyright. So I totally agree. I think we're going to end up with a new style of music, and that will be the interesting thing.

Brent Smolinski: Chris, you know, you bring up a very interesting argument. And the question I have is, how creative can these platforms actually be? Is what they create truly original, or can it truly be original? Uh, and then the other question I have, too, is, you know, these platform providers, they're just platform providers, right? And so the question is, should these platform providers be held liable for the content that's created, or should it be the people creating the content?

Tim Hwang: Yeah, for sure.
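Marina's embedding-space idea leads naturally to the compensation question Tim raises next: if a generated track measures as partly close to several artists, split the royalty in proportion to those similarity scores. The panel doesn't propose a formula; this is a minimal sketch under that assumption, with an invented `floor` cutoff for treating low similarity as coincidental, and placeholder artist names:

```python
def attribute_royalties(similarities, floor=0.05):
    """Split a royalty pool across reference artists by embedding closeness.

    `similarities` maps artist names to similarity scores in [0, 1].
    Scores below `floor` are treated as coincidental and dropped; the
    remaining scores are normalized so the returned shares sum to 1.0.
    Both the floor and the proportional rule are illustrative choices,
    not anything established in law or in the episode.
    """
    relevant = {name: s for name, s in similarities.items() if s >= floor}
    total = sum(relevant.values())
    if total == 0:
        return {}
    return {name: s / total for name, s in relevant.items()}

# Tim's hypothetical mix: 10% Kanye, 30% Katy Perry, 10% Taylor.
attribute_royalties({"Kanye": 0.1, "Katy Perry": 0.3, "Taylor Swift": 0.1})
# Katy Perry receives the largest share (0.6 of the pool).
```

The open engineering problem is the input, not the arithmetic: producing similarity scores that listeners, labels, and courts would all accept as meaningful.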
And I think we will eventually see stuff like... I mean, on YouTube right now there's this really interesting thing, which is a form of Content ID. So the idea is, well, if you want to use copyrighted music, you can have it in your video, and basically there's a royalty that gets paid out if it's detected that you're using this kind of audio. And so there could be these really interesting models that emerge where it's like, well, you know, you're allowed to do a Katy Perry sound-alike; she just gets paid off somehow. And then, I mean, Marina, to your comment, the really interesting question ends up being, well, how do you figure out the compensation based on closeness in the embedding, right? Like, is this 10 percent Kanye, 30 percent Katy Perry, 10 percent Taylor? How we would actually go about designing that kind of embedding space is going to be a super, super interesting engineering problem. Um, so, as usual, we have more things to talk about than we have time to talk about. But we are out of time for today. So, Chris, Marina, Brent, thank you for coming on the show. As always, it's an awesome discussion, and we'll have to have you all back at some point. Thanks for joining us. If you enjoyed what you heard, you can get us on Apple Podcasts, Spotify, and podcast platforms everywhere, and we will see you all next week.