Learning Library

← Back to Library

Mayo Clinic AI: Imaging, Genomics, Memory

Key Points

  • Mayo Clinic announced two AI initiatives: an automated radiology workflow that generates reports, assists with tube/line placement, and detects changes in chest X‑rays, moving from anecdotal success to a production system.
  • In partnership with Azure, Mayo is creating a reference human‑genome dataset by combining its exome data with large‑scale genome data, aiming to use AI‑driven models to accelerate personalized‑medicine analysis.
  • Chinese AI developer MiniMax introduced a model with a 4‑million‑token context window and “perfect recall,” marking a new generation of LLMs capable of handling extremely long inputs.
  • OpenAI’s ChatGPT beta now offers an “Advanced memory” feature, further enhancing the model’s ability to retain and recall information across interactions.
  • These developments highlight a broader trend toward extending context windows and improving memory in large language models, enabling more sophisticated and high‑stakes applications such as medical imaging and genomics.

Full Transcript

**Source:** [https://www.youtube.com/watch?v=ElQ7deX014I](https://www.youtube.com/watch?v=ElQ7deX014I)
**Duration:** 00:05:17
[0:00] AI news today, we've got a bunch of fun ones for you. It never gets old. I like to talk about actual applications of AI, and I think it's really interesting to see them in a medical setting, because those are really high stakes; you have to get them right. Mayo Clinic has published two different use cases for AI recently that I wanted to call out. One is developing an AI model for automated radiology workflows, which I'm not super surprised by, because there's been a lot of anecdotal reporting of how good large language models are at reading X‑ray images. What they've done is use it to build report generation into the X‑ray workflow. They're using it for tube and line placement evaluation, which is super interesting, and they're looking at change detection in chest X‑rays, which fits right in with the anecdotes. But there's a difference here, right? It's one thing to have some bro on X say, "I got this X‑ray looked at; look at what Grok did, look at what ChatGPT did." It's another to have Mayo Clinic say this is good enough that we're actually going to build it in. I think that's a moment.

[1:09] The other thing they're working on, with Azure, is combining human genome data with Mayo's exome data sets. They're basically looking at how to build a reference, sort of "perfect" human genome. I know that sounds like the start of a sci‑fi movie, so we're just going to pass over that part. They're looking to build essentially a reference data set around the human genome with Azure that they can then use, when they look at genome variants, to build personalized medicine. The details are somewhat sketchy; I could not tell you exactly how they are using AI to scale personalization, except that, in general, the compute for personalization has been really difficult, and AI is often able to figure out personalized relationships faster. A lot of what's going on under the hood as transformers run is comparisons between tokens and building token relationships, so that might be what's happening. That's speculation. I'll be curious to learn more about what they're doing with their genomic foundation model.

[2:22] Number three: MiniMax, a Chinese model maker, introduced their MiniMax model. It has a 4‑million‑token context window, so it's quite large, and it has perfect recall. This is now the second long‑context‑window, perfect‑recall feature that I have seen in the last two days, and there's a third one that I'm going to mention now: ChatGPT, in beta for some users, is rolling out advanced memory. It's not clear if advanced memory means it remembers better or remembers more, but it's happening. When I look at these different bullet points, the MiniMax model that came out, the work that's been done on Titans at Google, and now this release by ChatGPT, I think we're seeing the dotted lines toward one of the themes for 2025, which is solving the memory problem. I would expect a lot more releases along those lines in the next couple of months.

[3:31] Number four: you probably know this, but I think it's worth calling out because it's another actual launch at scale. Reddit has gotten LLM search to be good enough that they feel good about launching Reddit Answers, an AI‑powered search tool that provides curated, conversational insights from Reddit's discussions. What's interesting to me is that Reddit chose to take their time with this one. They could have been like Google: Google rushed their AI summaries to market and got panned for it. Reddit took their time, and it makes me wonder if Reddit Answers is actually going to be a higher‑quality experience for users versus Google, because they took the time on quality. I'd be curious for your thoughts there.

[4:18] Last but not least, we have what I would call the rumor mill. There are three things rumored to be shipped by OpenAI by January 30th. One is the o3 model, which they've announced but not released. One is the ever‑elusive Project Orion; no one really knows what that is. Some people think it's GPT‑5, which would be a new pre‑trained class of language model bigger than GPT‑4, but that's speculation at this point. The third thing people think they're going to release is something further along on agents: not just scheduled tasks, but an entire class of agents that we think they will call Operators. It's speculation, we're not sure of the date, it's rumor; that's why I stuck it at the end, but it's worth keeping an eye on as we get close to the end of January. All right, that's what we've got for AI news. Let me know what you think. Cheers.
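The transformer intuition mentioned in the genomics segment, that the model runs comparisons between tokens and builds token relationships, can be sketched as scaled dot‑product attention. This is a minimal illustration of that mechanism in NumPy, not anything from Mayo's actual pipeline:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compare every query token against every key token, then mix values.

    Q, K, V: (num_tokens, dim) arrays. The score matrix holds one
    pairwise "relationship" per token pair, which is the comparison
    the transcript alludes to.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                       # pairwise token comparisons
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over each row
    return weights @ V                                  # weighted mix of value vectors

# Toy example: 4 tokens, 8-dimensional embeddings, self-attention
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Every token attends to every other token, which is also why context length is expensive: the score matrix grows quadratically with the number of tokens.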
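"Perfect recall" claims for long‑context models are typically checked with a needle‑in‑a‑haystack test: bury a unique fact at varying depths inside a long prompt and measure how often the model retrieves it. Here is a runnable sketch of such a harness; `toy_model_answer` is a hypothetical stand‑in for a real LLM API call, so this only illustrates the harness shape, not any particular model's behavior:

```python
def build_haystack(needle: str, total_words: int, depth: float) -> str:
    """Bury `needle` at relative `depth` (0.0 = start, 1.0 = end) in filler text."""
    filler = ["lorem"] * total_words
    pos = int(depth * total_words)
    return " ".join(filler[:pos] + [needle] + filler[pos:])

def toy_model_answer(prompt: str, key: str) -> str:
    """Hypothetical stand-in for an LLM call: a real harness would send the
    prompt plus a retrieval question to the model's API. Here we just scan
    the prompt for a word starting with `key` so the sketch runs end to end."""
    for word in prompt.split():
        if word.startswith(key):
            return word
    return ""

def recall_score(needle: str, depths, total_words: int = 5000) -> float:
    """Fraction of depths at which the needle was retrieved."""
    hits = 0
    for d in depths:
        prompt = build_haystack(needle, total_words, d)
        if toy_model_answer(prompt, "NEEDLE") == needle:
            hits += 1
    return hits / len(depths)

score = recall_score("NEEDLE-7421", depths=[0.0, 0.25, 0.5, 0.75, 1.0])
print(score)  # 1.0 for this toy scanner; real models often degrade with depth
```

A real evaluation sweeps both context length and depth, which is how "perfect recall at 4 million tokens" claims are usually substantiated.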