Episode 40: DeepSeek facts vs hype, model distillation, and open source competition
Episode Synopsis
Let’s bust some early myths about DeepSeek. In episode 40 of Mixture of Experts, host Tim Hwang is joined by experts Aaron Baughman, Chris Hay and Kate Soule. Last week, we covered the release of DeepSeek-R1; now that the entire world is up to speed, let’s separate the facts from the hype. Next, what is model distillation and why does it matter for competition in AI? Finally, Sam Altman, among other tech CEOs, shared his response to DeepSeek. Will R1 radically change the open-source strategy of other tech giants? Find out all this and more on Mixture of Experts.

00:01 – Intro
00:41 – DeepSeek facts vs hype
21:00 – Model distillation
31:21 – Open source and OpenAI

The opinions expressed in this podcast are solely those of the participants and do not necessarily reflect the views of IBM or any other organization or entity.