689: Observing LLMs in Production to Automatically Catch Issues
Episode Synopsis
Arize's Amber Roberts and Xander Song join Jon Krohn this week, sharing invaluable insights into ML observability, drift detection, retraining strategies, and the crucial task of ensuring fairness and ethical considerations in AI development.

This episode is brought to you by Posit, the open-source data science company, by AWS Inferentia, and by Anaconda, the world's most popular Python distribution. Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.

In this episode you will learn:
• What is ML observability [05:07]
• What is drift [08:18]
• The different kinds of model drift [15:31]
• How frequently production models should be retrained [25:15]
• Arize's open-source product, Phoenix [30:49]
• How ML observability relates to discovering model biases [50:30]
• Arize case studies [57:13]
• What is a developer advocate [1:04:51]

Additional materials: www.superdatascience.com/689
More episodes of the podcast Super Data Science: ML & AI Podcast with Jon Krohn
953: Beyond “Agent Washing”: AI Systems That Actually Deliver ROI, with Dell’s Global CTO John Roese
30/12/2025
952: How to Avoid Burnout and Get Promoted, with “The Fit Data Scientist” Penelope Lafeuille
26/12/2025
948: In Case You Missed It in November 2025
12/12/2025
946: How Robotaxis Are Transforming Cities
05/12/2025
945: AI is a Joke, with Joel Beasley
02/12/2025