Listen "Explainable AI that is accessible for all humans"
Episode Synopsis
We are seeing an explosion of AI apps that are, at their core, a thin UI on top of calls to OpenAI generative models. What risks are associated with this approach to AI integration, and can explainability and accountability be achieved in chat-based assistants? Beth Rudden of Bast.ai has been thinking about this topic for some time and has developed an ontological approach to creating conversational AI. We hear more about that approach and related work in this episode.

Join the discussion

Changelog++ members support our work, get closer to the metal, and make the ads disappear. Join today!

Sponsors:
Fly.io – The home of Changelog.com. Deploy your apps and databases close to your users. In minutes you can run your Ruby, Go, Node, Deno, Python, or Elixir app (and databases!) all over the world. No ops required. Learn more at fly.io/changelog and check out the speedrun in their docs.
Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform. Learn more at fastly.com
Typesense – Lightning-fast, globally distributed Search-as-a-Service that runs in memory. You literally can't get any faster!

Featuring:
Beth Rudden – LinkedIn, X
Chris Benson – Website, GitHub, LinkedIn, X
Daniel Whitenack – Website, GitHub, X

Show Notes:
Bast.ai

Something missing or broken? PRs welcome!
More episodes of the podcast Practical AI
The AI engineer skills gap
10/12/2025
Technical advances in document understanding
02/12/2025
Beyond note-taking with Fireflies
19/11/2025
Autonomous Vehicle Research at Waymo
13/11/2025
Are we in an AI bubble?
10/11/2025
While loops with tool calls
30/10/2025
Tiny Recursive Networks
24/10/2025
Dealing with increasingly complicated agents
16/10/2025