"Transformers Mini Series: How do Transformers Process Text?"
Episode Synopsis
In this episode of Generative AI 101, we explore how Transformers break down text into tokens. Imagine turning a big, colorful pile of Lego blocks into individual pieces to build something cool—this is what tokenization does for AI models. Emily explains what tokens are and how they work, and shows why they're the magic behind GenAI's impressive outputs. Learn how Transformers assign numerical values to tokens and process them in parallel, allowing them to understand context, detect patterns, and generate coherent text. Tune in to discover why tokenization is important for tasks like language translation and text summarization.
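The idea from the synopsis—splitting text into tokens and mapping each one to a numerical ID—can be sketched in a few lines of Python. The vocabulary and the whitespace splitting here are purely illustrative assumptions; real Transformer tokenizers use learned subword vocabularies:

```python
# Toy illustration of tokenization: split text into pieces and map
# each piece to a numerical ID, as described in the episode.
# The vocabulary below is hypothetical, not from any real model.

def tokenize(text, vocab):
    """Split on whitespace and look up each token's ID; unknown tokens get 0."""
    return [vocab.get(token, 0) for token in text.lower().split()]

vocab = {"transformers": 1, "process": 2, "text": 3, "in": 4, "parallel": 5}

ids = tokenize("Transformers process text in parallel", vocab)
print(ids)  # [1, 2, 3, 4, 5]
```

Once every token is a number, the model can feed all of them through its layers at once—this is the parallel processing the episode refers to, in contrast to older models that read one token at a time.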
Connect with Emily Laird on LinkedIn
More episodes of the podcast Generative AI 101
Groq Star: Who is Jonathan Ross?
14/01/2026
How Nvidia Took Over the AI Game
12/01/2026
AI Workslop: When AI Takes Over the Office
16/12/2025
ChatGPT Turns 3: Rise of the Prompt People
09/12/2025
ChatGPT Turns 3: The Origin Story
08/12/2025
World Models vs LLMs
19/11/2025