Listen "Generation AI Podcast Episode #3 Tri Dao"
Episode Synopsis
Tri Dao is known for his groundbreaking work on Flash Attention at Stanford, which enables a fast and efficient implementation of the attention mechanism in Transformers and opens up much longer sequence lengths in models such as GPT-4 and Anthropic's Claude, as well as in image, video, and audio models.
We sat down with Tri Dao to discuss the impact of his pioneering work on software/hardware co-design and some of the new innovations coming in the world of Transformers and generative AI.