Nvidia Talks With Biden ☎️ // Beyond Transformers 🚀 // Federated Billion-Sized LLMs 💻

13/12/2023 · 14 min


Episode Synopsis

Nvidia is in talks with the Biden administration about permissible sales of AI chips to China, while users test the "winter break hypothesis" as an explanation for ChatGPT's seemingly "lazy" responses.
The open-source StripedHyena-7B models offer a glimpse beyond the Transformer architecture, with improved training and inference performance.
The papers explore efficient quantization strategies for Latent Diffusion Models, the push for transparency and collaboration in the development of LLMs, and a novel approach for federated full-parameter tuning of billion-sized LLMs.
The episode covers a range of AI topics, from industry news to cutting-edge research, and offers insights into the challenges and potential solutions in the field.
Contact:  [email protected]
Timestamps:
00:34 Introduction
01:25 Nvidia in talks with Biden administration about AI chip sales to China, US commerce chief Gina Raimondo says
02:50 As ChatGPT gets “lazy,” people test “winter break hypothesis” as the cause
05:04 Paving the way to efficient architectures: StripedHyena-7B, open source models offering a glimpse into a world beyond Transformers
06:20 Fake sponsor
08:07 Efficient Quantization Strategies for Latent Diffusion Models
09:42 LLM360: Towards Fully Transparent Open-Source LLMs
11:25 Federated Full-Parameter Tuning of Billion-Sized Language Models with Communication Cost under 18 Kilobytes
13:20 Outro

More episodes of the podcast GPT Reviews