Listen "Cuttlefish Model Tuning"
Episode Synopsis
Hongyi Wang, a Senior Researcher in the Machine Learning Department at Carnegie Mellon University, joins us. His research lies at the intersection of systems and machine learning. On today's show he discussed his research paper, Cuttlefish: Low-Rank Model Training without All the Tuning. Hongyi started by sharing his thoughts on whether developers need to learn how to fine-tune models. He then spoke about the need to optimize the training of ML models, especially as these models grow larger. He discussed how data centers have the hardware to train these large models, while the broader community does not. He then spoke about the Low-Rank Adaptation (LoRA) technique and where it is used. Hongyi discussed the Cuttlefish model and how it improves on LoRA. He shared the use cases of Cuttlefish and who should use it. Rounding out the episode, he gave his advice on how people can get into the machine learning field and shared his future research ideas.
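To make the low-rank idea behind LoRA and Cuttlefish concrete, here is a minimal Python sketch of how factorizing a weight matrix into two thin matrices cuts the number of trainable parameters. The dimensions, rank, and variable names are illustrative assumptions, not details taken from the paper or the episode.

import numpy as np

# Illustrative sizes only (hypothetical, not from the paper or episode).
d_out, d_in, rank = 1024, 1024, 8

# A full dense weight matrix has d_out * d_in trainable parameters.
full_params = d_out * d_in

# A low-rank factorization W ~= U @ V replaces it with two thin matrices.
U = np.random.randn(d_out, rank) * 0.01   # shape (d_out, r)
V = np.random.randn(rank, d_in) * 0.01    # shape (r, d_in)
low_rank_params = U.size + V.size

print(f"full:     {full_params:,} parameters")
print(f"low-rank: {low_rank_params:,} parameters "
      f"({low_rank_params / full_params:.1%} of the full matrix)")

# Applying the factorized layer to a batch of inputs x:
x = np.random.randn(32, d_in)
y = x @ (U @ V).T   # same shape of output as x @ W.T with W = U @ V

With these example numbers the factorized layer uses roughly 1.6% of the parameters of the dense one; Cuttlefish's contribution, as discussed in the episode, is choosing factorization settings such as the rank automatically rather than by manual tuning.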
More episodes of the podcast Data Skeptic
Video Recommendations in Industry
26/12/2025
Eye Tracking in Recommender Systems
18/12/2025
Cracking the Cold Start Problem
08/12/2025
Shilling Attacks on Recommender Systems
05/11/2025
Music Playlist Recommendations
29/10/2025
Bypassing the Popularity Bias
15/10/2025
Sustainable Recommender Systems for Tourism
09/10/2025
Interpretable Real Estate Recommendations
22/09/2025