TabPFN: A Revolution in AutoML?
Episode Synopsis
Today we’re talking to Noah Hollmann and Samuel Muller about their paper on TabPFN, an incredible spin on AutoML based on Bayesian inference and transformers.

[Quick note on audio quality]: Some of the tracks did not record perfectly, but I felt the content was too important not to release. Sorry for any ear-strain!

In the episode, we spend some time discussing posterior predictive probabilities before getting into how exactly they pre-fitted their network, how they got their training data, what the network looks like, and how the system performs.

To give you a taste: on datasets of up to 1,000 training instances and 100 features, it takes less than a second to train a classifier and make predictions!

Read their paper here: https://arxiv.org/pdf/2207.01848.pdf
Follow Samuel on Twitter: https://twitter.com/SamuelMullr
Follow Noah on Twitter: https://twitter.com/noahholl
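For listeners who want a concrete handle on the posterior predictive probabilities discussed in the episode, here is a minimal illustrative sketch (not from the paper): the Beta-Bernoulli model, the textbook case of the quantity a prior-fitted network like TabPFN learns to approximate for tabular data.

```python
# Illustrative sketch, not the TabPFN implementation: the posterior
# predictive probability under a Beta-Bernoulli model. With a
# Beta(a, b) prior and k successes in n Bernoulli trials, the posterior
# is Beta(a + k, b + n - k), and the probability that the *next* trial
# succeeds, averaged over the posterior, has a simple closed form.
def posterior_predictive_success(a: float, b: float, k: int, n: int) -> float:
    """P(next trial succeeds | data) for a Beta(a, b) prior."""
    return (a + k) / (a + b + n)

# Uniform Beta(1, 1) prior, 7 successes observed in 10 trials:
p = posterior_predictive_success(1, 1, 7, 10)
print(p)  # (1 + 7) / (1 + 1 + 10) = 8/12
```

TabPFN outputs an approximation of this kind of quantity in a single forward pass, but over a far richer prior on tabular datasets rather than a single coin.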
More episodes of the podcast The AutoML Podcast
Nyckel - Building an AutoML Startup
07/03/2025
X Hacking: The Threat of Misguided AutoML
27/05/2024
Introduction To New Co-Host, Theresa Eimer
26/05/2024
AutoGluon: The Story
05/09/2023