Listen "[MINI] Bias Variance Tradeoff"
Episode Synopsis
A discussion of the expected number of cars at a stoplight frames today's look at the bias-variance tradeoff. The central idea of this concept relates to model complexity. A very simple model will likely generalize well from training to testing data, but will have high bias, since its simplicity can prevent it from capturing the true relationship between the covariates and the output. As a model grows more complex, it can capture more of the underlying structure in the data, but the risk increases that it overfits the training data and therefore fails to generalize (high variance). The tradeoff between minimizing bias and minimizing variance is an ongoing challenge for data scientists, and an important discussion for skeptics weighing how much we should trust models.
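The tradeoff can be made concrete with a small simulation. The sketch below is illustrative only and not from the episode: it assumes a toy sine-wave target and uses numpy's polyfit as the model family, fitting polynomials of increasing degree to many noisy resamples of the same data and estimating squared bias and variance at each complexity level.

```python
# A minimal sketch of the bias-variance tradeoff, assuming a toy
# sine-wave ground truth (an assumption for illustration).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
true_f = np.sin(2 * np.pi * x)   # the (normally unknown) target function
n_trials, noise_sd = 200, 0.3

for degree in (1, 3, 9):
    preds = np.empty((n_trials, x.size))
    for t in range(n_trials):
        y = true_f + rng.normal(0, noise_sd, x.size)  # fresh noisy training set
        coeffs = np.polyfit(x, y, degree)             # fit a degree-d polynomial
        preds[t] = np.polyval(coeffs, x)
    # Squared bias: how far the average fit sits from the truth.
    bias_sq = np.mean((preds.mean(axis=0) - true_f) ** 2)
    # Variance: how much the fit wobbles across training sets.
    variance = np.mean(preds.var(axis=0))
    print(f"degree {degree}: bias^2 = {bias_sq:.4f}, variance = {variance:.4f}")
```

In a run of this toy setup, the degree-1 fit typically shows large squared bias with small variance, while the degree-9 fit reverses the pattern, which is exactly the tension the episode describes.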
More episodes of the podcast Data Skeptic
Video Recommendations in Industry (26/12/2025)
Eye Tracking in Recommender Systems (18/12/2025)
Cracking the Cold Start Problem (08/12/2025)
Shilling Attacks on Recommender Systems (05/11/2025)
Music Playlist Recommendations (29/10/2025)
Bypassing the Popularity Bias (15/10/2025)
Sustainable Recommender Systems for Tourism (09/10/2025)
Interpretable Real Estate Recommendations (22/09/2025)