Bagging

Machine Learning Bytes

24/07/2019 12:00PM

Episode Synopsis "Bagging"

Bagging is an ensemble meta-algorithm. Basically, we take some number of estimators (usually a few dozen) and train each one on a random sample of the training data, typically a bootstrap sample drawn with replacement. Then we average the predictions of the individual estimators to produce the final prediction. While this reduces the variance of your predictions (indeed, that is the core purpose of bagging), it may come at the cost of some added bias. For a more academic treatment, see slide #13 of this lecture by Joëlle Pineau at McGill University.
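As a rough illustration of the idea described above, here is a minimal from-scratch sketch of bagging for regression. It assumes scikit-learn decision trees as the high-variance base estimators (any base learner would do), and the function names `fit_bagged_ensemble` and `predict_bagged` are purely illustrative, not part of any library.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_bagged_ensemble(X, y, n_estimators=50, random_state=0):
    """Train each estimator on a bootstrap sample of the training data."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    estimators = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)  # sample indices with replacement
        tree = DecisionTreeRegressor().fit(X[idx], y[idx])
        estimators.append(tree)
    return estimators

def predict_bagged(estimators, X):
    """Average the individual predictions to get the ensemble prediction."""
    return np.mean([est.predict(X) for est in estimators], axis=0)

# Example usage on synthetic data
from sklearn.datasets import make_regression
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
ensemble = fit_bagged_ensemble(X, y)
print(predict_bagged(ensemble, X[:5]))
```

Averaging many trees fit to different bootstrap samples smooths out the individual trees' overfitting, which is where the variance reduction comes from; libraries such as scikit-learn package the same idea as ready-made bagging estimators.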

Listen "Bagging"

More episodes of the podcast Machine Learning Bytes