
Gradient Boosting and XGBoost

In this class, you will learn to use the XGBoost library, which efficiently implements gradient boosting algorithms.

This Practice Course is composed of 3 parts; each part is meant to take about 1 hour:

* In the first notebook, you will learn the basics of XGBoost: how to apply it to a dataset and tune it to obtain the best performance (a short sketch of this workflow follows this list).
* In the second notebook, we will focus on ensemble methods and explain what makes XGBoost different from other models.
* Finally, in the last notebook, you will see how the choice of hyperparameters is a key element of the tradeoff between bias and variance.
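
As a taste of what the first notebook covers, here is a minimal sketch of a scikit-learn-style XGBoost workflow. The dataset and hyperparameter values below are illustrative placeholders, not the ones used in the notebooks.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Load a small example dataset and hold out a test split.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# A few commonly tuned hyperparameters (illustrative values only).
model = XGBClassifier(
    n_estimators=200,    # number of boosted trees
    max_depth=3,         # depth of each tree, controls model complexity
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    reg_lambda=1.0,      # L2 regularization, one knob in the bias/variance tradeoff
)
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```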

Notebook 1: Introduction to XGBoost

Notebook 2: XGBoost and ensemble models

Notebook 3: Regularization

References

XGBoost