What is Boosting?

Boosting is a very popular ensemble technique in which we combine many weak learners into a single strong learner. Boosting is a sequential process: the weak learners are built in series, each one depending on the one before it, i.e. weak learner m depends on the output of weak learner m-1. The weak learners used in boosting have high bias and low variance. In a nutshell, boosting can be summarized as boosting = weak learners + additive combining.
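
To make the additive idea concrete, here is a minimal sketch (not from the original article) that uses depth-1 decision trees from scikit-learn as the weak learners. Each new learner is fit on the residuals left by the combined previous learners, so learner m depends directly on learners 1 through m-1; the tiny dataset is made up purely for illustration.

```python
# A minimal sketch of boosting as "weak learners + additive combining".
# Assumes scikit-learn and NumPy; the dataset is made up for illustration.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

n_learners = 5                   # number of weak learners built in sequence
prediction = np.zeros_like(y)    # start from an all-zero model
weak_learners = []

for m in range(n_learners):
    residual = y - prediction                   # what the previous learners got wrong
    stump = DecisionTreeRegressor(max_depth=1)  # high-bias, low-variance weak learner
    stump.fit(X, residual)                      # learner m depends on learners 1..m-1 via the residual
    prediction += stump.predict(X)              # additive combining
    weak_learners.append(stump)

print(prediction)  # moves closer to y as more weak learners are added
```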

Gradient Boosting Algorithm

Gradient boosting is a machine learning technique for regression and classification problems, which produces a prediction model in the form of an ensemble of weak prediction models, typically decision trees (Wikipedia definition).
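
As a quick point of reference before the from-scratch walkthrough, scikit-learn ships this algorithm as GradientBoostingRegressor (and GradientBoostingClassifier for classification). The toy data and hyperparameter values below are illustrative assumptions, not taken from the article.

```python
# Illustrative use of scikit-learn's built-in gradient boosting (regression case).
# The data and hyperparameter values are placeholders for demonstration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

model = GradientBoostingRegressor(
    n_estimators=100,   # number of sequential weak learners (trees)
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    max_depth=3,        # keeps each tree a weak (high-bias) learner
)
model.fit(X, y)
print(model.predict([[2.5]]))
```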

Algorithm steps:

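For squared-error loss, the standard formulation of the algorithm (following Friedman) goes roughly as follows:

1. Initialize the model with a constant prediction, F_0(x) = mean(y).
2. For m = 1 to M:
   a. Compute the pseudo-residuals r_i = y_i - F_{m-1}(x_i), i.e. the negative gradient of the loss at the current model.
   b. Fit a weak learner h_m (a shallow decision tree) to the pseudo-residuals.
   c. Update the model additively: F_m(x) = F_{m-1}(x) + ν · h_m(x), where ν is the learning rate.
3. Output the final model F_M(x).

The sketch below implements these steps from scratch on a small made-up dataset; the data, tree depth, learning rate, and number of rounds are illustrative assumptions rather than values from the original article.

```python
# A from-scratch sketch of gradient boosting regression with squared-error loss.
# Small dataset and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1.1, 1.8, 3.0, 3.9, 5.2, 5.8])

M = 50      # number of boosting rounds
nu = 0.1    # learning rate (shrinkage)

# Step 1: initialize with a constant model, the mean of y for squared-error loss.
F = np.full_like(y, y.mean())
trees = []

for m in range(M):
    # Step 2a: pseudo-residuals = negative gradient of the squared-error loss.
    residuals = y - F
    # Step 2b: fit a weak learner (shallow tree) to the pseudo-residuals.
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)
    # Step 2c: additive update with shrinkage.
    F = F + nu * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Step 3: the final model is the initial constant plus all scaled trees."""
    pred = np.full(len(X_new), y.mean())
    for tree in trees:
        pred += nu * tree.predict(X_new)
    return pred

print(predict(np.array([[2.5], [4.5]])))
```

With squared-error loss the negative gradient is exactly the residual, which is why each round simply fits a new tree to whatever the current model still gets wrong.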

