XGBoost, or extreme gradient boosting, is a well-known ensemble gradient boosting technique that improves the performance and speed of tree-based (sequential decision tree) machine learning algorithms. XGBoost was created by Tianqi Chen and was initially maintained by the Distributed (Deep) Machine Learning Community (DMLC) group. It is one of the most common algorithms used in applied machine learning competitions and has gained popularity through winning solutions on structured and tabular data. It is open-source software. Earlier, only Python and R packages were built for XGBoost, but it has since been extended to Java, Scala, Julia and other languages as well.
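
To make the Python package mentioned above concrete, here is a minimal sketch of training an XGBoost classifier; the dataset and hyperparameter values are illustrative choices, not from this article.

```python
# Minimal sketch: training an XGBoost classifier with the Python package.
# Dataset and hyperparameters are illustrative, not prescribed by the article.
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Sequentially boosted decision trees: each new tree corrects the previous ones.
model = xgb.XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, preds))
```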

In this article, I’ll be discussing how XGBoost works internally to build decision trees and make predictions.

To understand XGBoost, you first need a clear understanding of decision trees and ensemble learning algorithms, as illustrated in the sketch below.
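
To ground that prerequisite, here is a small illustrative sketch of the boosting idea using plain scikit-learn decision trees fit on residuals. It is not XGBoost's actual internals, just the ensemble-of-sequential-trees concept the rest of the article builds on.

```python
# Illustrative sketch (not XGBoost internals): boosting as sequential decision
# trees, where each tree is fit to the residual errors of the ensemble so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
prediction = np.zeros_like(y)   # start from a constant (zero) prediction
trees = []

for _ in range(50):
    residual = y - prediction                      # what the ensemble still gets wrong
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    prediction += learning_rate * tree.predict(X)  # add a small correction
    trees.append(tree)

print("mean squared error:", np.mean((y - prediction) ** 2))
```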
