Do you remember climbing trees in your childhood? According to researchers at the University of North Florida, **climbing a tree can dramatically improve cognitive skills, including memory**. Climbing trees can help children become more flexible in body and mind, while also developing strong spatial reasoning skills.

As it turns out, “trees” can also help machines learn. As illustrated below, **decision trees** are algorithms that use a tree-like system of conditional control statements to build a machine learning model; hence the name.

The decision tree algorithm is one of the most popular machine learning algorithms. It is a supervised learning algorithm used for both classification and regression tasks. The model applies a learned set of rules to classify an input or predict a value.
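As a minimal sketch of both uses, assuming scikit-learn is installed (the toy data below is made up for illustration):

```python
# Minimal sketch: the same tree idea serves both classification
# (predicting a class label) and regression (predicting a number).
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X = [[0.0], [1.0], [2.0], [3.0]]                 # one numeric feature

clf = DecisionTreeClassifier().fit(X, [0, 0, 1, 1])          # class labels
reg = DecisionTreeRegressor().fit(X, [0.1, 0.2, 0.8, 0.9])   # real values

print(clf.predict([[2.5]]))  # -> [1]: falls on the "large x" side of the split
print(reg.predict([[2.5]]))  # a real-valued prediction from the nearest leaf
```

The only difference from the user's point of view is the target column: class labels for classification, continuous values for regression.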

A decision tree is essentially a nested if-else classifier. Each split defines an axis-parallel hyperplane, and together these hyperplanes partition the feature space into hyper-rectangular regions, each of which is assigned a prediction.
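Written out by hand, a small tree is literally nested if-else statements. The feature names and thresholds below are invented for illustration; each comparison is one axis-parallel split of a 2-D feature space:

```python
# A tiny decision tree written as nested if-else statements.
# Thresholds are hypothetical; each comparison slices the feature
# space with an axis-parallel boundary, yielding rectangular regions.
def classify(humidity: float, wind_kmh: float) -> str:
    if humidity <= 75.0:           # first axis-parallel split
        if wind_kmh <= 20.0:       # second split, within the left region
            return "Play"
        return "Don't Play"
    return "Don't Play"

print(classify(70.0, 10.0))   # -> Play (humidity <= 75 and wind <= 20)
print(classify(80.0, 5.0))    # -> Don't Play (humidity > 75)
```

Training a tree amounts to choosing these thresholds automatically from data rather than by hand.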

The decision tree builds classification or regression models in the form of a tree structure. It breaks the dataset into smaller and smaller subsets as the depth of the tree increases. The final result is a tree with **decision nodes** and **leaf nodes**. A decision node (e.g., Outlook) has two or more branches (e.g., Sunny, Overcast, and Rainy). A leaf node (e.g., Play) represents a classification or decision. The topmost decision node, which corresponds to the best predictor, is called the **root node**. Decision trees can handle both categorical and numerical data.
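To make the Outlook/Play example concrete, here is a sketch that fits a tree on a tiny, made-up slice of the classic play-tennis data, assuming scikit-learn and pandas are available. The categorical Outlook feature is one-hot encoded so the tree can split on it:

```python
# Hypothetical mini version of the play-tennis dataset: Outlook is a
# decision node, Play is the leaf label. Assumes scikit-learn + pandas.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "Outlook": ["Sunny", "Sunny", "Overcast", "Rainy", "Rainy", "Overcast"],
    "Windy":   [False,   True,    False,      False,   True,    True],
    "Play":    ["No",    "No",    "Yes",      "Yes",   "No",    "Yes"],
})

X = pd.get_dummies(data[["Outlook", "Windy"]])   # categorical -> numeric columns
y = data["Play"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # text view of the tree
```

The printed tree shows the root node (the best predictor chosen first), the branches under each decision node, and the Yes/No leaves, mirroring the structure described above.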

#data-science #regression #mathematics #data-analysis
