A decision tree (DT) is a class of machine learning algorithms used for tasks such as classification and regression. It is also widely applied in data mining and business intelligence. Several decision tree algorithms exist, the best known being ID3, C4.5, C5.0, and CART.
The word "tree" refers to the graphical form this decision-making tool takes: it is represented as a connected, undirected graph with no cycles. The technique belongs to supervised learning, since the tree is built from existing labelled data.
Examples
- Regression decision trees are used to predict a quantitative value, for example a tree that predicts house prices.
- Classification decision trees predict the class to which the output variable belongs. Both cases are illustrated in the short sketch below.
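To make the two cases concrete, here is a minimal sketch using scikit-learn; the tiny house-price and room-count datasets are invented purely for illustration and are not part of the original example.

```python
# Minimal sketch: the two flavours of decision tree in scikit-learn.
from sklearn.tree import DecisionTreeRegressor, DecisionTreeClassifier

# Regression tree: predict a quantitative value (e.g. a house price).
# Toy data (made up): surface area in m^2 -> price.
X_houses = [[50], [80], [120], [200]]
y_prices = [100_000, 150_000, 220_000, 400_000]
reg = DecisionTreeRegressor(max_depth=2).fit(X_houses, y_prices)
print(reg.predict([[100]]))      # estimated price for a 100 m^2 house

# Classification tree: predict the class an observation belongs to.
# Toy data (made up): number of rooms -> "small" or "large".
X_rooms = [[1], [2], [4], [5]]
y_labels = ["small", "small", "large", "large"]
clf = DecisionTreeClassifier().fit(X_rooms, y_labels)
print(clf.predict([[4]]))        # -> ['large']
```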
Decision Tree Terminology
- Root node: The node where the decision tree begins; it is then split into more homogeneous sub-nodes. To optimize the tree, this node must be chosen carefully.
- Leaf node: The terminal nodes of the tree; once a leaf node is reached, no further splitting takes place. In classification, the leaf nodes represent the classes.
- Child node: Any node produced by splitting another node, i.e. every node other than the root.
- Splitting: The process of dividing a node into sub-nodes based on a given condition.
- Branch: A sub-tree that is part of the whole tree.
Why do we use a decision tree algorithm?
Decision trees are popular because they mimic human decision-making, are easy to visualize and interpret, and can handle both numerical and categorical data.
Example:
The first step in creating a decision tree is to select the most informative feature to serve as the root node, typically the one whose split produces the purest sub-nodes; a small sketch of this selection follows.
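As an illustration of that first step, the sketch below scores two candidate categorical features by how much splitting on them reduces Gini impurity (the criterion used by CART) and picks the better one as the root. The toy weather dataset and the feature names "outlook" and "windy" are our own assumptions, not taken from the article.

```python
# Illustrative sketch: choosing the root node as the feature with the
# largest reduction in Gini impurity.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    total = len(labels)
    return 1.0 - sum((count / total) ** 2 for count in Counter(labels).values())

def gini_gain(rows, labels, feature_index):
    """Impurity reduction obtained by splitting on one categorical feature."""
    parent = gini(labels)
    weighted_children = 0.0
    for value in {row[feature_index] for row in rows}:
        child = [lab for row, lab in zip(rows, labels) if row[feature_index] == value]
        weighted_children += len(child) / len(labels) * gini(child)
    return parent - weighted_children

# Toy data (hypothetical): [outlook, windy] -> play?
rows = [["sunny", "yes"], ["sunny", "no"], ["rain", "yes"], ["rain", "no"]]
labels = ["no", "yes", "no", "yes"]

features = {"outlook": 0, "windy": 1}
root = max(features, key=lambda name: gini_gain(rows, labels, features[name]))
print(root)  # "windy": splitting on it yields perfectly pure sub-nodes
```

ID3 and C4.5 use information gain and gain ratio instead of Gini impurity, but the selection logic is the same: the feature that produces the purest sub-nodes becomes the root.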
CODE: Decision Tree Implementation Using Python
We'll use the scikit-learn (sklearn) package in this example.
from sklearn import tree
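The article stops at the import, so the rest of the example below is a sketch under our own assumptions: we train a DecisionTreeClassifier on scikit-learn's built-in iris dataset, report its accuracy on a held-out split, and print the learned rules.

```python
# End-to-end sketch: train and inspect a classification tree on the iris data.
from sklearn import tree
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Gini impurity and a depth limit of 3 are arbitrary but reasonable choices here.
clf = tree.DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=42)
clf.fit(X_train, y_train)

print("Accuracy:", accuracy_score(y_test, clf.predict(X_test)))
print(tree.export_text(clf))  # textual view of the learned splits
```

tree.export_text gives a quick textual view of the splits; tree.plot_tree can be used instead to draw the tree with matplotlib.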
Conclusion
A decision tree is a predictive model used in fields such as data mining, machine learning, and statistics. It is a flowchart in tree form consisting of a root node, internal nodes, branches, and leaf nodes (the final output).