
e-ISSN: 2582-5208 International Research Journal of Modernization in Engineering Technology and Science

Volume:03/Issue:03/March-2021 Impact Factor- 5.354 www.irjmets.com


Figure 4: K-Nearest Neighbor

The confusion matrix of the dataset obtained from the K-Nearest Neighbors classifier is shown below, which provides all the necessary measures for this choice:
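A confusion matrix like the one in Figure 4 can be produced with scikit-learn. This is a minimal sketch, not the paper's actual pipeline: the breast-cancer dataset, split ratio, and k=5 are illustrative stand-ins for the authors' dataset and settings.

```python
# Hypothetical sketch of a KNN confusion matrix (dataset and parameters
# are illustrative, not those used in the paper).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

knn = KNeighborsClassifier(n_neighbors=5)  # k = 5 neighbors (assumed)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

# Rows are true labels, columns are predicted labels.
cm = confusion_matrix(y_test, y_pred)
print(cm)
print("accuracy:", accuracy_score(y_test, y_pred))
```

From the matrix, measures such as precision, recall, and accuracy can be read off directly, which is what the figure summarizes.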

5.0 Extra Trees Classifier

The Extra Trees Classifier is another ensemble learning technique that combines the results of multiple de-correlated decision trees, collected in a forest, to output its prediction. It is very similar to the Random Forest algorithm, differing only in how the decision trees in the forest are constructed. Each tree in the Extra Trees forest is built from the original training instances rather than a bootstrap sample. Then, at each test node, each decision tree is provided with a random subset of k features from the feature set, and it must select the best of these features to split the data according to some criterion (e.g. the Gini index). This random sampling of features leads to the creation of multiple de-correlated decision trees.
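The construction described above maps directly onto scikit-learn's `ExtraTreesClassifier`. The sketch below is illustrative, assuming a generic labelled dataset rather than the paper's data; `max_features="sqrt"` plays the role of the random subset of k features considered at each split.

```python
# Illustrative sketch of an Extra Trees ensemble (dataset is a stand-in).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import ExtraTreesClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# bootstrap=False (the default): each tree is grown from the full
# training set, unlike Random Forest's bootstrap samples.
# max_features="sqrt": a random subset of k features is considered
# at each split, which de-correlates the trees.
et = ExtraTreesClassifier(
    n_estimators=100, max_features="sqrt", bootstrap=False, random_state=0
)
et.fit(X_train, y_train)
print("test accuracy:", et.score(X_test, y_test))
```

Swapping `ExtraTreesClassifier` for `RandomForestClassifier` in this snippet is a quick way to compare the two constructions on the same split.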


The confusion matrix of the dataset obtained from the Extra Trees Classifier is shown below, which provides all the necessary measures for this choice:

6.0 AdaBoost Classifier

AdaBoost, short for Adaptive Boosting, is an ensemble boosting classifier. It is an iterative ensemble method that builds a robust, high-accuracy classifier by combining multiple poorly performing (weak) classifiers. The basic concept behind AdaBoost is to set a weight for each classifier and to re-weight the training samples in each iteration, so that subsequent classifiers concentrate on the unusual or previously misclassified observations.
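The iterative re-weighting scheme above can be sketched with scikit-learn's `AdaBoostClassifier`. The dataset and number of boosting rounds here are illustrative assumptions, not the paper's configuration; the default base learner is a depth-1 decision stump.

```python
# Illustrative AdaBoost sketch (dataset and settings are assumptions).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Each of the 50 boosting rounds fits a weak learner (a decision stump
# by default), then increases the weights of misclassified samples so
# the next round focuses on the hard observations. The final prediction
# is a weighted vote of all the weak learners.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_train, y_train)
print("test accuracy:", ada.score(X_test, y_test))
```

The per-round learner weights are exposed as `ada.estimator_weights_`, which makes the "weighted vote" in the final combination inspectable.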

Figure 6: AdaBoost Classifier
