e-ISSN: 2582-5208 International Research Journal of Modernization in Engineering Technology and Science
Volume:03/Issue:03/March-2021 Impact Factor- 5.354 www.irjmets.com
The confusion matrix of the dataset obtained from the AdaBoost classifier is shown below, providing all the necessary measures for the choice:
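A confusion matrix like the one referenced here can be reproduced with scikit-learn. The following is a minimal sketch, not the paper's actual experiment: the synthetic dataset and all parameter values are illustrative stand-ins, since the original data is not given in this section.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Illustrative synthetic binary-classification data (stand-in for the paper's dataset)
X, y = make_classification(n_samples=500, n_features=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1
)

# AdaBoost combines many weak learners (decision stumps by default)
clf = AdaBoostClassifier(n_estimators=50, random_state=1)
clf.fit(X_train, y_train)

# Rows correspond to true classes, columns to predicted classes
cm = confusion_matrix(y_test, clf.predict(X_test))
print(cm)
```

From the four cells of this matrix (true/false positives and negatives), the usual measures such as accuracy, precision, and recall can all be derived.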
7.0 Logistic Regression
Logistic regression is a popular machine learning algorithm that comes under supervised learning techniques. Although it can be applied to both classification and regression problems, it is mainly used for classification. It predicts a categorical dependent variable from a set of independent variables, and its output is always a probability between 0 and 1, which makes it suitable wherever the probabilities of the two classes are required. The model is fitted using maximum likelihood estimation, which chooses the parameters under which the observed data are most probable. In logistic regression, the weighted sum of the inputs is passed through an activation function that maps values into the range 0 to 1. This activation function is known as the sigmoid function, and the curve obtained is called a sigmoid curve or S-curve.
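The sigmoid mapping and a maximum-likelihood logistic regression fit can be sketched as follows. This is a minimal illustration assuming scikit-learn; the synthetic dataset is a hypothetical stand-in, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Sigmoid (S-curve): maps the weighted sum of inputs into (0, 1)
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative synthetic binary-classification data
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# scikit-learn fits the weights by maximizing the likelihood of the observed data
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Predicted class probabilities lie strictly between 0 and 1
probs = model.predict_proba(X_test)[:, 1]
print(probs[:5])
```

Note that `sigmoid(0)` evaluates to exactly 0.5, the decision boundary: inputs whose weighted sum is positive are mapped above 0.5 and classified as the positive class.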
Figure 7: Logistic Regression

The confusion matrix of the dataset obtained from Logistic Regression is shown below, providing all the necessary measures for the choice:
8.0 Naive Bayes
Naive Bayes is one of the simplest machine learning algorithms. It is based on Bayes' theorem with an assumption of independence among predictors: a Naive Bayes classifier presumes that the presence of a particular feature in a class is unrelated to the presence of any other feature. The model is easy to build and particularly useful for massive datasets. Despite its simplicity, it is known to perform competitively with far more sophisticated classifiers on many problems. It provides a way of calculating the posterior probability P(c|x) from the prior P(c), the evidence P(x), and the likelihood P(x|c).
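The posterior calculation P(c|x) described above can be sketched with scikit-learn's Gaussian Naive Bayes implementation. As before, the dataset is an illustrative synthetic stand-in, not the data used in the paper.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Illustrative synthetic binary-classification data
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# GaussianNB assumes features are conditionally independent given the class,
# each following a per-class Gaussian distribution
nb = GaussianNB()
nb.fit(X_train, y_train)

# Posterior probabilities P(c|x) for each class; each row sums to 1
posteriors = nb.predict_proba(X_test)
print(posteriors[:3])
```

Because the independence assumption lets the likelihood P(x|c) factor into a product of per-feature terms, training reduces to estimating simple per-class statistics, which is why the model scales so well to large datasets.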