
k-Nearest Neighbors Classifier

Discussion:

1) If for each criterion column Ci the maximum is the best, we add the coefficients for each row, and the largest sum is the best.
2) If for each criterion column Ci the minimum is the best, we similarly add the coefficients for each row, and the smallest sum is the best.
3) If for some criteria Ci the maximum is the best, for other criteria Cj the minimum is the best, and perhaps for other criteria Ck another value is the best (i.e. neither the maximum nor the minimum), then we use the simplified TOPSIS:

- per column, we take the absolute difference between each component and the ideal one;
- then we add the results per row; the smallest sum is the best (being the closest to the ideal result).
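The simplified TOPSIS scoring above can be sketched in a few lines of Python. The data here is hypothetical, chosen only to illustrate case 3): for the first criterion the maximum is best, for the second the minimum is best, and the per-column ideal values collect those targets.

```python
def simplified_topsis(matrix, ideal):
    """Score each row (alternative) by the sum of absolute differences
    between its entries and the per-column ideal values; the smallest
    score is the best (closest to the ideal result)."""
    scores = [sum(abs(a - i) for a, i in zip(row, ideal)) for row in matrix]
    best = min(range(len(scores)), key=scores.__getitem__)
    return scores, best

# Hypothetical example: 3 alternatives, 2 criteria.
# Criterion 1: maximum is best; criterion 2: minimum is best.
matrix = [[7, 2],
          [8, 1],
          [6, 4]]
ideal = [8, 1]   # per-column ideal values

scores, best = simplified_topsis(matrix, ideal)
# scores = [2, 0, 5], so the second alternative (index 1) is the best
```

The second row matches the ideal in both columns, so its score is 0 and it wins.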



With Yaman Akbulut, Abdulkadir Sengur, and Yanhui Guo, we'll also explore the k-NN method in which Dezert-Smarandache theory is used to calculate the data samples' memberships, replacing Dempster's rule with Proportional Conflict Redistribution Rule 5 (PCR5), which handles the assignment of the final class more effectively.
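To make the combination step concrete, here is a minimal sketch of the PCR5 rule for two sources: masses on intersecting focal elements are combined conjunctively, and each pairwise conflict is redistributed proportionally back to the two elements that produced it. The frame, the two belief assignments, and the singleton focal elements are all hypothetical illustration data, not taken from the article.

```python
from itertools import product

def pcr5_combine(m1, m2):
    """Combine two basic belief assignments (dicts: frozenset -> mass)
    with PCR5: conjunctive combination of non-conflicting pairs, then
    proportional redistribution of each conflicting pair's product
    back to its two source elements."""
    combined = {}
    for (X, mx), (Y, my) in product(m1.items(), m2.items()):
        inter = X & Y
        if inter:  # non-conflicting pair: standard conjunctive part
            combined[inter] = combined.get(inter, 0.0) + mx * my
        else:      # conflicting pair: split mx*my proportionally to mx, my
            combined[X] = combined.get(X, 0.0) + mx**2 * my / (mx + my)
            combined[Y] = combined.get(Y, 0.0) + my**2 * mx / (my + mx)
    return combined

# Hypothetical two-class frame {A, B}, e.g. memberships suggested
# by two groups of neighbors in a k-NN setting:
A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, B: 0.4}
m2 = {A: 0.7, B: 0.3}
m = pcr5_combine(m1, m2)
# m[A] ~ 0.718, m[B] ~ 0.282; the sample would be assigned class A
```

Unlike Dempster's rule, no mass is discarded by normalization: the conflict 0.6*0.3 + 0.4*0.7 = 0.46 is returned to A and B in proportion to the masses that generated it, so the combined assignment still sums to 1.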
