A threefold-ensemble k-nearest neighbor algorithm

Document Type

Article

Source of Publication

International Journal of Computers and Applications

Publication Date

1-1-2025

Abstract

The k-nearest neighbor (kNN) classifier is both conceptually and practically easy to implement. However, its performance remains sensitive to the choice of the k-value, and it struggles to produce competitive results on imbalanced datasets. We therefore propose an effective ensemble-based kNN classifier, motivated by ensemble learning’s resilience to class imbalance. The standard kNN, weighted kNN, and local mean kNN are carefully merged into a single ensemble kNN classifier that incorporates the ensemble weights of these constituent classifiers. Because three kNN variants are combined to form the final model, the classifier is named the Threefold-Ensemble K-Nearest Neighbor (TEkNN). The effectiveness of the proposed model has been comprehensively assessed against five state-of-the-art kNN models and four machine learning models on 14 University of California datasets, using accuracy, F1, ROC, and MAE as evaluation metrics. The results show that TEkNN is a promising classifier across all evaluation metrics, attesting to its usability in other domains where class imbalance is dominantly inherent.
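To make the construction concrete, the sketch below implements the three constituent classifiers named in the abstract (standard kNN, distance-weighted kNN, and local mean kNN) and combines their predictions. The paper incorporates learned ensemble weights; since those weights are not given here, this sketch substitutes a simple unweighted majority vote among the three variants, which is an assumption, not the authors' exact method.

```python
import numpy as np
from collections import Counter

def _knn_vote(dists, y, k):
    # Standard kNN: majority vote among the k nearest neighbors.
    idx = np.argsort(dists)[:k]
    return Counter(y[idx]).most_common(1)[0][0]

def _weighted_knn_vote(dists, y, k):
    # Weighted kNN: each of the k nearest neighbors votes with weight 1/distance.
    idx = np.argsort(dists)[:k]
    weights = {}
    for i in idx:
        weights[y[i]] = weights.get(y[i], 0.0) + 1.0 / (dists[i] + 1e-12)
    return max(weights, key=weights.get)

def _local_mean_vote(x, X, y, k):
    # Local mean kNN: for each class, average its k nearest same-class
    # neighbors and pick the class whose local mean is closest to the query.
    best_cls, best_d = None, np.inf
    for cls in np.unique(y):
        Xc = X[y == cls]
        dc = np.linalg.norm(Xc - x, axis=1)
        kc = min(k, len(Xc))
        local_mean = Xc[np.argsort(dc)[:kc]].mean(axis=0)
        d = np.linalg.norm(local_mean - x)
        if d < best_d:
            best_cls, best_d = cls, d
    return best_cls

def teknn_predict(X_train, y_train, X_test, k=3):
    """Combine the three kNN variants; ties broken by first-seen vote."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        votes = [
            _knn_vote(dists, y_train, k),
            _weighted_knn_vote(dists, y_train, k),
            _local_mean_vote(x, X_train, y_train, k),
        ]
        preds.append(Counter(votes).most_common(1)[0][0])
    return np.array(preds)
```

The local mean variant is what gives the ensemble some robustness to class imbalance: it compares the query against a per-class summary rather than a raw neighbor count, so a minority class is not automatically outvoted.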

ISSN

1206-212X

Publisher

Informa UK Limited

Volume

47

Issue

1

First Page

70

Last Page

83

Disciplines

Computer Sciences

Keywords

data classification, Ensemble kNN, kNN algorithm, local mean, weighted kNN

Scopus ID

85213946460

Indexed in Scopus

yes

Open Access

no
