IKOAfs: Multi-Objective KNN Classifier for Feature Selection Based on Improved Kepler Optimization Algorithm
Abstract
To improve classification performance, feature selection is commonly employed to reduce data dimensionality, eliminate irrelevant features, and lower computational cost. Among classifiers, the K-Nearest Neighbors (KNN) algorithm is widely used because of its simplicity and robustness. However, KNN performance is highly sensitive to several interdependent parameters, and optimizing them simultaneously is challenging because of the high-dimensional search space. To address these challenges, a multi-objective feature selection method based on an improved Kepler Optimization Algorithm (IKOAfs) is proposed. The algorithm employs tent mapping, Lévy flight, and self-perturbation to improve search performance, and uses a multi-objective fitness function to jointly optimize classification accuracy, the selected feature subset, and KNN parameters. The reliability of IKOAfs was evaluated on the CEC2022 benchmark suite and eight representative public datasets. The results show that IKOAfs outperforms six well-known swarm intelligence algorithms in classification accuracy, feature reduction rate, and generalization stability, particularly on high-dimensional datasets.
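To make the joint objective concrete, the sketch below shows one common way such a wrapper fitness is formulated: a weighted sum of KNN cross-validation error and the fraction of selected features, with the neighbor count k encoded in the candidate solution. The weight ALPHA, the solution encoding, and the example dataset are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch (not the authors' code): a typical weighted fitness for
# wrapper feature selection with KNN. A candidate encodes a binary feature
# mask plus KNN's k; the score trades off validation error against the
# fraction of features retained. Weights and encoding are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

ALPHA = 0.99  # assumed weight on classification error vs. subset size


def fitness(solution, X, y):
    """Lower is better: ALPHA * CV error + (1 - ALPHA) * feature ratio."""
    n_features = X.shape[1]
    mask = solution[:n_features] > 0.5                      # binary feature mask
    k = int(np.clip(round(solution[n_features]), 1, 15))    # encoded neighbor count
    if not mask.any():                                      # penalize empty subsets
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=k)
    acc = cross_val_score(knn, X[:, mask], y, cv=5).mean()
    return ALPHA * (1.0 - acc) + (1.0 - ALPHA) * mask.sum() / n_features


# Example: evaluate one random candidate on a public dataset.
X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
candidate = np.concatenate([rng.random(X.shape[1]), [5.0]])  # mask values + k
print(f"fitness = {fitness(candidate, X, y):.4f}")
```

In a metaheuristic such as IKOAfs, this scalar fitness (or a Pareto-based variant of it) would be minimized over the population of candidate solutions, so that accuracy, subset size, and the KNN parameter are searched jointly rather than tuned separately.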