An Exhaustive Wrapper Method for Feature Selection in Large Dimensional Datasets (WFS)

Damodar Patel, Amit Kumar Saxena, Suman Laha, Rajeshwar Prasad, Utpal Roy

Abstract

In this paper, a novel algorithm for randomly selecting small subsets of features from a dataset is presented. By testing different combinations of features over a number of trials, the algorithm discovers the best-performing feature subsets. Once these subsets are obtained, the classification accuracies produced by three classifiers (K-Nearest Neighbor, Support Vector Machines, and Random Forest) serve as the evaluation criterion of the proposed wrapper-based method. Further, to improve the classification accuracy and reduce the cardinality of the selected feature sets, an exhaustive feature selection step (the wrapper method) is applied. The proposed algorithm is simulated on eighteen datasets, and its results are compared with those reported for nine comparable algorithms using the same three classifiers. The average classification accuracies achieved over the eighteen datasets are 88.66% with the K-NN classifier, 89.88% with SVM, and 89.14% with RF, using at most 10 features. The proposed algorithm achieves better classification accuracy (CA) than the nine comparable algorithms, and the experimental results show that it selects more effective features than the other algorithms.
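The two-stage idea described in the abstract — random sampling of small feature subsets scored by classifier accuracy, followed by an exhaustive search inside the best subset to shrink it further — can be sketched as below. This is a minimal illustration, not the authors' implementation: a plain 1-NN classifier stands in for the K-NN/SVM/RF classifiers, the function names (`wfs_sketch`, `one_nn_accuracy`) are invented for this example, and the toy data is synthetic.

```python
import random
from itertools import combinations

def one_nn_accuracy(X_tr, y_tr, X_te, y_te, feats):
    """Hold-out accuracy of a 1-NN classifier restricted to the features in `feats`."""
    correct = 0
    for x, y in zip(X_te, y_te):
        # index of the nearest training point, measured only on the selected features
        j = min(range(len(X_tr)),
                key=lambda i: sum((X_tr[i][f] - x[f]) ** 2 for f in feats))
        correct += (y_tr[j] == y)
    return correct / len(X_te)

def wfs_sketch(X_tr, y_tr, X_te, y_te, k, trials, seed=0):
    """Two-stage wrapper: random k-feature subsets across trials, then an
    exhaustive search inside the winning subset for a smaller feature set
    with equal or better accuracy (hypothetical rendering of the idea)."""
    rng = random.Random(seed)
    n_features = len(X_tr[0])
    best_feats, best_acc = None, -1.0
    # Stage 1: score randomly drawn k-feature subsets over a number of trials
    for _ in range(trials):
        feats = tuple(sorted(rng.sample(range(n_features), k)))
        acc = one_nn_accuracy(X_tr, y_tr, X_te, y_te, feats)
        if acc > best_acc:
            best_feats, best_acc = feats, acc
    # Stage 2: exhaustively test all proper sub-subsets of the winner,
    # smallest first, and stop at one that matches or beats its accuracy
    for r in range(1, len(best_feats)):
        for sub in combinations(best_feats, r):
            acc = one_nn_accuracy(X_tr, y_tr, X_te, y_te, sub)
            if acc >= best_acc:
                return sub, acc
    return best_feats, best_acc

# Toy data: six features, of which only features 0 and 3 carry class information.
data_rng = random.Random(1)
def make_point(label):
    x = [data_rng.random() for _ in range(6)]  # noise in every slot...
    x[0] = label + 0.1 * data_rng.random()     # ...except 0 and 3,
    x[3] = label + 0.1 * data_rng.random()     # which track the label
    return x

y_tr = [i % 2 for i in range(40)]
X_tr = [make_point(l) for l in y_tr]
y_te = [i % 2 for i in range(10)]
X_te = [make_point(l) for l in y_te]

feats, acc = wfs_sketch(X_tr, y_tr, X_te, y_te, k=4, trials=30)
```

On this toy data the second stage typically collapses the best 4-feature subset down to a single informative feature, which mirrors the paper's goal of reducing cardinality without losing accuracy.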
