Research Paper
Importance of Feature Selection in Model Accuracy
As a dimensionality reduction strategy, feature selection attempts to select a small set of the most important features from the original feature set by eliminating obsolete, redundant, and irrelevant noisy features. Choosing such a subset of the original variables, so that a model trained on data containing only these features performs well, is the essence of feature selection. Feature selection reduces over-fitting, increases model efficiency by removing redundant features, and has the added benefit of preserving the original feature representation, resulting in improved accuracy. It typically yields good learning efficiency: higher machine learning model accuracy, lower computational cost, and faster training. Recently, researchers in computer vision, deep learning, data mining, and other fields have shown, through computational theory and experiments, that several feature selection algorithms improved the efficiency of their work. This paper aims to examine the importance of feature selection in model accuracy. Feature selection is critical for several reasons, including simplicity, performance, computational efficiency, and accuracy, and it is used in both supervised and unsupervised learning scenarios. These strategies can boost the productivity of various machine learning algorithms as well as their training. Feature selection decreases learning time and increases data consistency and comprehensibility.
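The filtering idea described above, scoring each feature and discarding the uninformative ones, can be sketched as follows. This is a minimal illustrative example, not the paper's method: it assumes a simple filter criterion (absolute Pearson correlation of each feature with the target) and a hypothetical toy dataset.

```python
# Minimal sketch of filter-based feature selection (illustrative only):
# score each feature by its absolute Pearson correlation with the target
# and keep the top-k features, dropping redundant or irrelevant ones.

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def select_top_k(X, y, k):
    """Return sorted indices of the k features most correlated with y.

    X is a list of rows; each row is a list of feature values.
    """
    n_features = len(X[0])
    scores = []
    for j in range(n_features):
        column = [row[j] for row in X]
        scores.append((abs(pearson(column, y)), j))
    scores.sort(reverse=True)               # highest-scoring features first
    return sorted(j for _, j in scores[:k])

# Hypothetical toy data: feature 0 tracks y, feature 1 is constant noise,
# feature 2 is anti-correlated with y (still informative).
X = [[1, 5, 9], [2, 5, 8], [3, 5, 7], [4, 5, 6]]
y = [1, 2, 3, 4]
print(select_top_k(X, y, 2))  # the constant feature 1 is dropped
```

In practice one would use a library implementation (e.g. a univariate filter from a machine learning toolkit) and a score suited to the task, but the principle is the same: rank features, keep the informative subset, and train the model only on it.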
Published by: Pranjal Rawat, Nitin, Sameer Dev Sharma
Author: Pranjal Rawat
Paper ID: V7I4-1470
Paper Status: published
Published: July 22, 2021