This paper is published in Volume-4, Issue-4, 2018
Area
Machine Learning
Author
Shreyal Gajare, Shilpa Sonawani
Org/Univ
Maharashtra Institute of Technology, Pune, Maharashtra, India
Pub. Date
20 July, 2018
Paper ID
V4I4-1283
Publisher
Keywords
Feature selection, Regression methods, Loss function

Citations

IEEE
Shreyal Gajare, Shilpa Sonawani. "Improved sparse logistic regression for efficient feature selection," International Journal of Advance Research, Ideas and Innovations in Technology, vol. 4, no. 4, 2018. www.IJARIIT.com.

APA
Shreyal Gajare, Shilpa Sonawani (2018). Improved sparse logistic regression for efficient feature selection. International Journal of Advance Research, Ideas and Innovations in Technology, 4(4). www.IJARIIT.com.

MLA
Shreyal Gajare, Shilpa Sonawani. "Improved sparse logistic regression for efficient feature selection." International Journal of Advance Research, Ideas and Innovations in Technology 4.4 (2018). www.IJARIIT.com.

Abstract

Variable and feature selection have become a focus of research in application areas where datasets with hundreds of thousands of parameters are available, including health risk prediction, text processing of internet documents, and gene array analysis. Features gathered for analysis may not be fully informative: some may contain noise, some may require normalization, and many may be irrelevant. The goals of feature selection are to improve prediction performance, to provide faster and more cost-effective predictors, and to give a better understanding of the underlying process and the data. Among feature selection methods, sparse logistic regression is well suited to selecting a small, countable set of important attributes. Sparsity is induced by adding a regularization term to the logistic loss function, and the regularization parameter controls the degree of sparsity. The method thus provides effective, relevant variable selection for a given model and serves well in most applications where prediction is required.
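The mechanism the abstract describes can be illustrated with a minimal sketch in Python using scikit-learn: an L1 (sparsity-inducing) penalty is added to the logistic loss, and the regularization strength determines how many feature coefficients remain non-zero. The dataset, parameter values, and variable names below are illustrative assumptions, not the authors' exact experimental setup.

```python
# Sketch of sparse (L1-regularized) logistic regression for feature
# selection. Assumes scikit-learn; the synthetic dataset and C value
# are illustrative, not taken from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 100 features, of which only 5 are actually informative.
X, y = make_classification(n_samples=500, n_features=100,
                           n_informative=5, n_redundant=0, random_state=0)

# penalty="l1" adds an L1 term to the logistic loss; the regularization
# parameter C (inverse strength) controls the sparsity of the solution:
# smaller C drives more coefficients exactly to zero.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(X, y)

# Features whose coefficients survived regularization are "selected".
selected = np.flatnonzero(model.coef_[0])
print(f"{len(selected)} of {X.shape[1]} features selected")
```

Sweeping C over a range (e.g. with cross-validation) is the usual way to choose the sparsity level that balances predictive accuracy against the number of retained attributes.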