Online ISSN: 2515-8260

Assessing the Relative Importance of Predictors in Linear Regression


Srinivasa Rao D1, S Jyothi Kannipamula2

Abstract

Regression is the most extensively used statistical technique for explaining theoretical relationships and for prediction. The method can be viewed as a mapping from the space of input (predictor) variables to the space of an outcome variable. If the model assumptions are met, metrics such as R2, the F statistic and the significance of the t-values of the regression coefficients are used to judge the goodness of fit of the regression model, while the Mean Square Error (MSE) is used to judge its predictive power. For judging the relative importance of the predictor variables in an estimated regression model, the magnitudes and signs of the regression coefficients are usually considered. However, this approach is quite arbitrary and often inconclusive. In this context, the present paper demonstrates the use of relative importance metrics, namely lmg (Lindeman, Merenda and Gold, 1980) and pmvd (Feldman, 2005), which decompose the variance explained by a regression model into nonnegative components. It is shown that these relative measures are better suited than the magnitudes and signs of the regression coefficients for assessing the relative importance of individual predictors in regression.
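As a rough illustration of the kind of decomposition the abstract refers to, the lmg metric can be understood as the average, over all orderings in which the predictors enter the model, of the increase in R2 contributed by each predictor. The sketch below is a hypothetical, brute-force Python implementation of that idea (the paper itself does not supply code; in R the same computation is provided by the relaimpo package); the function names `r_squared` and `lmg` are this sketch's own, and the enumeration of all orderings is only practical for a small number of predictors.

```python
# Hypothetical sketch of the lmg relative-importance metric:
# average, over all p! orderings of the p predictors, of the
# incremental R^2 a predictor adds when it enters the model.
# Each increment is nonnegative, so the p shares are nonnegative
# and sum to the R^2 of the full model.
import math
from itertools import permutations

import numpy as np


def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss


def lmg(X, y):
    """Decompose the model R^2 into one nonnegative share per predictor."""
    n, p = X.shape
    contrib = np.zeros(p)
    for order in permutations(range(p)):
        r2_prev = 0.0
        for k in range(p):
            # R^2 of the model containing the first k+1 predictors
            # in this ordering; the increment is credited to the
            # predictor that just entered.
            cols = list(order[: k + 1])
            r2 = r_squared(X[:, cols], y)
            contrib[order[k]] += r2 - r2_prev
            r2_prev = r2
    return contrib / math.factorial(p)
```

Because the shares are averaged over every entry order, correlated predictors split their joint contribution instead of one of them arbitrarily absorbing it, which is what makes the decomposition less arbitrary than comparing raw coefficient magnitudes.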
