Results of Ridge-regression

I've faced a problem connected with Ridge-regression.

As is known, ridge regression is used when the features are strongly correlated, i.e. when the design matrix is ill-conditioned. This is exactly my case: the determinant of my inter-factor correlation matrix is of the order of 10^(-18), so multicollinearity is clearly present. The data sample consists of 8 quantitative features.
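For context, the kind of diagnostic described above can be reproduced with NumPy. This is a minimal sketch on simulated data (the `X` below is a hypothetical stand-in for the real 8-feature sample, built so that its columns are nearly linearly dependent):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated design matrix: 8 features that are almost linear combinations
# of 2 latent factors, plus tiny noise -> strong multicollinearity
base = rng.normal(size=(200, 2))
X = base @ rng.normal(size=(2, 8)) + 1e-4 * rng.normal(size=(200, 8))

R = np.corrcoef(X, rowvar=False)    # 8x8 inter-factor correlation matrix
det_R = np.linalg.det(R)            # near zero under multicollinearity
cond_X = np.linalg.cond(X)          # large condition number = ill-conditioned
print(f"det(R) = {det_R:.3e}, cond(X) = {cond_X:.3e}")
```

A determinant of the correlation matrix near zero and a large condition number both point to the same ill-conditioning.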

Ridge regression gives worse (or about the same) results as standard linear regression.

What leads to this result? How can the results be improved?

1 answer

  • answered 2018-03-13 23:11 Mukund Jha

    Ridge regression has one obvious disadvantage. Unlike best subset, forward stepwise, and backward stepwise selection, which generally select models that involve just a subset of the variables, ridge regression includes all predictors in the final model. The lasso is a relatively recent alternative to ridge regression that overcomes this disadvantage. However, have you already considered selecting the tuning parameter using cross-validation? Reference: Chapter 6, Linear Model Selection and Regularization, ISLR.
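    The cross-validation step suggested above can be sketched with scikit-learn's `RidgeCV` and `LassoCV`, which pick the penalty strength by cross-validation. The data here is a synthetic stand-in (correlated predictors via `effective_rank`), not the asker's actual sample:

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression, RidgeCV, LassoCV
    from sklearn.model_selection import cross_val_score

    # Toy data with 8 correlated predictors (stand-in for the real sample)
    X, y = make_regression(n_samples=200, n_features=8, n_informative=3,
                           effective_rank=3, noise=5.0, random_state=0)

    alphas = np.logspace(-4, 4, 50)   # grid of candidate penalty strengths
    models = {
        "OLS": LinearRegression(),
        "Ridge (CV alpha)": RidgeCV(alphas=alphas),
        "Lasso (CV alpha)": LassoCV(alphas=alphas, random_state=0),
    }
    for name, model in models.items():
        score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name}: mean CV R^2 = {score:.3f}")
    ```

    Comparing the cross-validated scores this way shows whether a tuned penalty actually helps; with a poorly chosen fixed alpha, ridge can easily look no better than plain least squares.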