Results of ridge regression
I've run into a problem with ridge regression.
As is well known, ridge regression is used when the features are strongly collinear. That is exactly my case: the determinant of my inter-factor correlation matrix is on the order of 10^(-18), i.e. the matrix is nearly singular. Multicollinearity is clearly present. The data sample consists of 8 quantitative features.
Yet ridge regression gives results that are worse than (or at best the same as) standard linear regression.
What could be causing this, and how can the results be improved?
1 answer

Ridge regression has one obvious disadvantage. Unlike best subset, forward stepwise, and backward stepwise selection, which generally select models involving just a subset of the variables, ridge regression includes all predictors in the final model. The lasso is a relatively recent alternative to ridge regression that overcomes this disadvantage. Also, have you tried selecting the tuning parameter via cross-validation? Reference: Chapter 6, "Linear Model Selection and Regularization", ISLR.
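To illustrate, here is a minimal sketch in Python with scikit-learn comparing OLS, ridge, and lasso, with the regularization strength chosen by cross-validation. The data here is synthetic (low effective rank mimics your multicollinearity); the alpha grid and CV settings are illustrative choices, not recommendations for your specific data.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, RidgeCV, LassoCV
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for your data: 8 features with strong collinearity
# (effective_rank=3 makes the design matrix nearly rank-deficient).
X, y = make_regression(n_samples=100, n_features=8, effective_rank=3,
                       noise=10.0, random_state=0)

# Grid of candidate regularization strengths, searched by cross-validation.
alphas = np.logspace(-3, 3, 50)

models = {
    "OLS": make_pipeline(StandardScaler(), LinearRegression()),
    "Ridge": make_pipeline(StandardScaler(), RidgeCV(alphas=alphas)),
    "Lasso": make_pipeline(StandardScaler(), LassoCV(alphas=alphas, max_iter=10000)),
}

# Compare out-of-sample error; regularization should only help if
# its strength is tuned rather than left at an arbitrary default.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_squared_error")
    print(f"{name}: CV MSE = {-scores.mean():.2f}")
```

Note that standardizing the features before fitting matters: the ridge penalty is not scale-invariant, so unscaled features are a common reason ridge underperforms plain least squares.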