
Ridge

Helpful examples of using the Ridge regression machine learning algorithm in scikit-learn.

Ridge regression, also known as Tikhonov regularization, is a technique used to analyze multiple regression data that suffer from multicollinearity.

When independent variables are highly correlated, ordinary least squares (OLS) estimates can be unreliable. Ridge regression addresses this issue by introducing an L2 penalty to the loss function, which is the sum of the squares of the coefficients.

This penalty term shrinks the coefficients towards zero but not exactly to zero, thus reducing their variance. The degree of shrinkage is controlled by a hyperparameter, often denoted lambda (called alpha in scikit-learn), which balances the trade-off between fitting the data and keeping the model coefficients small.
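A minimal sketch of this shrinkage effect, using synthetic data with two nearly collinear predictors (the data and alpha values here are illustrative, not from the original article):

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic data: two highly correlated predictors (illustrative example)
rng = np.random.RandomState(0)
x1 = rng.randn(100)
x2 = x1 + rng.randn(100) * 0.01  # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.randn(100) * 0.1

# Larger alpha (scikit-learn's name for lambda) means stronger shrinkage:
# the coefficient vector's magnitude decreases as alpha grows
for alpha in [0.01, 1.0, 100.0]:
    model = Ridge(alpha=alpha).fit(X, y)
    print(f"alpha={alpha}: coef={model.coef_}")
```

With collinear predictors, OLS would split the effect between the two columns unstably; increasing alpha pulls both coefficients smoothly towards zero.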

Ridge regression is particularly effective for handling situations with many predictors or collinear data, resulting in more stable and interpretable models.

Unlike the Lasso, Ridge regression does not perform variable selection, since it never shrinks coefficients exactly to zero; it can, however, improve prediction accuracy and generalization.
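The contrast with the Lasso can be sketched on synthetic data where only one of several features is informative (the data, alpha values, and feature setup here are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Synthetic data: only the first of 10 features drives the target
rng = np.random.RandomState(42)
X = rng.randn(100, 10)
y = 2 * X[:, 0] + rng.randn(100) * 0.1

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# Lasso (L1) sets irrelevant coefficients exactly to zero;
# Ridge (L2) only shrinks them towards zero
print("Ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
```

On data like this the Lasso typically zeroes out most of the uninformative features, while every Ridge coefficient stays small but nonzero.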

Examples
Configure Ridge "alpha" Parameter
Configure Ridge "copy_X" Parameter
Configure Ridge "fit_intercept" Parameter
Configure Ridge "max_iter" Parameter
Configure Ridge "positive" Parameter
Configure Ridge "random_state" Parameter
Configure Ridge "solver" Parameter
Configure Ridge "tol" Parameter
Scikit-Learn "Ridge" versus "RidgeCV"
Scikit-Learn "RidgeClassifier" versus "RidgeClassifierCV"
Scikit-Learn GridSearchCV Ridge
Scikit-Learn GridSearchCV RidgeClassifier
Scikit-Learn RandomizedSearchCV Ridge
Scikit-Learn RandomizedSearchCV RidgeClassifier
Scikit-Learn Ridge Regression Model
Scikit-Learn RidgeClassifier Model
Scikit-Learn RidgeClassifierCV Model
Scikit-Learn RidgeCV Regression Model