
Scikit-Learn LassoLars Regression Model

Lasso Least Angle Regression (LassoLars) is a regression algorithm that combines L1 regularization with least angle regression. It is particularly useful for high-dimensional datasets where some features may be irrelevant.

The key hyperparameters of LassoLars include alpha (controls the strength of the L1 regularization), max_iter (the maximum number of iterations), and eps (the machine-precision regularization applied when computing the Cholesky diagonal factors).
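
These hyperparameters map directly to the constructor arguments. As a minimal sketch (the values shown are simply scikit-learn's defaults, not tuned settings), they can be written out explicitly:

import numpy as np
from sklearn.linear_model import LassoLars

# instantiate LassoLars with its key hyperparameters spelled out
# (illustrative only: these are the scikit-learn default values)
model = LassoLars(
    alpha=1.0,               # strength of the L1 penalty; larger values zero out more coefficients
    max_iter=500,            # maximum number of iterations
    eps=np.finfo(float).eps  # machine-precision regularization of the Cholesky diagonal factors
)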

The algorithm is appropriate for regression tasks, especially those involving high-dimensional data.

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LassoLars
from sklearn.metrics import mean_squared_error

# generate regression dataset
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)

# split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

# create model
model = LassoLars(alpha=0.1)

# fit model
model.fit(X_train, y_train)

# evaluate model
yhat = model.predict(X_test)
mse = mean_squared_error(y_test, yhat)
print('Mean Squared Error: %.3f' % mse)

# make a prediction
row = [[-0.80686251, -0.67440997, -1.01283112, 0.31424733, -0.90802408,
        -1.4123037,  1.46564877, -0.2257763,  0.0675282, -1.42474819]]
yhat = model.predict(row)
print('Predicted: %.3f' % yhat[0])

Running the example gives an output like:

Mean Squared Error: 0.183
Predicted: -246.341

The steps are as follows:

  1. First, a synthetic regression dataset is generated using the make_regression() function. This creates a dataset with a specified number of samples (n_samples), features (n_features), and added noise (noise). The dataset is split into training and test sets using train_test_split().

  2. Next, a LassoLars model is instantiated with alpha=0.1. The model is then fit on the training data using the fit() method.

  3. The performance of the model is evaluated by predicting the test set results (yhat) and comparing them to the actual values (y_test) using the mean squared error metric (an optional cross-validation variant is sketched after this list).


  4. A single prediction can be made by passing a new data sample (as a 2D array, i.e. a list of lists) to the predict() method.
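
As an optional alternative to the single train/test split used in step 3, the same model can be scored with k-fold cross-validation. The sketch below assumes the same synthetic dataset and uses cross_val_score with (negated) mean squared error scoring:

from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LassoLars

# regenerate the same synthetic regression dataset
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)

# score LassoLars with 5-fold cross-validation
model = LassoLars(alpha=0.1)
scores = cross_val_score(model, X, y, scoring='neg_mean_squared_error', cv=5)
print('Mean MSE: %.3f' % -scores.mean())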

This example demonstrates how to quickly set up and use a LassoLars model for regression tasks, showcasing its simplicity and effectiveness for high-dimensional data in scikit-learn.

The model can be fit directly on the training data without the need for scaling or normalization. Once fit, the model can be used to make predictions on new data, enabling its use in real-world regression problems.
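
A useful follow-up is to inspect the fitted coef_ attribute, which shows how the L1 penalty drives the coefficients of irrelevant features exactly to zero. The sketch below assumes a fresh synthetic dataset in which only a few of the features carry signal:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLars

# synthetic high-dimensional dataset where only 5 of the 20 features are informative
X, y = make_regression(n_samples=100, n_features=20, n_informative=5, noise=0.1, random_state=1)

# fit the model and count how many coefficients remain non-zero
model = LassoLars(alpha=0.1)
model.fit(X, y)
print('Non-zero coefficients: %d of %d' % (np.sum(model.coef_ != 0), X.shape[1]))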


