Least Angle Regression (LARS) is an iterative algorithm for regression, particularly useful for high-dimensional problems where the number of predictors is much larger than the number of observations. The key hyperparameters of Lars include n_nonzero_coefs (the target number of non-zero coefficients) and eps (the machine-precision regularization used when computing the Cholesky diagonal factors).
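To see how n_nonzero_coefs controls sparsity, the following sketch fits Lars with a few different targets and counts the resulting non-zero coefficients (the dataset parameters here are illustrative choices, not part of the example below):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lars

# synthetic data where only a few features are informative (illustrative values)
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=0.1, random_state=1)

# a larger n_nonzero_coefs allows more features into the model
counts = {}
for k in (1, 3, 5):
    model = Lars(n_nonzero_coefs=k).fit(X, y)
    counts[k] = int(np.sum(model.coef_ != 0))
    print(f"n_nonzero_coefs={k}: {counts[k]} non-zero coefficients")
```

The number of non-zero coefficients never exceeds the requested target, which is how Lars produces sparse models.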
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import Lars
from sklearn.metrics import mean_squared_error
# generate regression dataset
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)
# split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
# create model
model = Lars(n_nonzero_coefs=5)
# fit model
model.fit(X_train, y_train)
# evaluate model
yhat = model.predict(X_test)
mse = mean_squared_error(y_test, yhat)
print('Mean Squared Error: %.3f' % mse)
# make a prediction
row = [[0.5, -0.2, 0.3, 0.1, 0.4, -0.6, 0.2, 0.3, 0.8, -0.5]]
yhat = model.predict(row)
print('Predicted: %.3f' % yhat[0])
Running the example gives an output like:
Mean Squared Error: 5715.664
Predicted: 10.624
The steps are as follows:

First, a synthetic regression dataset is generated using the make_regression() function. This creates a dataset with a specified number of samples (n_samples), features (n_features), noise level (noise), and a fixed random seed (random_state) for reproducibility. The dataset is split into training and test sets using train_test_split().

Next, a Lars model is instantiated with the hyperparameter n_nonzero_coefs set to 5. The model is then fit on the training data using the fit() method.

The performance of the model is evaluated by comparing the predictions (yhat) to the actual values (y_test) using the mean squared error metric.

A single prediction can be made by passing a new data sample to the predict() method.
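The order in which LARS brings features into the model can also be inspected directly. A short sketch using scikit-learn's lars_path() helper on the same synthetic data as the example:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

# same synthetic dataset as the example above
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)

# compute the full LARS path; coefs has shape (n_features, n_steps)
alphas, active, coefs = lars_path(X, y, method='lar')

# 'active' lists the feature indices in the order LARS entered them
print('entry order:', active)
print('path shape:', coefs.shape)
```

Truncating this path after five steps corresponds to the n_nonzero_coefs=5 setting used in the example.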
This example demonstrates how to quickly set up and use a Lars model for regression tasks, showcasing the simplicity and effectiveness of this algorithm in scikit-learn.
The model can be fit directly on the training data without the need for scaling or normalization. Once fit, the model can be used to make predictions on new data, enabling its use in real-world regression problems.
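In practice, rather than fixing n_nonzero_coefs by hand, the stopping point along the LARS path can be chosen by cross-validation with scikit-learn's LarsCV estimator. A minimal sketch on the same data:

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LarsCV

# same synthetic dataset and split as the example above
X, y = make_regression(n_samples=100, n_features=10, noise=0.1, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=1)

# LarsCV selects the stopping point along the path via cross-validation
model = LarsCV(cv=5).fit(X_train, y_train)
print('test R^2: %.3f' % model.score(X_test, y_test))
```

This removes the need to guess the number of non-zero coefficients up front, at the cost of fitting the path once per cross-validation fold.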