
Configure ElasticNet "warm_start" Parameter

The warm_start parameter in ElasticNet controls whether the solution of a previous fit is reused as the initialization for the next call to fit.

ElasticNet is a linear regression model that combines L1 and L2 regularization. It is particularly useful when there are multiple correlated features.
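
For reference, the penalty that scikit-learn's ElasticNet adds to the least-squares loss mixes the two norms through the alpha and l1_ratio parameters. A minimal sketch of that combination, with the helper name elastic_net_penalty used purely for illustration:

import numpy as np

def elastic_net_penalty(w, alpha=1.0, l1_ratio=0.5):
    # Regularization term as documented for scikit-learn's ElasticNet:
    #   alpha * l1_ratio * ||w||_1 + 0.5 * alpha * (1 - l1_ratio) * ||w||_2^2
    l1 = np.sum(np.abs(w))
    l2 = np.sum(w ** 2)
    return alpha * l1_ratio * l1 + 0.5 * alpha * (1 - l1_ratio) * l2

print(elastic_net_penalty(np.array([0.5, -2.0, 0.0])))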

warm_start can help in scenarios where the model is being retrained on similar data, potentially speeding up convergence.

The default value for warm_start is False.

Since the parameter is boolean, the choice is simply True when the estimator will be refit iteratively and False for a single, one-off fit.

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import ElasticNet
from sklearn.metrics import mean_squared_error
import numpy as np

# Generate synthetic dataset
X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)

# Split into initial train set and additional batch
X_train, X_new, y_train, y_new = train_test_split(X, y, test_size=0.2, random_state=42)

# Train with warm_start=False
en = ElasticNet(warm_start=False, random_state=42)
en.fit(X_train, y_train)
y_pred_false = en.predict(X_new)
score_false = mean_squared_error(y_new, y_pred_false)
print(f"MSE with warm_start=False: {score_false:.3f}")

# Train with warm_start=True
X_combined = np.concatenate((X_train, X_new))
y_combined = np.concatenate((y_train, y_new))

en.set_params(warm_start=True)
en.fit(X_combined, y_combined)
y_pred_true = en.predict(X_new)
score_true = mean_squared_error(y_new, y_pred_true)
print(f"MSE with warm_start=True: {score_true:.3f}")

Running the example gives an output like:

MSE with warm_start=False: 4638.839
MSE with warm_start=True: 4514.241

The key steps in this example are:

  1. Generate a synthetic regression dataset with added noise
  2. Split the data into an initial training set and an additional batch of new data
  3. Fit an ElasticNet model on the initial training set with warm_start=False
  4. Set warm_start to True and refit the same model on the combined data
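
The convergence benefit mentioned earlier shows up in the solver's iteration count rather than in the error metric. A minimal sketch, assuming the same kind of synthetic data, that compares n_iter_ for a fit from scratch on the combined data against a warm-started refit (whether the refit needs fewer iterations depends on how similar the two batches are):

from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet
import numpy as np

X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)
X_old, X_extra = X[:800], X[800:]
y_old, y_extra = y[:800], y[800:]
X_all = np.concatenate((X_old, X_extra))
y_all = np.concatenate((y_old, y_extra))

# Cold start: fit the combined data from scratch
cold = ElasticNet()
cold.fit(X_all, y_all)
print(f"Iterations without warm start: {cold.n_iter_}")

# Warm start: fit the first batch, then refit on the combined data
# starting from the coefficients of the first fit
warm = ElasticNet(warm_start=True)
warm.fit(X_old, y_old)
warm.fit(X_all, y_all)
print(f"Iterations on the warm-started refit: {warm.n_iter_}")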

Some tips and heuristics for setting warm_start:

  * Set warm_start=True when the same estimator will be refit repeatedly on similar or growing data, or across a sequence of regularization strengths, so that each fit starts from the previous coefficients (a sketch of this pattern follows below).
  * Leave warm_start=False (the default) for one-off fits and inside cross-validation or grid search, where each candidate should be fit from scratch.

Issues to consider:

  * warm_start only changes the solver's starting point; given enough iterations the final solution is the same, so the benefit is faster convergence rather than better accuracy.
  * warm_start is not incremental learning: each call to fit still uses only the data passed to it, so previously seen samples must be included again (as in the example above).
  * If the new data differs substantially from the data used for the previous fit, the old coefficients can be a poor starting point and may not reduce the number of iterations.

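One pattern where warm_start tends to pay off is sweeping a sequence of regularization strengths with the same estimator, as noted in the tips above. A minimal sketch of that pattern (the alpha grid is arbitrary, and max_iter is raised to avoid convergence warnings at small alpha):

from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet

X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=42)

# Reuse one estimator across an alpha path; with warm_start=True each fit
# starts from the coefficients found for the previous alpha
en = ElasticNet(warm_start=True, max_iter=10000)
for alpha in [1.0, 0.5, 0.1, 0.01]:
    en.set_params(alpha=alpha)
    en.fit(X, y)
    print(f"alpha={alpha}: n_iter_={en.n_iter_}, "
          f"non-zero coefficients={(en.coef_ != 0).sum()}")
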
See Also