
Scikit-Learn Configure GridSearchCV "refit" Parameter

The ‘refit’ parameter in scikit-learn’s GridSearchCV determines whether the best model found during the grid search is refitted on the entire dataset.

This parameter can be set to True, False, the name of a scoring metric, or a callable. Using ‘refit’ effectively ensures that the final model is optimized according to your priorities, whether based on a single metric or multiple criteria.

Grid search is a method for exhaustively searching over a specified set of parameter values to find the best combination. It trains and evaluates the model for each combination of parameters, using the specified scoring metric to compare performance.
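
As a quick illustration, scikit-learn’s ParameterGrid enumerates the combinations a grid search will visit; for the 3x3 grid used below, that is 9 candidates (and with 5-fold cross-validation, 45 model fits):

from sklearn.model_selection import ParameterGrid

# the same grid used in the example below
param_grid = {'C': [0.1, 1, 10], 'gamma': [1, 0.1, 0.01]}
print(len(list(ParameterGrid(param_grid))))  # 9 candidate combinations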

The ‘refit’ parameter can be set to:

  * True (the default): the estimator with the best found parameters is refitted on the whole dataset and exposed as best_estimator_.
  * False: no final model is refitted; the search results remain available (cv_results_, and for single-metric searches best_params_), but best_estimator_ does not exist.
  * A string naming a scoring metric: required when multiple metrics are evaluated, to indicate which metric selects the parameters used for refitting.
  * A callable: a custom selection rule, shown at the end of this post.

from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

# create a synthetic dataset
X, y = make_classification(n_samples=1000, n_classes=2, random_state=42)

# define the parameter grid
param_grid = {'C': [0.1, 1, 10], 'gamma': [1, 0.1, 0.01]}

# define and perform a grid search with refit=True
grid_refit_true = GridSearchCV(estimator=SVC(), param_grid=param_grid, refit=True, cv=5)
grid_refit_true.fit(X, y)

# define and perform a grid search with refit=False
grid_refit_false = GridSearchCV(estimator=SVC(), param_grid=param_grid, refit=False, cv=5)
grid_refit_false.fit(X, y)

# define and perform a grid search with refit set to the 'f1' scoring metric
# (with multiple scoring metrics, refit must name the one used to select the best model)
grid_refit_f1 = GridSearchCV(estimator=SVC(), param_grid=param_grid, refit='f1', scoring=['accuracy', 'f1'], cv=5)
grid_refit_f1.fit(X, y)

# report the best parameters and models
print("Best parameters with refit=True:")
print(grid_refit_true.best_params_)
print("Best model with refit=True:")
print(grid_refit_true.best_estimator_)

print("Best parameters with refit=False (Note: No refitting):")
print(grid_refit_false.best_params_)

print("Best parameters with refit='f1':")
print(grid_refit_f1.best_params_)
print("Best model with refit='f1':")
print(grid_refit_f1.best_estimator_)

Running the example gives an output like:

Best parameters with refit=True:
{'C': 1, 'gamma': 0.01}
Best model with refit=True:
SVC(C=1, gamma=0.01)
Best parameters with refit=False (Note: No refitting):
{'C': 1, 'gamma': 0.01}
Best parameters with refit='f1':
{'C': 1, 'gamma': 0.01}
Best model with refit='f1':
SVC(C=1, gamma=0.01)
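
Because refit=True (and refit='f1') produce a fitted best_estimator_, the search object itself can be used for prediction, delegating to that refitted model. A minimal sketch continuing the example above:

# the search object delegates predict to the refitted best_estimator_
print(grid_refit_true.predict(X[:5]))

# with refit=False there is no refitted model, so this would raise AttributeError:
# grid_refit_false.predict(X[:5])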

The key steps in this example are:

  1. Generate a synthetic binary classification dataset using make_classification.
  2. Define a parameter grid for SVC with C and gamma values to search over.
  3. Create a GridSearchCV object with refit=True and fit it; the best model is automatically refitted on the entire dataset.
  4. Create a GridSearchCV object with refit=False and fit it, noting that no final model is refitted (a manual-refit sketch follows this list).
  5. Create a GridSearchCV object with scoring=['accuracy', 'f1'] and refit='f1', so the best model is selected and refitted according to the ‘f1’ metric.
  6. Print out the best parameters and models found for each grid search configuration, illustrating how the ‘refit’ parameter affects the final model selection.
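
When refit=False, no final model exists, but for a single-metric search the selected parameters are still recorded in best_params_. A minimal sketch of refitting manually, continuing the example above:

# refit=False leaves no best_estimator_, so build and fit the model yourself
best_model = SVC(**grid_refit_false.best_params_)
best_model.fit(X, y)
print(best_model)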

This demonstrates the flexibility of the ‘refit’ parameter in GridSearchCV, allowing you to control whether and how the best model is refitted based on your specific needs and priorities.
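
The ‘refit’ parameter also accepts a callable that receives cv_results_ and returns the integer index of the candidate to refit, which allows selection rules beyond a single metric. Below is a sketch using a hypothetical “simplest model within tolerance” rule; the 0.01 tolerance and the preference for small C are assumptions for illustration, not scikit-learn defaults:

import numpy as np

def simplest_within_tolerance(cv_results):
    # hypothetical rule: among candidates within 0.01 of the best mean test
    # score, refit the one with the smallest C (the least complex model)
    means = np.asarray(cv_results['mean_test_score'])
    candidates = np.flatnonzero(means >= means.max() - 0.01)
    cs = [cv_results['params'][i]['C'] for i in candidates]
    return int(candidates[int(np.argmin(cs))])

grid_refit_callable = GridSearchCV(estimator=SVC(), param_grid=param_grid, refit=simplest_within_tolerance, cv=5)
grid_refit_callable.fit(X, y)
print(grid_refit_callable.best_params_)

Note that best_score_ is not available when ‘refit’ is a callable, since scikit-learn cannot know which metric the custom rule optimized.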


