
Configure BaggingRegressor "verbose" Parameter

The verbose parameter in scikit-learn’s BaggingRegressor controls the level of output during model training.

BaggingRegressor is an ensemble method that combines predictions from multiple base estimators trained on random subsets of the original dataset. This technique helps reduce overfitting and improve model generalization.
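
To make this concrete, here is a minimal sketch of how such an ensemble might be constructed, using DecisionTreeRegressor (the default base estimator) and an illustrative max_samples value; note that the estimator argument is named base_estimator in scikit-learn releases before 1.2:

from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

# 10 decision trees, each fit on a bootstrap sample drawn from 80% of the training rows
model = BaggingRegressor(
    estimator=DecisionTreeRegressor(),
    n_estimators=10,
    max_samples=0.8,
    bootstrap=True,
    random_state=42,
)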

The verbose parameter determines how much information is displayed during the training process. Higher values provide more detailed output, which can be useful for monitoring progress and debugging.

By default, verbose is set to 0, which means no output is produced during fitting. Common values are 0 (silent), 1 (progress and timing messages from the joblib backend), and greater than 1 (an additional message as each estimator is built).

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_squared_error

# Generate synthetic dataset
X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)

# Split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train with different verbose levels
verbose_levels = [0, 1, 2]
mse_scores = []

for level in verbose_levels:
    print(f"\nVerbose level: {level}")
    br = BaggingRegressor(n_estimators=10, verbose=level, random_state=42)
    br.fit(X_train, y_train)
    y_pred = br.predict(X_test)
    mse = mean_squared_error(y_test, y_pred)
    mse_scores.append(mse)
    print(f"Mean Squared Error: {mse:.4f}")

# Compare MSE scores
for level, mse in zip(verbose_levels, mse_scores):
    print(f"Verbose {level}: MSE = {mse:.4f}")

Running the example gives an output like:


Verbose level: 0
Mean Squared Error: 7486.4813

Verbose level: 1
Mean Squared Error: 7486.4813

Verbose level: 2
Building estimator 1 of 10 for this parallel run (total 10)...
Building estimator 2 of 10 for this parallel run (total 10)...
Building estimator 3 of 10 for this parallel run (total 10)...
Building estimator 4 of 10 for this parallel run (total 10)...
Building estimator 5 of 10 for this parallel run (total 10)...
Building estimator 6 of 10 for this parallel run (total 10)...
Building estimator 7 of 10 for this parallel run (total 10)...
Building estimator 8 of 10 for this parallel run (total 10)...
Building estimator 9 of 10 for this parallel run (total 10)...
Building estimator 10 of 10 for this parallel run (total 10)...
Mean Squared Error: 7486.4813
Verbose 0: MSE = 7486.4813
Verbose 1: MSE = 7486.4813
Verbose 2: MSE = 7486.4813

The key steps in this example are:

  1. Generate a synthetic regression dataset
  2. Split the data into train and test sets
  3. Train BaggingRegressor models with different verbose levels
  4. Evaluate the mean squared error of each model on the test set
  5. Compare the output and performance for each verbosity level (a quick check is sketched below)
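
The identical MSE scores suggest that verbose only affects logging, not the fitted model itself. A quick, self-contained check of this, reusing the same synthetic data setup as above:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit the same ensemble twice, changing only the verbose setting
quiet = BaggingRegressor(n_estimators=10, verbose=0, random_state=42).fit(X_train, y_train)
chatty = BaggingRegressor(n_estimators=10, verbose=2, random_state=42).fit(X_train, y_train)

# With the same random_state, the predictions should be identical
print(np.allclose(quiet.predict(X_test), chatty.predict(X_test)))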

Some tips for setting verbose:

  - Keep verbose=0 (the default) in production code and automated pipelines, where extra output is just noise.
  - Set verbose=1 when fitting large ensembles to confirm that training is progressing.
  - Use values greater than 1 when debugging, to see a message as each individual estimator is built.

Issues to consider:

  - verbose only controls logging; as the identical MSE scores above show, it has no effect on the fitted model or its predictions.
  - The per-estimator "Building estimator..." messages are printed to stdout, while progress messages from the joblib backend may go to stderr, so what you see depends on which stream your environment captures.
  - With a large n_estimators, high verbosity can flood the console and clutter logs; the same setting may also produce output again at prediction time.
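
Because verbose is passed to the joblib backend that fits the estimators in parallel, it is most informative when combined with n_jobs. A minimal sketch, with illustrative n_estimators and n_jobs values (the exact progress messages depend on your joblib version and may be written to stderr rather than stdout):

from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor

X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)

# n_jobs=2 fits estimators in two parallel workers; verbose=1 asks the
# joblib backend to report progress and timing for those workers
br = BaggingRegressor(n_estimators=50, n_jobs=2, verbose=1, random_state=42)
br.fit(X, y)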



See Also