
Configure GradientBoostingClassifier "verbose" Parameter

The verbose parameter in scikit-learn’s GradientBoostingClassifier controls how much progress information is printed to standard output during model training.

Gradient Boosting is an ensemble learning method that sequentially adds weak learners (decision trees) to minimize the loss function. The verbose parameter determines how much of this iterative progress (iteration number, training loss, and estimated remaining time) is reported while the model is being fit.

The default value for verbose is 0, which means no output is generated during training. Setting verbose to 1 prints a progress line for every iteration at first and then at a decreasing frequency (every tenth iteration in the output below), while values greater than 1 print a progress line for every boosting iteration.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score

# Generate synthetic dataset
X, y = make_classification(n_samples=1000, n_classes=3, n_informative=5,
                           n_redundant=0, random_state=42)

# Split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train with different verbose values
verbose_values = [0, 1, 2]
accuracies = []

for v in verbose_values:
    gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                    verbose=v, random_state=42)
    gb.fit(X_train, y_train)
    y_pred = gb.predict(X_test)
    accuracy = accuracy_score(y_test, y_pred)
    accuracies.append(accuracy)
    print(f"verbose={v}, Accuracy: {accuracy:.3f}")

Running the example gives an output like the one below. Note that each training log is printed while fit runs, so it appears before the corresponding accuracy line: the short log after the verbose=0 accuracy comes from the verbose=1 run, and the long log after the verbose=1 accuracy comes from the verbose=2 run.

verbose=0, Accuracy: 0.785
      Iter       Train Loss   Remaining Time
         1           1.0081            1.57s
         2           0.9388            1.56s
         3           0.8753            1.66s
         4           0.8244            1.63s
         5           0.7805            1.59s
         6           0.7426            1.56s
         7           0.7088            1.53s
         8           0.6785            1.51s
         9           0.6485            1.49s
        10           0.6239            1.47s
        20           0.4586            1.29s
        30           0.3653            1.13s
        40           0.3046            0.96s
        50           0.2680            0.80s
        60           0.2337            0.64s
        70           0.2080            0.48s
        80           0.1864            0.32s
        90           0.1663            0.16s
       100           0.1490            0.00s
verbose=1, Accuracy: 0.785
      Iter       Train Loss   Remaining Time
         1           1.0081            1.56s
         2           0.9388            1.55s
         3           0.8753            1.54s
         4           0.8244            1.55s
         5           0.7805            1.53s
         6           0.7426            1.52s
         7           0.7088            1.50s
         8           0.6785            1.48s
         9           0.6485            1.46s
        10           0.6239            1.44s
        11           0.6017            1.42s
        12           0.5818            1.40s
        13           0.5618            1.38s
        14           0.5429            1.37s
        15           0.5251            1.35s
        16           0.5095            1.34s
        17           0.4938            1.32s
        18           0.4801            1.30s
        19           0.4679            1.29s
        20           0.4586            1.27s
        21           0.4482            1.26s
        22           0.4378            1.24s
        23           0.4279            1.22s
        24           0.4180            1.21s
        25           0.4079            1.19s
        26           0.3998            1.18s
        27           0.3897            1.16s
        28           0.3814            1.15s
        29           0.3732            1.13s
        30           0.3653            1.12s
        31           0.3587            1.10s
        32           0.3524            1.08s
        33           0.3464            1.07s
        34           0.3390            1.05s
        35           0.3341            1.04s
        36           0.3272            1.02s
        37           0.3218            1.01s
        38           0.3158            0.99s
        39           0.3103            0.97s
        40           0.3046            0.96s
        41           0.3011            0.94s
        42           0.2960            0.93s
        43           0.2921            0.91s
        44           0.2887            0.89s
        45           0.2851            0.88s
        46           0.2821            0.86s
        47           0.2782            0.85s
        48           0.2748            0.83s
        49           0.2714            0.81s
        50           0.2680            0.80s
        51           0.2641            0.78s
        52           0.2600            0.77s
        53           0.2551            0.75s
        54           0.2523            0.73s
        55           0.2481            0.72s
        56           0.2456            0.70s
        57           0.2427            0.69s
        58           0.2405            0.67s
        59           0.2365            0.66s
        60           0.2337            0.64s
        61           0.2317            0.62s
        62           0.2292            0.61s
        63           0.2260            0.59s
        64           0.2237            0.58s
        65           0.2206            0.56s
        66           0.2187            0.54s
        67           0.2160            0.53s
        68           0.2137            0.51s
        69           0.2114            0.50s
        70           0.2080            0.48s
        71           0.2056            0.46s
        72           0.2032            0.45s
        73           0.2016            0.43s
        74           0.1998            0.42s
        75           0.1974            0.40s
        76           0.1955            0.38s
        77           0.1928            0.37s
        78           0.1908            0.35s
        79           0.1885            0.34s
        80           0.1864            0.32s
        81           0.1847            0.30s
        82           0.1828            0.29s
        83           0.1810            0.27s
        84           0.1785            0.26s
        85           0.1756            0.24s
        86           0.1739            0.22s
        87           0.1715            0.21s
        88           0.1704            0.19s
        89           0.1690            0.18s
        90           0.1663            0.16s
        91           0.1643            0.14s
        92           0.1620            0.13s
        93           0.1612            0.11s
        94           0.1593            0.10s
        95           0.1582            0.08s
        96           0.1559            0.06s
        97           0.1551            0.05s
        98           0.1530            0.03s
        99           0.1514            0.02s
       100           0.1490            0.00s
verbose=2, Accuracy: 0.785

The key steps in this example are:

  1. Generate a synthetic multi-class classification dataset
  2. Split the data into train and test sets
  3. Train GradientBoostingClassifier models with different verbose values
  4. Evaluate the accuracy of each model on the test set
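
The training log above has three columns: the iteration number, the training loss, and the estimated remaining time. A related detail: when subsample is set below 1.0 (stochastic gradient boosting), the verbose log also reports an out-of-bag improvement value for each tree. The snippet below is a minimal sketch of that combination; the exact column layout can vary between scikit-learn versions, and the small n_estimators value is chosen only to keep the output short.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Same kind of synthetic dataset as above
X, y = make_classification(n_samples=1000, n_classes=3, n_informative=5,
                           n_redundant=0, random_state=42)

# subsample < 1.0 enables stochastic gradient boosting; with verbose > 0 the
# per-iteration log then also includes an out-of-bag improvement column
gb = GradientBoostingClassifier(n_estimators=20, learning_rate=0.1,
                                subsample=0.8, verbose=2, random_state=42)
gb.fit(X, y)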

Some tips for setting verbose:

  1. Use verbose=1 during development or hyperparameter tuning to confirm that training is progressing and to gauge how long it will take.
  2. Use verbose=2 when you want the training loss at every boosting iteration, for example to check whether the loss is still improving late in training.
  3. Keep the default verbose=0 in production pipelines and automated scripts, where the extra output would only clutter the logs.

Issues to consider:

  1. verbose only changes what is printed; it has no effect on the fitted model, as the identical accuracies above show.
  2. With a large n_estimators, verbose=2 can produce thousands of lines of output, which clutters notebooks and log files.
  3. The messages are written to standard output, so redirect or capture stdout if you need to store them (see the sketch after this list).
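
Because the verbose messages are emitted on standard output, they can be captured instead of displayed. The snippet below is a minimal sketch that collects the training log into a string using redirect_stdout from the standard library; the buffer name and the small dataset are arbitrary choices for illustration.

import io
from contextlib import redirect_stdout

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=200, random_state=42)

gb = GradientBoostingClassifier(n_estimators=50, verbose=2, random_state=42)

# The verbose messages are printed to stdout, so redirecting stdout
# captures them instead of showing them on screen
buffer = io.StringIO()
with redirect_stdout(buffer):
    gb.fit(X, y)

training_log = buffer.getvalue()
print(f"Captured {len(training_log.splitlines())} lines of training output")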



See Also