The `verbose` parameter in scikit-learn's `RandomForestClassifier` controls the verbosity of the training process. Verbosity refers to the amount of information printed to the console during model fitting: higher values of `verbose` result in more detailed output, while lower values produce less or no output.
The `verbose` parameter accepts integer values. A value of 0 (the default) disables verbosity, so nothing is printed during training. A value of 1 or higher enables verbosity, with larger numbers producing more detailed output. Common values for `verbose` are 0 for no output, 1 for coarse progress updates from the underlying joblib backend, and 2 or higher for per-tree progress messages that are useful when debugging.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
# Generate synthetic dataset
X, y = make_classification(n_samples=10000, n_features=20, n_informative=10,
                           n_redundant=5, random_state=42)
# Split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Train with different verbose values
verbose_values = [0, 1, 2]
accuracies = []
for v in verbose_values:
    rf = RandomForestClassifier(n_estimators=100, verbose=v, random_state=42)
    rf.fit(X_train, y_train)
    y_pred = rf.predict(X_test)
    accuracy = accuracy_score(y_test, y_pred)
    accuracies.append(accuracy)
    print(f"verbose={v}, Accuracy: {accuracy:.3f}")
Running the example gives an output like:
verbose=0, Accuracy: 0.931
[Parallel(n_jobs=1)]: Done 49 tasks | elapsed: 1.2s
[Parallel(n_jobs=1)]: Done 49 tasks | elapsed: 0.0s
verbose=1, Accuracy: 0.931
building tree 1 of 100
building tree 2 of 100
building tree 3 of 100
building tree 4 of 100
building tree 5 of 100
...
building tree 100 of 100
[Parallel(n_jobs=1)]: Done 40 tasks | elapsed: 0.0s
verbose=2, Accuracy: 0.931
The key steps in this example are:
- Generate a synthetic classification dataset with informative and redundant features
- Split the data into train and test sets
- Train `RandomForestClassifier` models with different `verbose` values
- Evaluate the accuracy of each model on the test set
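To make "more detailed output" concrete, below is a minimal sketch (not part of the example above, and using a smaller stand-in dataset) that captures everything printed while fitting at each `verbose` level and counts the lines. Both stdout and stderr are redirected because, depending on the scikit-learn and joblib versions, progress messages may go to either stream, so exact counts will vary.
import io
from contextlib import redirect_stdout, redirect_stderr
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Small stand-in dataset, just to generate some training output
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

for v in [0, 1, 2]:
    out, err = io.StringIO(), io.StringIO()
    rf = RandomForestClassifier(n_estimators=50, verbose=v, random_state=42)
    # Redirect both streams: progress messages may be written to either one
    with redirect_stdout(out), redirect_stderr(err):
        rf.fit(X, y)
    n_lines = len(out.getvalue().splitlines()) + len(err.getvalue().splitlines())
    print(f"verbose={v}: captured {n_lines} lines of training output")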
Tips and heuristics for setting `verbose`:
- Use `verbose=0` (the default) for clean output and in production environments (see the sketch after this list)
- Set `verbose=1` to monitor progress on large datasets or complex models
- Use `verbose=2` or higher when debugging to get detailed information
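As a small illustration of the first tip, the sketch below assumes an estimator was configured with a noisy `verbose` setting elsewhere in a codebase; since `verbose` is an ordinary constructor parameter, scikit-learn's `set_params` can silence it before the next fit without rebuilding the estimator. The dataset here is a stand-in, not the one from the example above.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# An estimator that would normally print per-tree progress while fitting
rf = RandomForestClassifier(n_estimators=50, verbose=2, random_state=42)

# Silence it for production use; verbose can be changed like any other hyperparameter
rf.set_params(verbose=0)
rf.fit(X, y)  # fits quietly
print(rf.get_params()["verbose"])  # 0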
Issues to consider:
- Higher verbosity can slightly increase runtime due to the overhead of printing (the sketch after this list gives a rough way to measure this)
- Excessive verbosity can clutter the output and make it harder to spot important information
- In notebooks or IDEs, high verbosity can fill the console quickly, especially with many trees or when n_jobs is greater than 1
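The sketch below is a rough way to check the first and last points on your own machine: it times fitting with and without per-tree output while using all cores. Timings and the exact interleaving of worker messages will vary by machine and library version, so treat it as an illustration rather than a benchmark.
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)

for v in [0, 2]:
    rf = RandomForestClassifier(n_estimators=100, verbose=v,
                                n_jobs=-1, random_state=42)
    start = time.perf_counter()
    rf.fit(X, y)  # with verbose=2 and n_jobs=-1, messages from workers can interleave
    elapsed = time.perf_counter() - start
    print(f"verbose={v}: fit took {elapsed:.2f}s")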