
Get RandomForestClassifier "oob_score_" Attribute

The RandomForestClassifier in scikit-learn is an ensemble learning algorithm that builds a forest of decision trees, each trained on a bootstrap sample of the original training data, and aggregates their predictions to make the final classification.
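
To make the aggregation step concrete, the short sketch below (dataset sizes and parameter values are arbitrary) averages the class-probability estimates of the individual trees exposed via estimators_ and checks that the result agrees with the forest's own predictions:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Small synthetic dataset, just for illustration
X, y = make_classification(n_samples=200, n_features=8, n_informative=4,
                           n_classes=3, random_state=0)

rf = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

# Average the class-probability estimates of the individual trees...
avg_proba = np.mean([tree.predict_proba(X) for tree in rf.estimators_], axis=0)
manual_pred = rf.classes_[np.argmax(avg_proba, axis=1)]

# ...which should agree with the forest's own predictions
print(np.array_equal(manual_pred, rf.predict(X)))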

The oob_score_ attribute of a fitted RandomForestClassifier holds the Out-of-Bag (OOB) score, a measure of the model's performance on samples that were not included in the bootstrap samples used to train each tree. During training, each tree in the forest is fitted on a bootstrap sample that leaves out roughly one-third of the original samples. Each sample is then predicted using only the trees that did not see it during training, and the OOB score is the accuracy of these aggregated out-of-bag predictions against the true labels.
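
To illustrate the idea, the snippet below is a conceptual sketch (not scikit-learn's internal implementation): it builds a small forest by hand, collects predictions for each sample only from trees whose bootstrap sample left that sample out, and scores the aggregated predictions.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, n_informative=5,
                           n_classes=3, random_state=42)
n_samples, n_trees, n_classes = len(X), 50, 3
rng = np.random.RandomState(42)

# Accumulate class votes for each sample from trees that did NOT train on it
votes = np.zeros((n_samples, n_classes))
for _ in range(n_trees):
    boot = rng.randint(0, n_samples, n_samples)   # bootstrap sample (with replacement)
    oob = ~np.isin(np.arange(n_samples), boot)    # roughly one-third is left out
    tree = DecisionTreeClassifier(random_state=0).fit(X[boot], y[boot])
    proba = tree.predict_proba(X[oob])
    # Align probability columns with the tree's class order (labels here are 0, 1, 2)
    votes[np.where(oob)[0][:, None], tree.classes_] += proba

# OOB score: accuracy of the aggregated out-of-bag predictions
covered = votes.sum(axis=1) > 0                   # samples left out by at least one tree
oob_pred = votes[covered].argmax(axis=1)
print("Manual OOB estimate:", round((oob_pred == y[covered]).mean(), 3))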

Accessing the oob_score_ attribute can be useful for assessing the generalization ability of the Random Forest model without the need for a separate validation set. It provides an unbiased estimate of the model’s performance on unseen data, as the OOB samples were not used in the training of the trees. This information can be valuable for model selection, hyperparameter tuning, and understanding the model’s overall performance.
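
For example, the OOB score can be used to compare candidate hyperparameter settings without holding out a validation set. The sketch below (the max_features values are arbitrary examples) fits one forest per setting and reports each OOB score:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=10, n_informative=5,
                           n_redundant=2, n_classes=3, random_state=42)

# Compare candidate settings by their OOB scores -- no validation split needed
for max_features in ["sqrt", "log2", None]:
    rf = RandomForestClassifier(n_estimators=100, max_features=max_features,
                                oob_score=True, random_state=42)
    rf.fit(X, y)
    print(f"max_features={max_features}: OOB score = {rf.oob_score_:.3f}")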

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Generate a synthetic classification dataset
X, y = make_classification(n_samples=1000, n_features=10, n_informative=5, n_redundant=2,
                           n_classes=3, random_state=42)

# Initialize a RandomForestClassifier with OOB score calculation enabled
rf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=42)

# Fit the classifier on the entire dataset
rf.fit(X, y)

# Retrieve the OOB score from the fitted classifier
oob_score = rf.oob_score_
print(f"OOB Score: {oob_score:.3f}")

Running the example gives an output like:

OOB Score: 0.853

The key steps in this example are:

  1. Generate a synthetic classification dataset using make_classification with specified parameters.
  2. Initialize a RandomForestClassifier with oob_score=True to enable the calculation of the OOB score.
  3. Fit the classifier on the entire dataset, as OOB evaluation does not require a separate validation set (see the cross-validation comparison after this list).
  4. Access the oob_score_ attribute from the fitted classifier and print its value.
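
As a quick check on step 3, the OOB score can be compared against a conventional cross-validation estimate on the same data; the two figures are typically close. A minimal sketch (5-fold cross-validation chosen arbitrarily):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=10, n_informative=5,
                           n_redundant=2, n_classes=3, random_state=42)

rf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=42)
rf.fit(X, y)

# 5-fold cross-validation accuracy on the same data, for comparison
cv_score = cross_val_score(rf, X, y, cv=5).mean()
print(f"OOB score: {rf.oob_score_:.3f}, 5-fold CV accuracy: {cv_score:.3f}")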


See Also