
Configure SGDClassifier "power_t" Parameter

The power_t parameter in scikit-learn’s SGDClassifier controls the learning rate decay during training iterations.

SGDClassifier uses stochastic gradient descent for optimization, updating the model’s parameters after each sample. The power_t parameter determines how quickly the learning rate decreases over time.

Higher values of power_t cause the learning rate to decay more rapidly, while lower values result in slower decay. This can significantly impact the model’s convergence and final performance.

The default value for power_t is 0.5, and common values range from 0 to 1. Note that power_t is only used when learning_rate='invscaling', where the step size at update t is eta0 / t**power_t; SGDClassifier's default schedule is learning_rate='optimal', which ignores power_t entirely.
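The inverse scaling schedule can be sketched in plain Python to show how power_t controls the decay speed (eta0=0.1 here is an arbitrary initial rate, not a recommended value):

```python
# Inverse scaling schedule: eta = eta0 / t**power_t
eta0 = 0.1

def eta(t, power_t):
    """Learning rate after t updates under the invscaling schedule."""
    return eta0 / t ** power_t

for power_t in (0.1, 0.5, 0.9):
    # learning rate after 1, 100, and 10000 updates
    rates = [eta(t, power_t) for t in (1, 100, 10000)]
    print(f"power_t={power_t}: " + ", ".join(f"{r:.5f}" for r in rates))
```

With power_t=0.5 the rate falls from 0.1 to 0.01 after 100 updates, while power_t=0.9 drives it far lower and power_t=0.1 leaves it nearly unchanged.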

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score

# Generate synthetic dataset
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_redundant=5, n_classes=2, random_state=42)

# Split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train with different power_t values
power_t_values = [0.1, 0.5, 0.9]
accuracies = []

for power_t in power_t_values:
    sgd = SGDClassifier(power_t=power_t, random_state=42, max_iter=1000)
    sgd.fit(X_train, y_train)
    y_pred = sgd.predict(X_test)
    accuracy = accuracy_score(y_test, y_pred)
    accuracies.append(accuracy)
    print(f"power_t={power_t}, Accuracy: {accuracy:.3f}")

Running the example gives an output like:

power_t=0.1, Accuracy: 0.770
power_t=0.5, Accuracy: 0.770
power_t=0.9, Accuracy: 0.770

All three runs report the same accuracy because power_t has no effect under SGDClassifier's default learning_rate='optimal' schedule; it only changes the updates when learning_rate='invscaling'.

The key steps in this example are:

  1. Generate a synthetic binary classification dataset
  2. Split the data into train and test sets
  3. Train SGDClassifier models with different power_t values
  4. Evaluate the accuracy of each model on the test set
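
To make power_t actually influence training, learning_rate must be set to 'invscaling'. A sketch of the same comparison with that schedule (eta0=0.1 is an assumed initial rate, and exact accuracies will differ run to run):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Same synthetic dataset as above
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_redundant=5, n_classes=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

for power_t in [0.1, 0.5, 0.9]:
    # With learning_rate='invscaling', the step size is eta0 / t**power_t,
    # so power_t now changes the optimization trajectory
    sgd = SGDClassifier(learning_rate='invscaling', eta0=0.1,
                        power_t=power_t, random_state=42, max_iter=1000)
    sgd.fit(X_train, y_train)
    acc = accuracy_score(y_test, sgd.predict(X_test))
    print(f"power_t={power_t}, Accuracy: {acc:.3f}")
```
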

Some tips and heuristics for setting power_t:

  1. Set learning_rate='invscaling' when tuning power_t; the parameter is ignored by the other schedules.
  2. Start with the default of 0.5 and move toward 1 if the loss oscillates late in training, or toward 0 if the model stops improving too early.
  3. Keep values in the range (0, 1], and tune power_t jointly with eta0, since a larger initial rate can tolerate faster decay.

Issues to consider:

  1. Values near 0 keep the learning rate almost constant, so the updates may never settle on a solution.
  2. Values near 1 shrink the steps quickly, which can freeze the model before it has fit the data.


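One practical way to pick power_t is a small grid search under the invscaling schedule. A sketch (the grid values and eta0=0.1 are assumptions, not recommendations):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic dataset for the search
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_redundant=5, n_classes=2, random_state=42)

# Candidate decay exponents to compare
param_grid = {'power_t': [0.1, 0.25, 0.5, 0.75, 0.9]}
sgd = SGDClassifier(learning_rate='invscaling', eta0=0.1,
                    max_iter=1000, random_state=42)

# 5-fold cross-validated search over power_t
grid = GridSearchCV(sgd, param_grid, cv=5, scoring='accuracy')
grid.fit(X, y)
print(f"Best power_t: {grid.best_params_['power_t']}, "
      f"CV accuracy: {grid.best_score_:.3f}")
```

The same search can be extended to tune eta0 alongside power_t, since the two parameters interact.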