
DG: Add early stopping support to LightningModel #178

Merged
avantikalal merged 1 commit into main from early-stopping
Mar 3, 2026
Conversation

@dagarfield
Collaborator

Description

train_params accepts early_stopping, patience, monitor, and mode keys, but they were never wired up to anything. This PR connects them to a pytorch_lightning.callbacks.EarlyStopping callback in train_on_dataset.

Changes in src/grelu/lightning/__init__.py:

  • Import EarlyStopping from pytorch_lightning.callbacks
  • Add early_stopping, patience, monitor, and mode to default_train_params (disabled by default)
  • Conditionally append an EarlyStopping callback in train_on_dataset when early_stopping is True (see the sketch below)
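
A minimal sketch of that wiring, assuming train_params is the merged parameter dict available inside train_on_dataset (the callbacks list and dict access are illustrative; EarlyStopping and its monitor/patience/mode arguments are the real pytorch_lightning API):

from pytorch_lightning.callbacks import EarlyStopping

# Inside train_on_dataset: append the callback only when requested.
callbacks = []
if train_params.get("early_stopping", False):
    callbacks.append(
        EarlyStopping(
            monitor=train_params["monitor"],    # metric to watch, e.g. "val_pearson"
            patience=train_params["patience"],  # epochs without improvement allowed
            mode=train_params["mode"],          # "min" or "max"
        )
    )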

Related Issue(s)

N/A

How to test

Pass early stopping params in train_params and verify training stops when the monitored metric plateaus:

train_params = {
    "task": "regression",
    "loss": "mse",
    "early_stopping": True,    # enable the EarlyStopping callback
    "patience": 3,             # epochs without improvement before stopping
    "monitor": "val_pearson",  # logged metric to watch
    "mode": "max",             # higher val_pearson is better
    ...                        # other required train_params elided
}
model = grelu.lightning.LightningModel(model_params=model_params, train_params=train_params)
trainer = model.train_on_dataset(train_dataset, val_dataset)
# Training should stop early if val_pearson doesn't improve for 3 epochs

Add early_stopping, patience, monitor, and mode to default_train_params
and wire up an EarlyStopping callback in train_on_dataset when enabled.

Co-authored-by: Ona <no-reply@ona.com>
@dagarfield dagarfield marked this pull request as ready for review February 26, 2026 18:46
@dagarfield
Collaborator Author

On this same front, any thoughts about adding lr_scheduler or similar?
Someone may have just had a bad time with a dilated conv model...
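
For reference, one hypothetical shape this could take (not part of this PR) using Lightning's standard configure_optimizers contract; the train_params["lr"] lookup and the monitored metric are assumptions for illustration:

import torch

def configure_optimizers(self):
    # Hypothetical: read the learning rate from train_params, as elsewhere in the class.
    optimizer = torch.optim.Adam(self.parameters(), lr=self.train_params["lr"])
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="max", patience=2
    )
    # Lightning accepts a dict return; "monitor" tells ReduceLROnPlateau which
    # logged metric to track, and could reuse the early-stopping metric.
    return {
        "optimizer": optimizer,
        "lr_scheduler": {"scheduler": scheduler, "monitor": "val_pearson"},
    }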

@avantikalal avantikalal merged commit fe8246d into main Mar 3, 2026
1 of 2 checks passed
