🩹 Add early stopping to csflow (#817)
Author: ashwinvaidya17 · Dec 28, 2022
Parent: 8a4e46c · Commit: c82ddcd
Showing 2 changed files with 18 additions and 0 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -36,6 +36,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

### Fixed

- Add early stopping to CS-Flow model (<https://github.com/openvinotoolkit/anomalib/pull/817>)
- Fix remote container by removing version pinning in Docker files (<https://github.com/openvinotoolkit/anomalib/pull/797>)
- Fix PatchCore performance deterioration by reverting changes to Average Pooling layer (<https://github.com/openvinotoolkit/anomalib/pull/791>)
- Fix zero seed (<https://github.com/openvinotoolkit/anomalib/pull/766>)
17 changes: 17 additions & 0 deletions anomalib/models/csflow/lightning_model.py
@@ -11,6 +11,7 @@

import torch
from omegaconf import DictConfig, ListConfig
from pytorch_lightning.callbacks import EarlyStopping
from pytorch_lightning.utilities.cli import MODEL_REGISTRY
from torch import Tensor

@@ -103,6 +104,22 @@ def __init__(self, hparams: Union[DictConfig, ListConfig]):
self.hparams: Union[DictConfig, ListConfig] # type: ignore
self.save_hyperparameters(hparams)

    def configure_callbacks(self):
        """Configure model-specific callbacks.

        Note:
            This method is used for the existing CLI.
            When the PL CLI is introduced, the ``configure_callbacks`` method
            will be deprecated, and callbacks will be configured either from
            the config.yaml file or from the CLI.
        """
        early_stopping = EarlyStopping(
            monitor=self.hparams.model.early_stopping.metric,
            patience=self.hparams.model.early_stopping.patience,
            mode=self.hparams.model.early_stopping.mode,
        )
        return [early_stopping]

def configure_optimizers(self) -> torch.optim.Optimizer:
"""Configures optimizers.
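For intuition, the patience behavior that the `EarlyStopping` callback configured above relies on can be sketched in plain Python. This is a simplified model of the logic, not PyTorch Lightning's actual implementation:

```python
def early_stop(history, patience, mode="min"):
    """Return True once the monitored metric fails to improve for `patience` checks.

    history: sequence of metric values, one per validation check.
    mode: "min" means lower is better, "max" means higher is better.
    """
    better = (lambda a, b: a < b) if mode == "min" else (lambda a, b: a > b)
    best = None
    wait = 0
    for value in history:
        if best is None or better(value, best):
            # Metric improved: record the new best and reset the wait counter.
            best, wait = value, 0
        else:
            # No improvement: stop once patience is exhausted.
            wait += 1
            if wait >= patience:
                return True
    return False
```

With `mode="max"` (as a CS-Flow config might use for an AUROC-style metric), training stops once the monitored value has failed to improve for `patience` consecutive validation checks.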
