
[DO NOT MERGE] - Release v2.0.0 #2465

Open · wants to merge 42 commits into base: main
Changes shown below are from 1 commit.

Commits (42):
660be42
Dataclasses and post-processing refactor (#2098)
djdameln Sep 2, 2024
2e6e18a
Merge main and resolve conflicts (#2287)
samet-akcay Sep 2, 2024
ec5a877
Rename Item to DatasetItem (#2289)
samet-akcay Sep 2, 2024
1500db6
πŸ“š Add docstrings to dataclasses (#2292)
samet-akcay Sep 6, 2024
627be88
Refactor and restructure anomalib.data (#2302)
samet-akcay Sep 11, 2024
8543e24
Restructure unit tests and fix ruff issues (#2306)
samet-akcay Sep 18, 2024
99b4e9d
Add dataclass validators (#2307)
samet-akcay Oct 2, 2024
06daad9
πŸš€ Customisable Image Visualizer (#2334)
samet-akcay Oct 9, 2024
95115f9
Merge main the feature branch. (#2376)
samet-akcay Oct 16, 2024
6e1d870
Metrics redesign (#2326)
djdameln Nov 7, 2024
dddf707
πŸš€ Add `PreProcessor` to `AnomalyModule` (#2358)
samet-akcay Nov 8, 2024
1471974
Update v2 with the recent changes on main (#2421)
djdameln Nov 15, 2024
c16f51e
Rename `AnomalyModule` to `AnomalibModule` (#2423)
samet-akcay Nov 22, 2024
2f3d616
πŸ”¨ Replace `imgaug` with Native PyTorch Transforms (#2436)
samet-akcay Nov 27, 2024
00b01b1
[V2]: Remove task type (#2450)
djdameln Dec 5, 2024
c73e411
πŸ—‘οΈ Remove RKDE (#2455)
ashwinvaidya17 Dec 6, 2024
8bd06a9
Multi-GPU fixes (#2435)
ashwinvaidya17 Dec 10, 2024
c235ca1
πŸ”¨ v2 - Refactor: Add missing auxiliary attributes to `AnomalibModule`…
samet-akcay Dec 11, 2024
7116fec
πŸ“šUpdate `CHANGELOG.md` file for v2.0.0 release (#2463)
samet-akcay Dec 11, 2024
244f50b
πŸš€ Create a new CI Pipeline (#2461)
samet-akcay Dec 11, 2024
324e181
Resolve conflicts
samet-akcay Dec 11, 2024
31ad999
Exclude tiled ensemble for now
samet-akcay Dec 11, 2024
4635158
Install the required pytest plugins
samet-akcay Dec 11, 2024
dda0421
Disable parallel execution for now
samet-akcay Dec 11, 2024
d4c8f82
Fix the status of the unit tests
samet-akcay Dec 12, 2024
b8455b4
Automatically set the device to run the unit/integration tests
samet-akcay Dec 12, 2024
40d27e9
Enhance the device management for unit/integration tests
samet-akcay Dec 12, 2024
a960a18
Add cpu and gpu markers to tests to be able to explicitly choose them
samet-akcay Dec 12, 2024
6e98863
Handle duration, and fix failed tests status
samet-akcay Dec 12, 2024
2d44435
Add gpu marker to the tests that require gpu device
samet-akcay Dec 12, 2024
1a7976f
Enhance the error message on tests
samet-akcay Dec 12, 2024
4e0ec91
pass pytest native marker to trigger cpu and gpu tests
samet-akcay Dec 12, 2024
1b62c34
Remove gpu marker
samet-akcay Dec 12, 2024
80007fe
Create a new caching pipeline
samet-akcay Dec 12, 2024
4b9f6fb
Add enable-cache as an input argument
samet-akcay Dec 12, 2024
f4646cd
Add enable-cache as an input argument
samet-akcay Dec 12, 2024
f245ace
pass cache none for self-hosted runner
samet-akcay Dec 12, 2024
461f6d7
fix the warning messages
samet-akcay Dec 12, 2024
9887c94
Map semgrep severity level keys
samet-akcay Dec 12, 2024
7251739
Add coverage args
samet-akcay Dec 12, 2024
58453ef
πŸ“š Update Documentation and Docstrings (#2468)
samet-akcay Dec 19, 2024
a532cb5
Version bump to `v2.0.0-beta.1` (#2472)
samet-akcay Dec 20, 2024
Rename AnomalyModule to AnomalibModule (#2423)
* Rename AnomalyModule to AnomalibModule

Signed-off-by: Samet Akcay <samet.akcay@intel.com>

* Ignore AnomalyModule from the tests

Signed-off-by: Samet Akcay <samet.akcay@intel.com>

---------

Signed-off-by: Samet Akcay <samet.akcay@intel.com>
samet-akcay authored Nov 22, 2024
commit c16f51ed5b118b753e7b3dd95e6ba45e3b2388a8
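The rename is a breaking change for any downstream code that imports or subclasses `AnomalyModule`. The diff below does not show a compatibility alias, but during a transition one could be kept with a pattern like this (a hypothetical sketch only; `AnomalibModule` here is a local stand-in class, not the real `anomalib.models` import):

```python
import warnings


class AnomalibModule:
    """Stand-in for anomalib's renamed base class (the real one lives in anomalib.models)."""


class AnomalyModule(AnomalibModule):
    """Deprecated alias: warns when subclassed, nudging callers toward AnomalibModule."""

    def __init_subclass__(cls, **kwargs) -> None:
        warnings.warn(
            "AnomalyModule is deprecated; subclass AnomalibModule instead",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init_subclass__(**kwargs)
```

Because the alias subclasses the new name, `isinstance` checks against `AnomalibModule` keep working for legacy models while the warning surfaces at class-definition time.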
2 changes: 1 addition & 1 deletion docs/source/markdown/guides/developer/sdd.md
@@ -201,7 +201,7 @@ and depth data.

Anomalib provides a collection of anomaly models within the image and video
domains. The models are implemented sub-classing PyTorch Lightning's
-`LightningModule` class, which is called `AnomalyModule`, which provides a set
+`LightningModule` class, which is called `AnomalibModule`, which provides a set
of APIs for defining the model architecture, loss function, and optimization
algorithm. The models are designed to be modular and extensible, allowing users
to easily modify the model architecture and training workflow based on their
4 changes: 2 additions & 2 deletions src/anomalib/callbacks/model_loader.py
@@ -8,7 +8,7 @@
import torch
from lightning.pytorch import Callback, Trainer

-from anomalib.models.components import AnomalyModule
+from anomalib.models.components import AnomalibModule

logger = logging.getLogger(__name__)

@@ -27,7 +27,7 @@ class LoadModelCallback(Callback):
def __init__(self, weights_path: str) -> None:
self.weights_path = weights_path

-def setup(self, trainer: Trainer, pl_module: AnomalyModule, stage: str | None = None) -> None:
+def setup(self, trainer: Trainer, pl_module: AnomalibModule, stage: str | None = None) -> None:
"""Call when inference begins.

Loads the model weights from ``weights_path`` into the PyTorch module.
4 changes: 2 additions & 2 deletions src/anomalib/callbacks/tiler_configuration.py
@@ -9,7 +9,7 @@
from lightning.pytorch.callbacks import Callback

from anomalib.data.utils.tiler import ImageUpscaleMode, Tiler
-from anomalib.models.components import AnomalyModule
+from anomalib.models.components import AnomalibModule

__all__ = ["TilerConfigurationCallback"]

@@ -61,7 +61,7 @@ def setup(self, trainer: pl.Trainer, pl_module: pl.LightningModule, stage: str |
del trainer, stage # These variables are not used.

if self.enable:
-if isinstance(pl_module, AnomalyModule) and hasattr(pl_module.model, "tiler"):
+if isinstance(pl_module, AnomalibModule) and hasattr(pl_module.model, "tiler"):
pl_module.model.tiler = Tiler(
tile_size=self.tile_size,
stride=self.stride,
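Again only the `isinstance` target changes. The guarded assignment the callback performs can be sketched in isolation (minimal stand-ins for `AnomalibModule` and `Tiler`; the real classes live in `anomalib.models.components` and `anomalib.data.utils.tiler`):

```python
class Tiler:
    """Minimal stand-in for anomalib's Tiler: records the tiling geometry."""

    def __init__(self, tile_size, stride=None):
        self.tile_size = tile_size
        self.stride = stride


class AnomalibModule:
    """Minimal stand-in: the real class wraps an inner torch model with an optional tiler slot."""

    def __init__(self):
        self.model = type("Model", (), {"tiler": None})()


def configure_tiler(pl_module, enable: bool, tile_size, stride) -> None:
    # Mirrors the guard in TilerConfigurationCallback.setup: only attach a tiler
    # when enabled, the module is an AnomalibModule, and its inner model has a
    # `tiler` attribute to receive it.
    if enable and isinstance(pl_module, AnomalibModule) and hasattr(pl_module.model, "tiler"):
        pl_module.model.tiler = Tiler(tile_size=tile_size, stride=stride)
```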
14 changes: 7 additions & 7 deletions src/anomalib/callbacks/visualizer.py
@@ -16,7 +16,7 @@
from anomalib.data.utils.image import save_image, show_image
from anomalib.loggers import AnomalibWandbLogger
from anomalib.loggers.base import ImageLoggerBase
-from anomalib.models import AnomalyModule
+from anomalib.models import AnomalibModule
from anomalib.utils.visualization import (
BaseVisualizer,
GeneratorResult,
@@ -77,7 +77,7 @@ def __init__(
def on_test_batch_end(
self,
trainer: Trainer,
-pl_module: AnomalyModule,
+pl_module: AnomalibModule,
outputs: STEP_OUTPUT | None,
batch: Any, # noqa: ANN401
batch_idx: int,
@@ -114,7 +114,7 @@ def on_test_batch_end(
if self.log:
self._add_to_logger(result, pl_module, trainer)

-def on_test_end(self, trainer: Trainer, pl_module: AnomalyModule) -> None:
+def on_test_end(self, trainer: Trainer, pl_module: AnomalibModule) -> None:
for generator in self.generators:
if generator.visualize_on == VisualizationStep.STAGE_END:
for result in generator(trainer=trainer, pl_module=pl_module):
@@ -135,28 +135,28 @@ def on_test_end(self, trainer: Trainer, pl_module: AnomalyModule) -> None:
def on_predict_batch_end(
self,
trainer: Trainer,
-pl_module: AnomalyModule,
+pl_module: AnomalibModule,
outputs: STEP_OUTPUT | None,
batch: Any, # noqa: ANN401
batch_idx: int,
dataloader_idx: int = 0,
) -> None:
return self.on_test_batch_end(trainer, pl_module, outputs, batch, batch_idx, dataloader_idx)

-def on_predict_end(self, trainer: Trainer, pl_module: AnomalyModule) -> None:
+def on_predict_end(self, trainer: Trainer, pl_module: AnomalibModule) -> None:
return self.on_test_end(trainer, pl_module)

@staticmethod
def _add_to_logger(
result: GeneratorResult,
-module: AnomalyModule,
+module: AnomalibModule,
trainer: Trainer,
) -> None:
"""Add image to logger.

Args:
result (GeneratorResult): Output from the generators.
-module (AnomalyModule): LightningModule from which the global step is extracted.
+module (AnomalibModule): LightningModule from which the global step is extracted.
trainer (Trainer): Trainer object.
"""
# Store names of logger and the logger in a dict
10 changes: 5 additions & 5 deletions src/anomalib/cli/cli.py
@@ -30,7 +30,7 @@

from anomalib.data import AnomalibDataModule
from anomalib.engine import Engine
-from anomalib.models import AnomalyModule
+from anomalib.models import AnomalibModule
from anomalib.utils.config import update_config

except ImportError:
@@ -166,7 +166,7 @@ def add_trainer_arguments(self, parser: ArgumentParser, subcommand: str) -> None
self._add_default_arguments_to_parser(parser)
self._add_trainer_arguments_to_parser(parser, add_optimizer=True, add_scheduler=True)
parser.add_subclass_arguments(
-AnomalyModule,
+AnomalibModule,
"model",
fail_untyped=False,
required=True,
@@ -186,7 +186,7 @@ def add_train_arguments(self, parser: ArgumentParser) -> None:
self._add_default_arguments_to_parser(parser)
self._add_trainer_arguments_to_parser(parser, add_optimizer=True, add_scheduler=True)
parser.add_subclass_arguments(
-AnomalyModule,
+AnomalibModule,
"model",
fail_untyped=False,
required=True,
@@ -205,7 +205,7 @@ def add_predict_arguments(self, parser: ArgumentParser) -> None:
self._add_default_arguments_to_parser(parser)
self._add_trainer_arguments_to_parser(parser)
parser.add_subclass_arguments(
-AnomalyModule,
+AnomalibModule,
"model",
fail_untyped=False,
required=True,
@@ -228,7 +228,7 @@ def add_export_arguments(self, parser: ArgumentParser) -> None:
self._add_default_arguments_to_parser(parser)
self._add_trainer_arguments_to_parser(parser)
parser.add_subclass_arguments(
-AnomalyModule,
+AnomalibModule,
"model",
fail_untyped=False,
required=True,
56 changes: 28 additions & 28 deletions src/anomalib/engine/engine.py
@@ -20,7 +20,7 @@
from anomalib.callbacks.timer import TimerCallback
from anomalib.data import AnomalibDataModule, AnomalibDataset, PredictDataset
from anomalib.deploy import CompressionType, ExportType
-from anomalib.models import AnomalyModule
+from anomalib.models import AnomalibModule
from anomalib.utils.path import create_versioned_dir
from anomalib.visualization import ImageVisualizer

@@ -64,11 +64,11 @@ class _TrainerArgumentsCache:
def __init__(self, **kwargs) -> None:
self._cached_args = {**kwargs}

-def update(self, model: AnomalyModule) -> None:
+def update(self, model: AnomalibModule) -> None:
"""Replace cached arguments with arguments retrieved from the model.

Args:
-model (AnomalyModule): The model used for training
+model (AnomalibModule): The model used for training
"""
for key, value in model.trainer_arguments.items():
if key in self._cached_args and self._cached_args[key] != value:
@@ -77,7 +77,7 @@ def update(self, model: AnomalyModule) -> None:
)
self._cached_args[key] = value

-def requires_update(self, model: AnomalyModule) -> bool:
+def requires_update(self, model: AnomalibModule) -> bool:
return any(self._cached_args.get(key, None) != value for key, value in model.trainer_arguments.items())

@property
@@ -152,14 +152,14 @@ def trainer(self) -> Trainer:
return self._trainer

@property
-def model(self) -> AnomalyModule:
+def model(self) -> AnomalibModule:
"""Property to get the model.

Raises:
UnassignedError: When the model is not assigned yet.

Returns:
-AnomalyModule: Anomaly model.
+AnomalibModule: Anomaly model.
"""
if not self.trainer.lightning_module:
msg = "Trainer does not have a model assigned yet."
@@ -190,7 +190,7 @@ def best_model_path(self) -> str | None:

def _setup_workspace(
self,
-model: AnomalyModule,
+model: AnomalibModule,
train_dataloaders: TRAIN_DATALOADERS | None = None,
val_dataloaders: EVAL_DATALOADERS | None = None,
test_dataloaders: EVAL_DATALOADERS | None = None,
@@ -205,7 +205,7 @@ def _setup_workspace(
other artifacts will be saved in this directory.

Args:
-model (AnomalyModule): Input model.
+model (AnomalibModule): Input model.
train_dataloaders (TRAIN_DATALOADERS | None, optional): Train dataloaders.
Defaults to ``None``.
val_dataloaders (EVAL_DATALOADERS | None, optional): Validation dataloaders.
@@ -255,7 +255,7 @@ def _setup_workspace(
root_dir = Path(self._cache.args["default_root_dir"]) / model.name / dataset_name / category
self._cache.args["default_root_dir"] = create_versioned_dir(root_dir) if versioned_dir else root_dir / "latest"

-def _setup_trainer(self, model: AnomalyModule) -> None:
+def _setup_trainer(self, model: AnomalibModule) -> None:
"""Instantiate the trainer based on the model parameters."""
# Check if the cache requires an update
if self._cache.requires_update(model):
@@ -291,7 +291,7 @@ def _setup_dataset_task(
)
data.task = self.task

-def _setup_anomalib_callbacks(self, model: AnomalyModule) -> None:
+def _setup_anomalib_callbacks(self, model: AnomalibModule) -> None:
"""Set up callbacks for the trainer."""
_callbacks: list[Callback] = []

@@ -325,7 +325,7 @@ def _setup_anomalib_callbacks(self, model: AnomalyModule) -> None:

@staticmethod
def _should_run_validation(
-model: AnomalyModule,
+model: AnomalibModule,
ckpt_path: str | Path | None,
) -> bool:
"""Check if we need to run validation to collect normalization statistics and thresholds.
@@ -341,7 +341,7 @@ def _should_run_validation(
are available. If neither is available, we can't run validation.

Args:
-model (AnomalyModule): Model passed to the entrypoint.
+model (AnomalibModule): Model passed to the entrypoint.
dataloaders (EVAL_DATALOADERS | None): Dataloaders passed to the entrypoint.
datamodule (AnomalibDataModule | None): Lightning datamodule passed to the entrypoint.
ckpt_path (str | Path | None): Checkpoint path passed to the entrypoint.
@@ -357,7 +357,7 @@ def _should_run_validation(

def fit(
self,
-model: AnomalyModule,
+model: AnomalibModule,
train_dataloaders: TRAIN_DATALOADERS | None = None,
val_dataloaders: EVAL_DATALOADERS | None = None,
datamodule: AnomalibDataModule | None = None,
@@ -366,7 +366,7 @@ def fit(
"""Fit the model using the trainer.

Args:
-model (AnomalyModule): Model to be trained.
+model (AnomalibModule): Model to be trained.
train_dataloaders (TRAIN_DATALOADERS | None, optional): Train dataloaders.
Defaults to None.
val_dataloaders (EVAL_DATALOADERS | None, optional): Validation dataloaders.
@@ -411,7 +411,7 @@ def fit(

def validate(
self,
-model: AnomalyModule | None = None,
+model: AnomalibModule | None = None,
dataloaders: EVAL_DATALOADERS | None = None,
ckpt_path: str | Path | None = None,
verbose: bool = True,
@@ -420,7 +420,7 @@ def validate(
"""Validate the model using the trainer.

Args:
-model (AnomalyModule | None, optional): Model to be validated.
+model (AnomalibModule | None, optional): Model to be validated.
Defaults to None.
dataloaders (EVAL_DATALOADERS | None, optional): Dataloaders to be used for
validation.
@@ -460,7 +460,7 @@ def validate(

def test(
self,
-model: AnomalyModule | None = None,
+model: AnomalibModule | None = None,
dataloaders: EVAL_DATALOADERS | None = None,
ckpt_path: str | Path | None = None,
verbose: bool = True,
@@ -472,7 +472,7 @@ def test(
finally tests the model.

Args:
-model (AnomalyModule | None, optional):
+model (AnomalibModule | None, optional):
The model to be tested.
Defaults to None.
dataloaders (EVAL_DATALOADERS | None, optional):
@@ -545,7 +545,7 @@ def test(
if model:
self._setup_trainer(model)
elif not self.model:
-msg = "`Engine.test()` requires an `AnomalyModule` when it hasn't been passed in a previous run."
+msg = "`Engine.test()` requires an `AnomalibModule` when it hasn't been passed in a previous run."
raise RuntimeError(msg)

self._setup_dataset_task(dataloaders)
@@ -556,7 +556,7 @@ def test(

def predict(
self,
-model: AnomalyModule | None = None,
+model: AnomalibModule | None = None,
dataloaders: EVAL_DATALOADERS | None = None,
datamodule: AnomalibDataModule | None = None,
dataset: Dataset | PredictDataset | None = None,
@@ -570,7 +570,7 @@ def predict(
validation dataloader is available. Finally, predicts using the model.

Args:
-model (AnomalyModule | None, optional):
+model (AnomalibModule | None, optional):
Model to be used for prediction.
Defaults to None.
dataloaders (EVAL_DATALOADERS | None, optional):
@@ -623,7 +623,7 @@ def predict(
```
"""
if not (model or self.model):
-msg = "`Engine.predict()` requires an `AnomalyModule` when it hasn't been passed in a previous run."
+msg = "`Engine.predict()` requires an `AnomalibModule` when it hasn't been passed in a previous run."
raise ValueError(msg)

if ckpt_path:
@@ -668,7 +668,7 @@ def predict(

def train(
self,
-model: AnomalyModule,
+model: AnomalibModule,
train_dataloaders: TRAIN_DATALOADERS | None = None,
val_dataloaders: EVAL_DATALOADERS | None = None,
test_dataloaders: EVAL_DATALOADERS | None = None,
@@ -678,7 +678,7 @@ def train(
"""Fits the model and then calls test on it.

Args:
-model (AnomalyModule): Model to be trained.
+model (AnomalibModule): Model to be trained.
train_dataloaders (TRAIN_DATALOADERS | None, optional): Train dataloaders.
Defaults to None.
val_dataloaders (EVAL_DATALOADERS | None, optional): Validation dataloaders.
@@ -731,7 +731,7 @@ def train(

def export(
self,
-model: AnomalyModule,
+model: AnomalibModule,
export_type: ExportType | str,
export_root: str | Path | None = None,
input_size: tuple[int, int] | None = None,
@@ -744,7 +744,7 @@ def export(
r"""Export the model in PyTorch, ONNX or OpenVINO format.

Args:
-model (AnomalyModule): Trained model.
+model (AnomalibModule): Trained model.
export_type (ExportType): Export type.
export_root (str | Path | None, optional): Path to the output directory. If it is not set, the model is
exported to trainer.default_root_dir. Defaults to None.
@@ -832,15 +832,15 @@ def from_config(
cls: type["Engine"],
config_path: str | Path,
**kwargs,
-) -> tuple["Engine", AnomalyModule, AnomalibDataModule]:
+) -> tuple["Engine", AnomalibModule, AnomalibDataModule]:
"""Create an Engine instance from a configuration file.

Args:
config_path (str | Path): Path to the full configuration file.
**kwargs (dict): Additional keyword arguments.

Returns:
-tuple[Engine, AnomalyModule, AnomalibDataModule]: Engine instance.
+tuple[Engine, AnomalibModule, AnomalibDataModule]: Engine instance.

Example:
The following example shows training with full configuration file:
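The `_TrainerArgumentsCache` hunks above change only the type annotations; the caching logic itself is small enough to restate as a runnable sketch (duck-typed: any object exposing a `trainer_arguments` dict works here, whereas the real class expects an `AnomalibModule`):

```python
import logging

logger = logging.getLogger(__name__)


class TrainerArgumentsCache:
    """Sketch of engine's _TrainerArgumentsCache: holds Trainer kwargs and lets
    a model's trainer_arguments override them, warning on each override."""

    def __init__(self, **kwargs) -> None:
        self._cached_args = {**kwargs}

    def update(self, model) -> None:
        # Replace cached arguments with those retrieved from the model.
        for key, value in model.trainer_arguments.items():
            if key in self._cached_args and self._cached_args[key] != value:
                logger.info(
                    "Overriding %s from %s with %s for %s",
                    key, self._cached_args[key], value, type(model).__name__,
                )
            self._cached_args[key] = value

    def requires_update(self, model) -> bool:
        # True if any model-provided argument differs from the cached value.
        return any(
            self._cached_args.get(key, None) != value
            for key, value in model.trainer_arguments.items()
        )
```

The engine uses `requires_update` to decide whether the `Trainer` must be re-instantiated before `fit`/`test`/`predict`, so stale kwargs from a previous model never leak into the next run.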