
Allow passing metrics objects directly to create_metrics_collection #2212

Open · wants to merge 3 commits into main
Conversation

ashwinvaidya17
Collaborator

📝 Description

To test:

from torchmetrics.classification import Accuracy, Precision, Recall

from anomalib.data import MVTec
from anomalib.engine import Engine
from anomalib.models import Padim

if __name__ == "__main__":
    model = Padim()
    data = MVTec()
    engine = Engine(image_metrics=[Accuracy(task="binary"), Precision(task="binary"), Recall(task="binary")])
    engine.train(model, datamodule=data)

✨ Changes

Select what type of change your PR is:

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • 🔨 Refactor (non-breaking change which refactors the code base)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • 📚 Documentation update
  • 🔒 Security update

✅ Checklist

Before you submit your pull request, please make sure you have completed the following steps:

  • 📋 I have summarized my changes in the CHANGELOG and followed the guidelines for my type of change (skip for minor changes, documentation updates, and test enhancements).
  • 📚 I have made the necessary updates to the documentation (if applicable).
  • 🧪 I have written tests that support my changes and prove that my fix is effective or my feature works (if applicable).

For more information about code review checklists, see the Code Review Checklist.

Signed-off-by: Ashwin Vaidya <ashwinnitinvaidya@gmail.com>

codecov bot commented Aug 21, 2024

Codecov Report

Attention: Patch coverage is 87.50000% with 1 line in your changes missing coverage. Please review.

Project coverage is 80.79%. Comparing base (cfd3d8e) to head (4ce9d62).

Files Patch % Lines
src/anomalib/metrics/__init__.py 87.50% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2212      +/-   ##
==========================================
- Coverage   80.80%   80.79%   -0.01%     
==========================================
  Files         248      248              
  Lines       10859    10864       +5     
==========================================
+ Hits         8775     8778       +3     
- Misses       2084     2086       +2     

☔ View full report in Codecov by Sentry.

Comment on lines +189 to +192
if not (
all(isinstance(metric, str) for metric in metrics) or all(isinstance(metric, Metric) for metric in metrics)
):
msg = f"All metrics must be either string or Metric objects, found {metrics}"
Contributor
Would this mean that a user cannot pass the following:

from torchmetrics.classification import Accuracy, Precision, Recall

from anomalib.data import MVTec
from anomalib.engine import Engine
from anomalib.models import Padim

if __name__ == "__main__":
    model = Padim()
    data = MVTec()
    engine = Engine(image_metrics=["F1Score", Accuracy(task="binary")])
    engine.train(model, datamodule=data)
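
For what it's worth, a dependency-free sketch of the quoted check (with `Metric` stubbed locally so no torchmetrics import is needed; the stub names are illustrative only) suggests that mixed list would indeed be rejected:

```python
# Stand-ins for torchmetrics types; purely illustrative.
class Metric:
    """Stub for torchmetrics.Metric."""

class BinaryAccuracy(Metric):
    """Stub for an instantiated Accuracy(task="binary")."""

def validate_metrics(metrics: list) -> None:
    """Mirror of the check above: metrics must be all str or all Metric."""
    if not (
        all(isinstance(m, str) for m in metrics)
        or all(isinstance(m, Metric) for m in metrics)
    ):
        msg = f"All metrics must be either string or Metric objects, found {metrics}"
        raise TypeError(msg)

validate_metrics(["F1Score", "AUROC"])  # all strings: accepted
validate_metrics([BinaryAccuracy()])    # all Metric objects: accepted
try:
    validate_metrics(["F1Score", BinaryAccuracy()])  # mixed: rejected
except TypeError as err:
    print("rejected:", err)
```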

Contributor

Also, would it be an idea to have an additional check like:

import types

def instantiate_if_needed(metric, task="binary"):
    if isinstance(metric, (types.FunctionType, type)):
        # metric is a function or an uninstantiated class
        return metric(task=task)
    # metric is already instantiated
    return metric

Or do you think this is overkill?
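
For reference, the helper suggested above can be exercised without torchmetrics by using a stand-in class (`FakeAccuracy` here is hypothetical, standing in for an uninstantiated torchmetrics class such as `Accuracy`):

```python
import types

def instantiate_if_needed(metric, task="binary"):
    # Call uninstantiated classes/functions with the task; pass instances through.
    if isinstance(metric, (types.FunctionType, type)):
        return metric(task=task)
    return metric

class FakeAccuracy:
    """Stand-in for torchmetrics.classification.Accuracy."""
    def __init__(self, task="binary"):
        self.task = task

made = instantiate_if_needed(FakeAccuracy)    # class -> new instance
kept = instantiate_if_needed(FakeAccuracy())  # instance -> returned unchanged
print(type(made).__name__, made.task)
```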
