
Bug Fixes for PTQ and ACQ based OpenVINO Model Export and Added Test Cases #2594

Open

srikesh-07 wants to merge 39 commits into main
Conversation

srikesh-07

@srikesh-07 srikesh-07 commented Mar 8, 2025

📝 Description

✅ Fixes and Refactors


✅ Addressing #2593: Metric Instantiation Issue in Model Export

Modifications to Address the Issue

1. Added metric as a subclass argument in AnomalibCLI

  • Previously, the metric parameter was passed directly as a str in the CLI, causing failures when trying to instantiate it in Model.export.
  • To solve this, the metric parameter is now added as a subclass argument in the AnomalibCLI.
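The difference can be illustrated with a small, hypothetical sketch (the classes below are stand-ins, not anomalib's actual implementation): a subclass argument lets the CLI resolve a concrete metric class by name, instead of carrying a raw string into Model.export and failing there.

```python
# Hypothetical sketch: resolve a metric class from a CLI string, which is
# roughly what registering `metric` as a subclass argument provides.

class AnomalibMetric:  # stand-in for anomalib.metrics.AnomalibMetric
    def __init__(self, fields):
        self.fields = fields

class F1Score(AnomalibMetric):
    pass

def resolve_metric(name: str) -> type:
    """Map a CLI string such as 'F1Score' to the metric class itself."""
    registry = {cls.__name__: cls for cls in AnomalibMetric.__subclasses__()}
    try:
        return registry[name]
    except KeyError as err:
        raise ValueError(f"Unknown metric: {name}") from err
```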

2. Migrated metric from torchmetrics.Metric to anomalib.metrics.AnomalibMetric

  • This PR also migrates the metric's base class from torchmetrics.Metric to anomalib.metrics.AnomalibMetric for seamless integration with the Anomalib ecosystem.

3. Deferred Metric Instantiation in AnomalibCLI

  • Since AnomalibMetric requires a mandatory fields parameter, metric instantiation is deferred out of AnomalibCLI.
  • The metric is instead instantiated within the Model.export function using the helper functions described below.

4. Added Helper Functions for Metric Instantiation

  • Introduced helper functions for AnomalibMetric, similar to those for AnomalibModel, that allow metric instantiation from Model.export via:

    • String (str):
      --metric F1Score
    • Dictionary (dict) input in Python:
      {
          "classpath": "F1Score",
          "init_args": {
              "fields": ["pred_scores", "gt_label"]
          }
      }
    • Namespace (Namespace) passed in the pipeline via CLI.
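A helper covering all three input forms could be shaped as follows. Everything here is an illustrative stand-in: the real helper lives in anomalib and differs in detail, and the stand-in classes exist only to make the sketch runnable.

```python
from types import SimpleNamespace

class AnomalibMetric:  # stand-in for anomalib.metrics.AnomalibMetric
    def __init__(self, fields):
        self.fields = fields

class F1Score(AnomalibMetric):
    pass

_METRICS = {"F1Score": F1Score}

def get_metric(spec):
    """Instantiate a metric from a str, dict, or Namespace spec.

    A bare string gets the [""] placeholder for `fields`, to be
    replaced later based on the model type.
    """
    if isinstance(spec, str):
        return _METRICS[spec](fields=[""])
    if isinstance(spec, dict):
        return _METRICS[spec["classpath"]](**spec.get("init_args", {}))
    if isinstance(spec, SimpleNamespace):  # CLI pipelines pass a Namespace
        return get_metric(vars(spec))
    msg = f"Unsupported metric spec type: {type(spec)}"
    raise TypeError(msg)
```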

5. Allowed Dynamic Field Configuration via CLI

  • Users can now pass custom fields via CLI to specify which fields should be used for metric calculation:
    --metric.fields "['pred_scores', 'gt_label']"

6. Handled Missing Fields Gracefully

  • In cases where users do not pass fields explicitly, the metric will use a placeholder:
    fields = [""]
  • The fields will later be modified depending on the model type (e.g., segmentation or classification).
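That deferred field assignment might look like the sketch below; the Metric class and the task-to-fields mapping are assumptions chosen for illustration, not the repository's actual defaults.

```python
from dataclasses import dataclass

@dataclass
class Metric:  # minimal stand-in for an AnomalibMetric
    fields: list

# Assumed field names for illustration only; the real defaults live in
# the export code and depend on the model type.
_TASK_FIELDS = {
    "segmentation": ["anomaly_map", "gt_mask"],
    "classification": ["pred_score", "gt_label"],
}

def finalize_fields(metric, task: str):
    """Replace the [""] placeholder with task-appropriate fields."""
    if metric.fields == [""]:  # user never supplied --metric.fields
        metric.fields = _TASK_FIELDS[task]
    return metric
```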

7. Modified ACQ Validation Function

  • The validation function used for ACQ has been modified to support AnomalibMetric.

✅ Test Coverage Enhancements

  • Added unit test cases for the helper functions that instantiate AnomalibMetric.
  • Added integration test cases for OpenVINO export covering all compression types from:
    from anomalib.export.deploy import CompressionType
  • The compression types tested include:
    | Compression Type | Tested |
    |------------------|--------|
    | FP16             | ✅ Yes |
    | INT8             | ✅ Yes |
    | INT8_PTQ         | ✅ Yes |
    | INT8_ACQ         | ✅ Yes |
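A parametrized test over the four modes can be sketched with a local stand-in enum (the real CompressionType lives in anomalib; the requires-metric rule here mirrors the "Metric must be provided for OpenVINO INT8_ACQ compression" check discussed in this PR):

```python
from enum import Enum

class CompressionType(str, Enum):  # local stand-in mirroring the four modes
    FP16 = "fp16"
    INT8 = "int8"
    INT8_PTQ = "int8_ptq"
    INT8_ACQ = "int8_acq"

def export_requires_metric(compression: CompressionType) -> bool:
    """Only accuracy-control quantization needs a metric at export time."""
    return compression is CompressionType.INT8_ACQ

# A pytest-style parametrization would iterate all four members:
ALL_COMPRESSION_TYPES = list(CompressionType)
```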

✅ PR Summary

  • Unblocks Model.export functionality by allowing automatic metric instantiation in OpenVINO export.
  • Improves user experience by allowing users to pass metric fields via CLI or Python without manual intervention.
  • Fixes bugs related to ImageBatch access by switching from subscript access (batch['image']) to attribute-style access (batch.image).
  • Ensures seamless compatibility with AnomalibMetric.
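The ImageBatch fix comes down to this distinction, sketched here with a dataclass stand-in rather than the real anomalib class:

```python
from dataclasses import dataclass

@dataclass
class ImageBatch:  # minimal stand-in; dataclass instances are not subscriptable
    image: str
    pred_score: float = 0.0

batch = ImageBatch(image="tensor")
# batch["image"] raises TypeError: 'ImageBatch' object is not subscriptable
value = batch.image  # attribute-style access is the supported path
```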

🛠️ Fixes

The CHANGELOG.md will be updated once the changes have been approved by the respective authors of the repository.

✨ Changes

Select what type of change your PR is:

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • 🔨 Refactor (non-breaking change which refactors the code base)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • 📚 Documentation update
  • 🔒 Security update

✅ Checklist

Before you submit your pull request, please make sure you have completed the following steps:

  • 📋 I have summarized my changes in the CHANGELOG and followed the guidelines for my type of change (skip for minor changes, documentation updates, and test enhancements).
  • 📚 I have made the necessary updates to the documentation (if applicable).
  • 🧪 I have written tests that support my changes and prove that my fix is effective or my feature works (if applicable).

For more information about code review checklists, see the Code Review Checklist.

PS: This is my first PR to the Anomalib repository, and I sincerely hope that my contributions align with the standards and expectations set by the repository and its authors. I appreciate the time and effort of the authors who will be reviewing my PR, and I thank you in advance for considering my changes.

@samet-akcay @djdameln @ashwinvaidya17

Thank You

Added `AnomalibMetric` as a subclass argument, enabling users to specify any metric derived from `AnomalibMetric` as a command-line argument.

Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Added `use_placeholder_fields` to `get_metric` method which allows to instantiate the AnomalibMetric.

Signed-off-by: Srikesh <[email protected]>
srikesh-07 and others added 11 commits March 11, 2025 12:04
Modified the metric instantiation with the `use_placeholder_fields` argument.

Signed-off-by: Srikesh <[email protected]>
* use image threshold when pixel threshold not available

* add unit tests for post-processor

* use right threshold in forward pass

* add configurable parameters for pp

* docstring

* simplify sensitivity logic

Signed-off-by: Srikesh <[email protected]>
* use image threshold when pixel threshold not available

* add unit tests for post-processor

* use right threshold in forward pass

* add configurable parameters for pp

* docstring

* simplify sensitivity logic

* OneClassPostProcessor -> PostProcessor

* update post-processing docs guide

* rename pre_processor file

* Update src/anomalib/pre_processing/__init__.py

Co-authored-by: Samet Akcay <[email protected]>

* Update src/anomalib/post_processing/post_processor.py

Co-authored-by: Samet Akcay <[email protected]>

* update license headers

---------

Co-authored-by: Samet Akcay <[email protected]>
Signed-off-by: Srikesh <[email protected]>
* Rename MVTec dataset to MVTecAD dataset

Signed-off-by: Samet Akcay <[email protected]>

* Update the test dataset paths

Signed-off-by: Samet Akcay <[email protected]>

* Rename mvtec path to mvtec_ad

Signed-off-by: Samet Akcay <[email protected]>

* Rename mvtec_ad to mvtecad

Signed-off-by: Samet Akcay <[email protected]>

* Path fixes

Signed-off-by: Samet Akcay <[email protected]>

* Path fixes

Signed-off-by: Samet Akcay <[email protected]>

* Fix engine unit tests

Signed-off-by: Samet Akcay <[email protected]>

* Fix mvtec path in inference tests

Signed-off-by: Samet Akcay <[email protected]>

* Rename MVTec dataset path in tests

Signed-off-by: Samet Akcay <[email protected]>

* Fix mvtec path in conftest

Signed-off-by: Samet Akcay <[email protected]>

* remove redundant relative import

Signed-off-by: Samet Akcay <[email protected]>

* Increase the timeout

Signed-off-by: Samet Akcay <[email protected]>

* Fix unit tests

Signed-off-by: Samet Akcay <[email protected]>

---------

Signed-off-by: Samet Akcay <[email protected]>
Signed-off-by: Srikesh <[email protected]>
Signed-off-by: Srikesh <[email protected]>

@srikesh-07
Author

srikesh-07 commented Mar 11, 2025

@samet-akcay @djdameln @ashwinvaidya17

I see that the Read the Docs build failed due to a hash mismatch during pip installation. Could you please look into the issue?

Whenever you get a chance, could you please review my PR? I appreciate your feedback.

Thank you!

Collaborator

@ashwinvaidya17 ashwinvaidya17 left a comment


Thanks for the efforts. This is a really thorough PR, and I appreciate the detailed inline documentation. I am fine with the OpenVINO changes, but for the metrics part let's wait for @djdameln's review, as he is the expert here.

@@ -364,6 +364,11 @@ def _accuracy_control_quantization_ov(
msg = "Metric must be provided for OpenVINO INT8_ACQ compression"
raise ValueError(msg)

# Setting up the fields parameter in Metric if Metric is initialized with placeholder.
Collaborator

Did you mean without a placeholder?

Author

@srikesh-07 srikesh-07 Mar 12, 2025

What I’m trying to convey is this:

Let’s say the user passes a metric via the CLI using --metrics f1_score but does not specify any field parameters (i.e., --metrics.fields is not provided). In this case, F1Score is instantiated with a placeholder ([""]) in the model.export method. Later, the metric’s fields are updated based on the model type in export_mixin.py.

I also noticed a couple of grammatical mistakes in an inline comment and a logger statement:

  • In L367:
    Change "if Metric is initialized with placeholder" to "if Metric was initialized with a placeholder".

  • In L370:
    Modify the logger statement from:

    "The fields of metric are initialized empty. Setting it to model fields {metric.fields}"
    to:
    "Since the fields of the metric were initialized as empty, they are now being set to the model fields: {metric.fields}."

I hope this clarifies your question. Let me know if you have any further questions; I'd be happy to help!

Thank you.

>>> from anomalib.metrics import get_available_metrics
>>> metrics = get_available_metrics()
>>> print(sorted(list(metrics))) # doctest: +NORMALIZE_WHITESPACE
['a_u_p_i_m_o', 'a_u_p_r', 'a_u_p_r_o', 'a_u_r_o_c',
Collaborator

Maybe something for a later PR but having a_u_p_i_m_o etc, with underscore might not look nice to the readers

Author

Got it, I had the same thought. Can we assume that for single-word metrics, we capitalize the entire word, and for those with underscores, we use PascalCase? For example:

  • "aupro" → "AUPRO"
  • "auroc" → "AUROC"
  • "dice" → "DICE" (DICE may be implemented in the future)
  • "f1_score" → "F1Score"
  • "min_max" → "MinMax"

Alternatively, we could map each metric in a hashmap within the __init__.py file under anomalib.metrics.
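The proposed convention could be sketched as a small helper (hypothetical, not repository code):

```python
def display_name(metric_key: str) -> str:
    """Render a registry key for readers.

    Single words are upper-cased ('auroc' -> 'AUROC'); underscore-separated
    keys become PascalCase ('f1_score' -> 'F1Score').
    """
    parts = metric_key.split("_")
    if len(parts) == 1:
        return metric_key.upper()
    return "".join(part.capitalize() for part in parts)
```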

Let me know your thoughts on these approaches.

Thank you.


Successfully merging this pull request may close these issues.

[Bug]: ValueError while exporting model in OpenVINO format with PTQ and ACQ
4 participants