
Remove config from torch inferencer #1001

Merged

Conversation

@samet-akcay (Contributor)

Description

  • Remove config from TorchInferencer
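
For context, a minimal usage sketch of the inferencer after this change, assuming the constructor now takes only a checkpoint path (per the review thread below) and that metadata is read from the checkpoint when no separate metadata file is given; the file paths are hypothetical:

```python
from anomalib.deploy import TorchInferencer

# No config.yaml is passed anymore; the model and its metadata are loaded
# from the checkpoint file itself.
inferencer = TorchInferencer(path="results/weights/model.pt")  # hypothetical path

# Run inference on a single image (hypothetical path).
predictions = inferencer.predict(image="datasets/sample/000.png")
```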

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • Refactor (non-breaking change which refactors the code base)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes
  • I have added a summary of my changes to the CHANGELOG (not required for minor changes, docs, and tests).

@review-notebook-app (bot) commented

Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter Notebooks.

@djdameln (Contributor) left a comment

Thanks! My main question is whether we want to support exporting to both the Torch and OpenVINO formats within the same run. In that case, we would need to change how we configure the export.
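
Purely as a hypothetical sketch of that suggestion (none of these names come from the PR), the export step could accept a list of target formats instead of a single value:

```python
from __future__ import annotations

from enum import Enum
from pathlib import Path


class ExportFormat(str, Enum):
    """Hypothetical stand-in for the supported export backends."""

    TORCH = "torch"
    OPENVINO = "openvino"


def export_model(model: object, formats: list[ExportFormat], export_root: Path) -> None:
    """Hypothetical helper: run one export pass per requested format."""
    for fmt in formats:
        # A real implementation would dispatch to the torch / openvino
        # exporters here; this stub only illustrates the configuration shape.
        print(f"Exporting {type(model).__name__} to {fmt.value} under {export_root}")


# Exporting to both formats within the same run, as the comment proposes:
# export_model(model, [ExportFormat.TORCH, ExportFormat.OPENVINO], Path("results"))
```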

src/anomalib/models/cfa/config.yaml (review thread resolved)
Docstring lines under discussion:

    model_source (str | Path | AnomalyModule): Path to the model ckpt file or the Anomaly model.
    metadata_path (str | Path, optional): Path to metadata file. If None, it tries to load the params
        from the model state_dict. Defaults to None.
    path (str | Path | AnomalyModule): Path to the model ckpt file or the Anomaly model.
@djdameln (Contributor) commented:

This needs to be updated (I think we only support str and Path now, right?)
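
A sketch of what the suggested fix could look like, assuming only str and Path remain supported after the refactor (the surrounding class body is illustrative, not the merged code):

```python
from __future__ import annotations

from pathlib import Path


class TorchInferencer:
    """Torch inferencer, sketched with the suggested docstring fix applied.

    Args:
        path (str | Path): Path to the Torch model checkpoint (.pt) file.
    """

    def __init__(self, path: str | Path) -> None:
        # Normalize to Path; AnomalyModule is intentionally no longer accepted.
        self.path = Path(path)
```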

src/anomalib/deploy/inferencers/torch_inferencer.py (outdated; review thread resolved)
@ashwinvaidya17 (Collaborator) left a comment

I don't have any additional comments.

@ashwinvaidya17 (Collaborator) left a comment

Thanks for sorting this out.

@djdameln (Contributor) left a comment

Thanks!

@samet-akcay merged commit 8a88b13 into openvinotoolkit:main on April 14, 2023.
@samet-akcay deleted the remove-config-from-torch-inferencer branch on April 14, 2023 at 11:37.