🚀 Pimo #1557
Conversation
Thanks! That's a lot of effort. I have a few comments.
# Copyright (C) 2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

from __future__ import annotations
Can we avoid this? Last I checked, `__future__.annotations` does not behave well with jsonargparse.
I thought it was necessary for annotations like `range: tuple[float, float]`, is it not?
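As an aside, one possible middle ground (sketch, not from the PR): individual annotations can be written as string literals instead of enabling the module-wide future import, which keeps runtime annotation introspection (e.g. by jsonargparse) predictable. The class and field names below are hypothetical.

```python
# Sketch: a string-literal annotation ("forward reference") instead of a
# module-wide `from __future__ import annotations`. On Python 3.9+ the
# builtin `tuple[float, float]` resolves fine when the string is evaluated.
from dataclasses import dataclass
import typing


@dataclass
class PIMOArgs:  # hypothetical name, for illustration only
    fpr_bounds: "tuple[float, float]" = (1e-5, 1e-4)


# Tools that introspect annotations resolve the string lazily:
hints = typing.get_type_hints(PIMOArgs)
print(hints["fpr_bounds"])  # tuple[float, float]
```

This scopes the deferred evaluation to the one annotation that needs it, rather than changing the semantics of every annotation in the module.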
"""Binary classification matrix curve (NUMBA implementation of low level functions).
What's the convention behind naming files starting with `_`?
It's similar to the `_` attributes inside a module: a function `_func` in `module.py` is supposed to be used within the module scope, right? Similarly, `_validate` is private to `per_image/`, so it can be used across other modules inside the subpackage `per_image`, but not outside of it.
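A minimal sketch of that convention at module level (module and function names here are hypothetical, not the PR's actual files): underscore-prefixed names are treated as internal both by convention and by `from mod import *`.

```python
# Build a throwaway module with one "private" and one "public" function,
# then show that only non-underscore names form the public surface.
import types

mod = types.ModuleType("per_image_demo")  # hypothetical module name
exec(
    "def _validate(x):\n"
    "    return x\n"
    "def pimo(x):\n"
    "    return _validate(x)\n",
    mod.__dict__,
)

# Underscore names are filtered out of the conventional public API
# (this is also what `from per_image_demo import *` would respect):
public = [name for name in dir(mod) if not name.startswith("_")]
print(public)  # ['pimo']
```

The same reasoning extends one level up: an underscore-prefixed *module* is internal to its package, and the package's `__init__.py` re-exports only the public names.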
@dataclass
class BinclfAlgorithm:
What's the advantage of using this over enums?
I just wanted to avoid enums for the sake of simplicity, yet I wanted to keep these constants together for consistency. Should I switch to an enum, or is that OK?
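For reference, a hedged sketch of the enum alternative being discussed (member names are assumptions, not taken from the PR): a `str`-backed `Enum` keeps the constants grouped while also giving iteration and free validation of user input.

```python
from enum import Enum


class BinclfAlgorithm(str, Enum):
    """Sketch: grouped algorithm-name constants with built-in validation."""

    NUMBA = "numba"
    NUMPY = "numpy"


# Lookup by value doubles as input validation (unknown values raise ValueError):
algorithm = BinclfAlgorithm("numba")
print(algorithm is BinclfAlgorithm.NUMBA)  # True

# Members are iterable, handy for CLI choices or error messages:
print([member.value for member in BinclfAlgorithm])  # ['numba', 'numpy']
```

Because it subclasses `str`, members compare equal to their plain-string values, so existing call sites passing `"numba"` keep working.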
(both in module `torchmetrics.functional.classification.precision_recall_curve` in `torchmetrics==1.1.0`).

ATTENTION: VALIDATION IS NOT DONE HERE. Make sure to validate the arguments before calling this function.
Same here. We need to consider how these docstrings will be rendered in Sphinx.
How can I check that?
if not force:
    raise RuntimeError(msg)
msg += " Computation was forced!"
warnings.warn(msg, RuntimeWarning, stacklevel=1)
What's the advantage of using both `logger.warning` and `warnings.warn`? What happens if we want to pipe the warnings to a file and keep the console empty? A file handler can be passed to the logging module in such a scenario.
Tbh I didn't know what the right policy was here 😳. I put both because I was sure you'd find it and tell me what to do :P hehe. So should I use the logger?

On a side note: `warnings.warn` allows one to turn this into an exception with `warnings.filterwarnings("error")`; maybe that could be interesting?

For context: this warning is raised when there are too few points to integrate the AUC curve, which may result in imprecise AUC values.
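Both mechanisms mentioned in this thread can coexist via the standard library. A hedged sketch (handler setup is illustrative): `logging.captureWarnings` routes `warnings.warn` through the logging module, so a `FileHandler` could capture warnings while keeping the console clean, and `filterwarnings("error")` escalates a warning into an exception.

```python
import logging
import warnings

# Re-route warnings.warn(...) to the "py.warnings" logger:
logging.captureWarnings(True)
warnings_logger = logging.getLogger("py.warnings")
# Swap StreamHandler for logging.FileHandler("warnings.log") to pipe to a file:
warnings_logger.addHandler(logging.StreamHandler())

warnings.warn("too few points to integrate the AUC curve", RuntimeWarning, stacklevel=2)

# Escalating the warning into an exception, as mentioned in the comment:
raised = False
with warnings.catch_warnings():
    warnings.filterwarnings("error", category=RuntimeWarning)
    try:
        warnings.warn("too few points to integrate the AUC curve", RuntimeWarning)
    except RuntimeWarning:
        raised = True

logging.captureWarnings(False)  # restore default behavior
print(raised)  # True
```

With this setup there is no need to call both `logger.warning` and `warnings.warn` at each site: a single `warnings.warn` reaches both audiences.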
normalization_factor = aupimo_normalizing_factor(fpr_bounds)
aucs = (aucs / normalization_factor).clip(0, 1)

return threshs, shared_fpr, per_image_tprs, image_classes, aucs, num_points_integral
Do you think we should capture these in a dataclass?
There is a dataclass for the torch interface, but not for the numpy interface. I couldn't figure out a nice, maintainable way to make the two versions share one. Perhaps we can discuss this in a call? I'll explain it better and maybe you'll know a good solution.
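One possible direction (a sketch under assumptions, not a definitive design): a `Generic` dataclass whose type parameter binds to either `np.ndarray` or `torch.Tensor`. The field names follow the return statement quoted above; the class name is an assumption.

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")  # would bind to np.ndarray or torch.Tensor


@dataclass
class PIMOResult(Generic[T]):  # hypothetical name, for illustration only
    threshs: T
    shared_fpr: T
    per_image_tprs: T
    image_classes: T
    aucs: T
    num_points_integral: int


# Usage, with plain lists standing in for arrays/tensors:
result = PIMOResult([0.1], [0.2], [[0.3]], [1], [0.9], 300)
print(result.aucs)  # [0.9]
```

The same container then serves both interfaces, and static type checkers can still distinguish `PIMOResult[np.ndarray]` from `PIMOResult[torch.Tensor]`.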
Replaced by #1726.
Description
Replaces the PRs from https://gist.github.com/jpcbertoldo/12553b7eaa97cfbf3e55bfd7d1cafe88 to catch up with v1.
Implements refactors from https://github.com/jpcbertoldo/anomalib/blob/metrics/refactors/src/anomalib/utils/metrics/perimg/.refactors.