Smart update of Collection of CompositionalMetrics #143

Closed
victorjoos opened this issue Mar 29, 2021 · 3 comments · Fixed by #709

@victorjoos (Contributor) commented Mar 29, 2021

🚀 Feature

When updating metrics that are composed of other metrics, there is a risk of updating the underlying base metrics too many times.

I don't think there is a clean way of updating only the necessary metrics in the general case (when you are updating all the metrics yourself), but when you combine your metrics in a collection, it could be useful to update only the "base" metrics instead of all of them.

Motivation

I often want to use a base metric multiple times, and then I have to be careful not to update any of them more than once. A somewhat convoluted example (convoluted because the F1 score is already implemented):

from torchmetrics import Precision, Recall

prec = Precision()
recall = Recall()
f1 = 2 * (prec * recall) / (prec + recall)  # builds a CompositionalMetric on top of prec and recall

prec.update(pred, gt)
recall.update(pred, gt)
f1.update(pred, gt)  # Shouldn't do this: it updates prec and recall a second time.

Pitch

Continuing the last example:

collection = MetricCollection([prec, recall, f1])
collection.update(pred, gt)

This should update prec and recall only once each.
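
For illustration, a minimal sketch of how such a deduplicating collection could work. Everything here is an assumption, not the actual torchmetrics API: the class SmartCollection is hypothetical, and composed metrics are assumed to expose their operands as metric_a / metric_b attributes.

# Hypothetical sketch of a deduplicating collection; not the torchmetrics API.
class SmartCollection:
    def __init__(self, metrics):
        self.metrics = list(metrics)

    def _leaves(self, metric):
        # Assume a composed metric exposes its operands as `metric_a` /
        # `metric_b`; anything without updatable children is a leaf.
        children = [getattr(metric, name, None) for name in ("metric_a", "metric_b")]
        children = [c for c in children if c is not None and hasattr(c, "update")]
        if not children:
            yield metric
        else:
            for child in children:
                yield from self._leaves(child)

    def update(self, *args, **kwargs):
        seen = set()
        for metric in self.metrics:
            for leaf in self._leaves(metric):
                if id(leaf) not in seen:  # deduplicate by object identity
                    seen.add(id(leaf))
                    leaf.update(*args, **kwargs)

    def compute(self):
        # Composed metrics derive their value from the already-updated leaves.
        return [metric.compute() for metric in self.metrics]

With collection = SmartCollection([prec, recall, f1]), a single collection.update(pred, gt) touches prec and recall exactly once each: the leaves of f1 are the very same prec and recall objects, so they are skipped on the second encounter.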

Alternatives

The alternative is to always define metrics from scratch, but this duplicates computation during the update phase.
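
To make the duplication concrete, a sketch using the standalone classes (F1 was the torchmetrics class name at the time, later renamed F1Score; loader stands for any iterable of (pred, gt) batches). Each metric keeps its own counters, so the same statistics are accumulated three times per batch:

from torchmetrics import F1, Precision, Recall

prec, recall, f1 = Precision(), Recall(), F1()
for pred, gt in loader:
    prec.update(pred, gt)    # accumulates tp/fp/fn
    recall.update(pred, gt)  # accumulates the same statistics again
    f1.update(pred, gt)      # ...and a third time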

@Borda (Member) commented Mar 29, 2021

@justusschock what do you think about it? ;]

@SkafteNicki (Member) commented

I had a similar proposal here: #76

@justusschock (Member) commented

@Borda Yes, this is necessary at some point. The only question here is: should we define "compatible" metrics that share their internal state, or should this be defined by the user?

If we define it, we have to maintain that compatibility list, but we know the implementation details of our metrics. If the user defines it, we don't need to maintain the list, but we expect the user to know about our internals.
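
To make the two options concrete, a hypothetical API sketch (neither the built-in table nor the shared_state argument exists in torchmetrics; both are illustrations only):

# Option 1 ("we define it"): the library ships and maintains a table of
# metric pairs known to share internal state.
_COMPATIBLE = {("Precision", "Recall")}  # hypothetical built-in list

# Option 2 ("the user defines it"): the user declares the sharing explicitly,
# which requires knowing the metrics' internals.
collection = MetricCollection(
    [prec, recall, f1],
    shared_state=[("Precision", "Recall")],  # hypothetical argument
)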
