Support different backends for torchmetrics #4133

Open
eugene123tw opened this issue Nov 27, 2024 · 0 comments
Is your feature request related to a problem? Please describe.

The `torchmetrics.detection.MeanAveragePrecision` module now supports the `faster-coco-eval` backend, which is significantly faster than the default `pycocotools` backend.

Performance summary for 5000 images:

| Metric | faster-coco-eval (s) | pycocotools (s) | Speedup (×) |
|--------|----------------------|-----------------|-------------|
| bbox   | 5.812                | 22.720          | 3.91×       |
| segm   | 7.413                | 24.434          | 3.30×       |

Describe the solution you'd like to propose

Add support for a configurable backend to the relevant recipes, allowing users to choose between `faster-coco-eval` and `pycocotools`.
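A minimal sketch of how a recipe could expose this choice. The helper name `resolve_backend` and the recipe wiring are assumptions for illustration; the backend strings match the values accepted by `torchmetrics.detection.MeanAveragePrecision`:

```python
# Hypothetical recipe helper: validates the configured backend string
# before it is forwarded to torchmetrics.
SUPPORTED_BACKENDS = ("pycocotools", "faster_coco_eval")


def resolve_backend(name: str = "faster_coco_eval") -> str:
    """Return a validated backend name for MeanAveragePrecision."""
    if name not in SUPPORTED_BACKENDS:
        raise ValueError(
            f"Unknown backend {name!r}; expected one of {SUPPORTED_BACKENDS}"
        )
    return name


# Example wiring (requires torchmetrics and the chosen backend installed):
# from torchmetrics.detection import MeanAveragePrecision
# metric = MeanAveragePrecision(
#     iou_type="bbox",
#     backend=resolve_backend(cfg.metric_backend),  # cfg is hypothetical
# )
```

Defaulting to `faster-coco-eval` while still accepting `pycocotools` would keep existing configs working for users who depend on the reference implementation.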


Describe alternatives you've considered

- Continuing with the default `pycocotools` backend despite its slower performance.
- Using a custom integration of `faster-coco-eval` outside of the recipes.

Additional Context
