Labels: enhancement (New feature or request), good first issue (Good for newcomers), hacktoberfest
✨ What You’ll Do
Pruna already provides several evaluation metrics, but it is missing a simple MSE-based metric.
Conceptually, MSE overlays two images and compares them pixel by pixel: the score is the average of the squared per-pixel differences, so identical images score 0 and larger deviations are penalized quadratically.
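To make the idea concrete, here is a minimal toy computation of MSE between two small grayscale "images" (the arrays are invented example data, not from Pruna):

```python
import numpy as np

# Two 2x2 grayscale "images"; they differ in a single pixel.
a = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([[0.0, 0.5], [1.0, 0.0]])

# MSE = mean of squared per-pixel differences.
mse = np.mean((a - b) ** 2)
print(mse)  # 0.0625  (only one pixel differs by 0.5, so 0.25 / 4)
```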
Your task will be to:
- Add a new metric class `MSE` under `pruna/metrics/`.
- Implement the standard metric interface (`update(...)`, `compute(...)`).
- Integrate this metric into Pruna’s registry so it can be selected in evaluation runs.
- Add tests that validate correctness on toy examples and edge cases.
- Document the new metric and provide a simple usage example.
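The steps above could look roughly like the sketch below. Note this is only an illustration of the `update(...)`/`compute(...)` shape: the class name, method signatures, and the absence of a Pruna base class are all assumptions here — the real interface and registry hookup are described in the Pruna docs linked under Useful Resources.

```python
import numpy as np

class MSEMetric:
    """Illustrative stateful MSE metric (hypothetical, not Pruna's actual base API).

    update() accumulates squared error across batches;
    compute() returns the mean over everything seen so far.
    """

    def __init__(self) -> None:
        self.sum_squared_error = 0.0
        self.num_elements = 0

    def update(self, prediction, target) -> None:
        # Accumulate rather than average per batch, so compute()
        # is correct even when batches have different sizes.
        diff = np.asarray(prediction, dtype=np.float64) - np.asarray(target, dtype=np.float64)
        self.sum_squared_error += float(np.sum(diff ** 2))
        self.num_elements += diff.size

    def compute(self) -> float:
        if self.num_elements == 0:
            raise ValueError("update() must be called before compute()")
        return self.sum_squared_error / self.num_elements
```

Usage would then be a plain `metric = MSEMetric(); metric.update(pred, target); metric.compute()` loop; accumulating sums (instead of averaging per call) is what makes multi-batch evaluation give the same result as a single pass over all pixels.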
🤖 Useful Resources
- The pruna documentation: https://docs.pruna.ai/en/stable/docs_pruna/user_manual/customize_metric.html
- Existing metrics in `pruna/metrics` as references for structure.
- Pruna’s metric registry and how new metrics are registered.
✅ Acceptance Criteria
- The code follows Pruna’s style guidelines.
- The new metric is documented in code and user docs.
- Tests demonstrate that the metric computes expected scores on simple controlled examples.
- The metric integrates properly into Pruna’s evaluation framework.
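For the "simple controlled examples" criterion, tests might assert basic properties like the ones below (the `mse` helper here is a hypothetical reference implementation used only to express the expected values, not Pruna code):

```python
import numpy as np

def mse(pred, target):
    # Hypothetical reference implementation for the assertions below.
    diff = np.asarray(pred, dtype=np.float64) - np.asarray(target, dtype=np.float64)
    return float(np.mean(diff ** 2))

def test_identical_images_score_zero():
    img = np.random.rand(8, 8)
    assert mse(img, img) == 0.0

def test_known_constant_difference():
    # Every pixel differs by 0.5, so the score is exactly 0.25.
    a = np.zeros((4, 4))
    b = np.full((4, 4), 0.5)
    assert mse(a, b) == 0.25

def test_symmetry():
    # MSE does not care which image is "prediction" and which is "target".
    a, b = np.random.rand(3, 3), np.random.rand(3, 3)
    assert np.isclose(mse(a, b), mse(b, a))
```

Edge cases worth covering in the same spirit: mismatched shapes (should raise), and calling `compute()` before any `update()`.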
And don’t forget to give us a ⭐️!
❓ Questions?
Feel free to jump into the #contributing Discord channel if you hit any roadblocks. Can’t wait to see your contribution! 🚀