fix: Manually set uncertainties for fixed parameters to 0 #1919
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master    #1919      +/-   ##
==========================================
- Coverage   98.17%   98.17%   -0.01%
==========================================
  Files          68       68
  Lines        4331     4330       -1
  Branches      730      729       -1
==========================================
- Hits         4252     4251       -1
  Misses         46       46
  Partials       33       33
@kratsg @lukasheinrich I'm going to leave this in draft until I have enough time to look at this and scikit-hep/iminuit#762, determine whether the change in uncertainties is desired, and create some minimal examples that demonstrate the differences.
force-pushed from a9ca9ec to 795b697
alexander-held left a comment
Based on scikit-hep/iminuit#762 (comment) and the step size assignment in pyhf/src/pyhf/optimize/opt_minuit.py (line 53 in 2cd8c0b):

step_sizes = [(b[1] - b[0]) / float(self.steps) for b in init_bounds]
I am not yet sure how to handle this in cabinetry, but I am leaning towards going through the fit results and manually setting the errors for fixed parameters to 0.0 again. I will put my motivation for that into scikit-hep/iminuit#762. It would be best to harmonize the behavior of pyhf and cabinetry, though, so I'm also interested to hear your thoughts so that we can ideally agree on the approach.
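A minimal sketch of that post-processing idea, assuming a per-parameter uncertainty array from the minimizer and a boolean mask of fixed parameters (the function name and inputs are illustrative, not cabinetry's or pyhf's actual API):

```python
import numpy as np

def zero_fixed_uncertainties(uncertainties, fixed_params):
    """Return the uncertainties with the entries for fixed parameters set to 0.0.

    uncertainties: per-parameter uncertainties as reported by the minimizer
    fixed_params:  booleans, True where a parameter was held constant in the fit
    """
    uncertainties = np.asarray(uncertainties, dtype=float)
    fixed_params = np.asarray(fixed_params, dtype=bool)
    # fixed parameters carry no fit uncertainty, so overwrite whatever value
    # the step size heuristic in iminuit >= 2.12.2 left there
    return np.where(fixed_params, 0.0, uncertainties)

print(zero_fixed_uncertainties([0.1, 0.5, 0.2], [False, True, False]))
# [0.1 0.  0.2]
```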
force-pushed from 1f7eb77 to f1e3bd3
Note to self in case needed in the future: the previous commit message summaries were as follows (a sketch of the filterwarnings approach appears after them).

…rning
* Add an ignore to filterwarnings to avoid the iminuit UserWarning:
  > UserWarning: Assigned errors must be positive. Non-positive values are replaced by a heuristic.
* This was introduced in iminuit v2.12.2 as the result of a step size guess heuristic being applied to fixed parameters. This results in the error of fixed parameters changing during MIGRAD.
  - c.f. iminuit Issue 762

…tprocess
* Use tensorlib to work with all backends.
* Use np.where to reduce expensive casting, as minuit is numpy-like.
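For comparison, a minimal sketch of the superseded filterwarnings approach from the first commit message above; the exact filter arguments here are illustrative rather than the removed pyhf configuration:

```python
import warnings

# Silence (rather than correct) the warning that iminuit >= 2.12.2 emits when
# a parameter is assigned a non-positive error, e.g. the step size of 0 that
# used to be set for fixed parameters.
warnings.filterwarnings(
    "ignore",
    message="Assigned errors must be positive",
    category=UserWarning,
)
```

The change in this PR drops the suppression and instead corrects the reported uncertainties after the fit.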
force-pushed from 5344f79 to e3a01fd
alexander-held left a comment
I think the current implementation allows for leaving the iminuit version requirement unchanged. The change in 2.12.2 affects parameters with an assigned step size of zero. Since pyhf no longer sets a step size of zero for any parameters (and instead corrects the post-fit parameter uncertainties for fixed parameters), iminuit versions before and after 2.12.2 should behave the same in combination with these changes to pyhf.
The one thing that might break, and for which I do not see a good solution, is user code that directly uses the Minuit instance, reads out .errors on it, and gets different behavior starting with 2.12.2. I don't think pyhf can (or should) do anything about that, though.
Yeah, the only reason I included the version bump was to ensure that there could only be one behavior if the user tried to access .errors on the underlying Minuit instance directly.
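To illustrate the .errors access being discussed, here is a small standalone iminuit example (not pyhf code); with iminuit >= 2.12.2 an assigned error of zero is replaced by the heuristic, while the correction in this PR reports 0.0 for fixed parameters after the fit:

```python
import numpy as np
from iminuit import Minuit

def cost(x, y):
    # toy least-squares cost; its exact form is not important here
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

cost.errordef = Minuit.LEAST_SQUARES

m = Minuit(cost, x=0.0, y=2.0)
m.fixed["y"] = True   # hold y constant during MIGRAD
m.errors["y"] = 0.0   # with iminuit >= 2.12.2 this triggers the UserWarning
                      # quoted above and the value is replaced by a heuristic
m.migrad()

print(m.errors["y"])  # no longer pinned to 0 with iminuit >= 2.12.2
# the post-fit correction discussed in this PR reports 0.0 instead:
corrected = np.where(list(m.fixed), 0.0, list(m.errors))
print(corrected)
```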
This reverts commit 85b8ac8.
As pyhf no longer controls the step size, don't attempt to test for it.
The behavior should still be the same with old and new versions, i.e. non-zero …
Exactly. Though this [not being able to force users to install newer versions of software] is what you, Henry, Hans, and Thomas have all reminded me about many times: I have to let users make their own decisions.
Description

Resolves #1918

* Manually set the uncertainties for fixed parameters to 0 in the fit results returned from iminuit (see the usage sketch below). For iminuit v2.12.2+, "assigning zero to errors is invalid, but in the past this was not caught." This approach harmonizes with cabinetry for fixed parameters for iminuit v2.12.2+.
* Remove the tests/test_optim.py test test_step_sizes_fixed_parameters_minuit, as the uncertainty/step size values are no longer set during minimization and so are no longer in pyhf's control.
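A minimal usage sketch of the corrected behavior from the pyhf side, assuming the numpy backend with the minuit optimizer (the model and data here are purely illustrative):

```python
import pyhf

pyhf.set_backend("numpy", "minuit")

model = pyhf.simplemodels.uncorrelated_background(
    signal=[5.0], bkg=[10.0], bkg_uncertainty=[3.0]
)
data = [12.0] + model.config.auxdata

# hold the signal strength (the POI) fixed during the fit
fixed = model.config.suggested_fixed()
fixed[model.config.poi_index] = True

result = pyhf.infer.mle.fit(
    data, model, fixed_params=fixed, return_uncertainties=True
)
# result[:, 1] holds the per-parameter uncertainties; with this change the
# entry for the fixed POI is reported as 0.0
print(result[:, 1])
```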