Align behavior of random parameters with numpy.random.#3738
Conversation
tomtetzlaff
left a comment
Thanks a lot, @heplesser, for this fast fix. All three functions nest.random.normal(), nest.random.lognormal() and nest.random.exponential() behave as expected now.
terhorstd
left a comment
Looks like a good change. For consistency, we should briefly check that the testing strategy for the changed behavior also covers the new cases.
| ("lognormal", {"mu": 0.0, "std": -1.0}), | ||
| ("lognormal", {"mu": 0.0, "std": 1.0}), | ||
| ("exponential", {"beta": -1.0}), | ||
| ] |
Are acceptable parameter values also tested? I would have expected a corresponding good_random_parameters = […, ("normal", {"mean": 1.0, "std": 0.0}, 1.0)] as well.
After considering decoupling this into a separate issue, it seems too directly related, so I add it to this comment:
In this change, which aligns the random parameters with numpy behavior, formerly-throwing tests are removed. However, no corresponding tests of the now-possible good behavior are introduced.
Describe the solution you'd like
Currently the new expected behavior is not tested. Arguments that other statistical tests cover this implicitly are invalid, because
- the other tests did not run into std==0 cases before (otherwise they would have thrown, and would now no longer throw),
- any catch behavior outside the changed functionality that defines values may already include branches for the std==0 case, causing the new functionality not to be used.

Additionally, a std=0 parameter would explicitly test that the exact mean value is returned:
```python
good_random_parameters = [
    …,
    ("normal", {"mean": 0.0, "std": 0.0}, 0.0),
    ("normal", {"mean": 1.0, "std": 0.0}, 1.0),
]
```
Testing this for different mean values makes sure
- the new functionality actually works, and that the code does not bypass the calculations by returning default values,
- the returned values are exact and internal calculations don't cause float bit-differences, i.e. mean = 0.1 → result == 0.1 without rounding errors.
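The numpy behavior that NEST is being aligned with can be checked directly; a minimal sketch using plain numpy (not the NEST parameters themselves):

```python
import numpy as np

rng = np.random.default_rng(12345)

# numpy accepts std (scale) == 0 and returns the mean exactly,
# with no floating-point rounding error
samples = rng.normal(loc=0.1, scale=0.0, size=5)
assert np.all(samples == 0.1)

samples = rng.normal(loc=1.0, scale=0.0, size=5)
assert np.all(samples == 1.0)
```

Since the result is computed as loc + scale * z with scale == 0, equality with the mean is bit-exact, which is what the proposed good_random_parameters cases would verify on the NEST side.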
I have now added tests to check correct behavior for std==0.
| ("lognormal", {"mu": 0.0, "std": -1.0}), | ||
| ("lognormal", {"mu": 0.0, "std": 1.0}), | ||
| ("exponential", {"beta": -1.0}), | ||
| ] |
There was a problem hiding this comment.
After considering to decouple this into a separate issue, it seems too directly related, so I add to this comment:
In this change of aligning random parameters to numpy behavior formerly throwing tests are removed. However, no corresponding tests of the – now possible – good behavior are introduced.
Describe the solution you'd like
Currently the new expected behavior is not tested. Arguments that other statistical tests cover this implicitly are invalid, as
- the other tests did not run into
std==0cases before (otherwise they would havethrown, and would now not throw anymore). - any
catchbehavior outside the changed functionality that defines values may already include branches forstd==0case, causing the new functionality not to be used.
Additionally the std=0 parameter would explicit test that the exact mean value is returned
good_random_parameters = [
…,
("normal", {"mean": 0.0, "std": 0.0}, 0.0),
("normal", {"mean": 1.0, "std": 0.0}, 1.0),
]
Testing this for different mean values makes sure
- the new functionality does actually work, and ensures that the code does not bypass calculations returning default values.
- returned values are exact and internal calculations don't cause float bit-differences, i.e.
mean = 0.1→result == 0.1without rounding errors.
Here we have the curious situation that the test for std=0 crashes the test suite completely under Linux while it passes under macOS. I will investigate this further. It does not seem to be caused by differences between the libc++ (Clang) and libstdc++ (GCC) implementations of the normal distribution.
The problem persists with Ubuntu 24.04 on the runner, gcc 13.3, Python 3.12.
OK, mystery understood: it is not a bug, it is a feature. The C++ standard requires in §26.5.8.4.4 that the stddev parameter of std::normal_distribution be strictly positive. So from a strict interpretation of the C++ standard, we must have sigma > 0. I assume that the logic of the C++ standard designers was that for sigma == 0 one does not need a random distribution in the first place. This means that we cannot permit sigma == 0 for normal and lognormal parameters in a straightforward way. So the question is, @tomtetzlaff, how important is it for us to allow zero here just to allow what NumPy allows?
I have now implemented a slightly brute-force way of handling sigma=0 — after all, we don't need a random distribution if we know the result. |
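A minimal sketch of such special-case handling, in Python for illustration only (this is not the actual NEST C++ implementation, and `draw_normal` is a hypothetical helper name):

```python
import random


def draw_normal(rng: random.Random, mean: float, sigma: float) -> float:
    """Draw from N(mean, sigma), allowing the degenerate case sigma == 0."""
    if sigma < 0.0:
        raise ValueError("sigma must be non-negative")
    if sigma == 0.0:
        # Degenerate distribution: all probability mass sits at the mean,
        # so no random number generation is needed at all. This sidesteps
        # the sigma > 0 precondition of the underlying distribution.
        return mean
    return rng.gauss(mean, sigma)


rng = random.Random(42)
print(draw_normal(rng, 0.1, 0.0))  # exactly 0.1, no rounding error
```

The same dispatch can be applied to the lognormal case, where sigma == 0 makes the result exp(mu) deterministically.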
This PR aligns the non-spatial normal, lognormal and exponential parameters in NEST with numpy.random by allowing std == 0 or beta == 0, respectively. For the exponential parameter, a test against negative beta is introduced.

This PR only modifies the plain random parameters. For spatial random parameters, we still require that parameters are strictly positive.
This fixes #3736.
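For reference, the numpy behavior for the exponential case can be checked with plain numpy (independent of NEST): beta (numpy's `scale`) == 0 yields exactly zero, while a negative beta is rejected.

```python
import numpy as np

rng = np.random.default_rng(0)

# scale (beta) == 0 is allowed and yields exactly 0.0
assert rng.exponential(scale=0.0) == 0.0

# a negative scale raises, matching the new NEST test against negative beta
try:
    rng.exponential(scale=-1.0)
except ValueError:
    print("negative beta rejected")
```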