@pmeier pmeier commented Jan 12, 2023

Fixes #7079.


@NicolasHug NicolasHug left a comment

Thanks Philip.
Before merging, could we add a quick test for the endianness conversion (which may require extracting it into a separate function)?

Something along these lines:

t = torch.tensor([0x00112233, 0x44556677], dtype=torch.int32)
flipped = torch.tensor([0x33221100, 0x77665544], dtype=torch.int32)
assert torch.equal(flip_endianness(t), flipped)

Ideally we'd test the QMnist dataset directly, but this may make the tests just as complicated as the code (which kind of defeats the purpose of the test).
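For reference, a byte-order flip along these lines can be sketched in a few lines of PyTorch (a hypothetical `flip_endianness` helper for illustration, not necessarily the implementation merged in this PR): view the tensor as raw bytes, reverse the bytes within each element, and reinterpret the result with the original dtype.

```python
import torch

def flip_endianness(t: torch.Tensor) -> torch.Tensor:
    """Reverse the byte order of every element of an integer tensor.

    Hypothetical sketch of the helper discussed above; assumes the
    tensor's dtype is byte-viewable (e.g. torch.int32).
    """
    # Reinterpret the tensor as uint8, group the bytes element-wise,
    # reverse each group, and view the flat result with the original dtype.
    raw = t.contiguous().view(torch.uint8).view(*t.shape, t.element_size())
    return raw.flip(-1).view(-1).view(t.dtype).view(t.shape)
```

Because byte reversal is its own inverse, applying the helper twice returns the original tensor, which makes it easy to round-trip-test.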

@pmeier pmeier requested a review from NicolasHug January 12, 2023 12:31

@NicolasHug NicolasHug left a comment

Thanks @pmeier, some comments but LGTM.

@pmeier pmeier merged commit 934ce3b into pytorch:main Jan 12, 2023
@pmeier pmeier deleted the fix-qmnist branch January 12, 2023 16:25
facebook-github-bot pushed a commit that referenced this pull request Jan 13, 2023
Summary:
* fix MNIST byte flipping

* add test

* move to utils

* remove lazy import

Reviewed By: YosuaMichael

Differential Revision: D42500904

fbshipit-source-id: 067064facc22efc68368d06dccd4fa2e2fa6dfc1

Successfully merging this pull request may close these issues.

The labels in the QMNIST dataset are all equal to 0.
