
Conversation

@stas00 (Contributor) commented Dec 29, 2020

This PR is a follow-up to #9338.

According to #9338 (comment), we can simply remove the torch.cuda.is_available() check that guards the import of apex.normalization.FusedLayerNorm, and the multiprocessing problem goes away.
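A minimal sketch of the kind of change being described; the exact file and surrounding code in transformers are assumptions, not the actual diff:

```python
import torch

# Before (assumed shape of the old code): the apex import was guarded by a
# CUDA check. Calling torch.cuda.is_available() at import time can touch the
# CUDA driver in the parent process and break forked multiprocessing workers.
#
# if torch.cuda.is_available():
#     try:
#         from apex.normalization import FusedLayerNorm as LayerNorm
#     except ImportError:
#         LayerNorm = torch.nn.LayerNorm
# else:
#     LayerNorm = torch.nn.LayerNorm

# After: attempt the import unconditionally and fall back to the stock
# LayerNorm if apex is not installed; no CUDA query happens at import time.
try:
    from apex.normalization import FusedLayerNorm as LayerNorm
except ImportError:
    LayerNorm = torch.nn.LayerNorm
```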

Fixes #9338

@patrickvonplaten

@patrickvonplaten (Contributor) commented:

Thanks a lot for digging into this @stas00

@patrickvonplaten patrickvonplaten merged commit ae333d0 into huggingface:master Dec 30, 2020
@stas00 stas00 deleted the fusedlayernorm branch December 30, 2020 18:08


Successfully merging this pull request may close these issues:

Multiprocessing CUDA issues when importing transformers (#9338)
