Merged
Changes from 3 commits
5 changes: 3 additions & 2 deletions environment.yml
@@ -39,8 +39,9 @@ dependencies:
- sacremoses

# Warning: jiant currently depends on *both* pytorch_pretrained_bert > 0.6 _and_
Contributor

Check if this is still true—is the old package still a dependency via Allen? If not, delete.

Collaborator Author

allennlp v0.8.4 still depends on pytorch-pretrained-bert, and we've not changed the allennlp requirement.

Contributor

Dang. I guess it's not worth the effort to update the Allen dependency, assuming that the 2.0 migration is coming up fairly soon.

-# transformers > 2.3.0. These are the same package, though the name changed between
+# transformers > 2.6.0. These are the same package, though the name changed between
 # these two versions. AllenNLP requires 0.6 to support the BertAdam optimizer, and jiant
 # directly requires 1.0 to support XLNet and WWM-BERT.
 # This AllenNLP issue is relevant: https://github.com/allenai/allennlp/issues/3067
-- transformers==2.3.0
+- transformers==2.6.0
+- tokenizers==0.5.2
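The comment in the hunk above describes the situation this sketch illustrates: `pytorch_pretrained_bert` and `transformers` are the same codebase published under two names, so both can be installed and imported side by side without clashing. A minimal check of which names are importable in the current environment (the `available` helper is illustrative, not part of jiant):

```python
# Sketch: pytorch_pretrained_bert (needed by allennlp 0.8.4) and
# transformers (needed directly by jiant) are distinct module names,
# so installing both does not conflict.
import importlib.util

def available(module_name):
    """Return True if `module_name` can be imported in this environment."""
    return importlib.util.find_spec(module_name) is not None

# A jiant environment built from this environment.yml should report
# True for both names.
for name in ("pytorch_pretrained_bert", "transformers"):
    print(name, available(name))
```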
3 changes: 2 additions & 1 deletion setup.py
@@ -65,7 +65,8 @@
"pyhocon==0.3.35",
"python-Levenshtein==0.12.0",
"sacremoses",
-"transformers==2.3.0",
+"transformers==2.6.0",
+"tokenizers==0.5.2",
"ftfy",
"spacy",
],
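Since this PR has to update the `transformers` pin in both `environment.yml` and `setup.py`, one easy way for the two files to drift is a version bump landing in only one place. A small sketch of a consistency check (the inlined file snippets and the `pins` helper are illustrative, not part of the repository):

```python
# Sketch: cross-check that name==version pins agree between
# environment.yml and setup.py. The strings below stand in for the
# real file contents.
import re

ENV_YML = """
  - transformers==2.6.0
  - tokenizers==0.5.2
"""

SETUP_PY = """
    "transformers==2.6.0",
    "tokenizers==0.5.2",
"""

PIN = re.compile(r'([A-Za-z0-9_]+)==([0-9][^",\s]*)')

def pins(text):
    """Return {package: version} for every name==version pin in `text`."""
    return dict(PIN.findall(text))

env, setup = pins(ENV_YML), pins(SETUP_PY)
mismatched = {p for p in env.keys() & setup.keys() if env[p] != setup[p]}
print(sorted(mismatched))  # prints [] when the two files agree
```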