Updating to Transformers v2.6.0 #1059

Merged
zphang merged 4 commits into master from transformers_260
Apr 22, 2020

Conversation

@zphang
Collaborator

@zphang zphang commented Apr 10, 2020

Performance comparison on a set of representative tasks.

| Task | albert-xxlarge-v2 (#990) | roberta-large (#990) | albert-xxlarge-v2 (v2.6.0) | roberta-large (v2.6.0) |
|---|---|---|---|---|
| wsc | 0.76 | 0.692 | 0.769 | 0.692 |
| rte | 0.888 | 0.866 | 0.892 | 0.848 |
| wic | 0.74 | 0.724 | 0.741 | 0.721 |
| sst | 0.955 | 0.961 | 0.956 | 0.956 |
| cosmosqa | 0.847 | 0.814 | 0.846 | 0.822 |
| qa-srl | 0.641 | 0.664 | 0.641 | 0.666 |

@zphang zphang force-pushed the transformers_260 branch from 71a2933 to e560108 on April 10, 2020 21:42
@zphang zphang mentioned this pull request Apr 10, 2020
@@ -39,8 +39,9 @@ dependencies:
- sacremoses

# Warning: jiant currently depends on *both* pytorch_pretrained_bert > 0.6 _and_
Contributor

Check if this is still true: is the old package still a dependency via Allen? If not, delete.

Collaborator Author

allennlp v0.8.4 still depends on pytorch-pretrained-bert, and we haven't changed the allennlp requirement.

Contributor

Dang. I guess it's not worth the effort to update the Allen dependency, assuming that the 2.0 migration is coming up fairly soon.

@sleepinyourhat
Contributor

Looks good so far! Thanks!

@sleepinyourhat
Contributor

Since you asked, this still looks good to me. If the testing infrastructure is all ready to go, though, it couldn't hurt to kick off tests with 2.8.0 now, too.

@zphang
Collaborator Author

zphang commented Apr 19, 2020

Updated the full table. I think we should be good to merge.

Given the upcoming deadlines, I recommend waiting till after EMNLP to update transformers again.

Contributor

@pyeres pyeres left a comment

Thanks @zphang. Can you say a few words in the description about what's being done here in addition to the version bump (e.g., the XLMRobertaTokenizer changes)? This will help me make sure I understand the PR, and it'll help inform the next set of release notes.

@sleepinyourhat
Contributor

@zphang Why delay the merge? If we've vetted it to our usual degree, then we should get some additional speedups/options out of this PR.

Of course, it's not great to make major changes after a round of experiments has already started, but the solution to that would just be to maintain a separate branch for each major experiment, which is a good idea in any case.

@zphang zphang force-pushed the transformers_260 branch from dea9fc4 to 72b05d8 on April 20, 2020 18:36
@zphang
Collaborator Author

zphang commented Apr 20, 2020

@pyeres I've removed the commit concerning the XLMRobertaTokenizer. This PR should only update the requirements (transformers and tokenizers).

@sleepinyourhat To clarify, I support merging in the update to v2.6.0 now, and putting off the update to v2.8.0.
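For reference, a requirements bump like this amounts to a small change in the project's environment file. This is a hedged sketch, not the PR's actual diff: the thread confirms transformers v2.6.0, but the exact tokenizers pin is not stated in this conversation.

```
# environment.yml excerpt (sketch): pip-installed deps inside a conda env.
# transformers==2.6.0 is the version this PR targets; tokenizers is left
# unpinned here because the matching pin isn't given in this thread.
dependencies:
  - pip:
      - transformers==2.6.0
      - tokenizers
```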

@zphang
Collaborator Author

zphang commented Apr 20, 2020

Unrelated: test_dynamic_masking failed on the first pass, and passed on rerunning.
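Intermittent failures like this usually trace back to unseeded randomness in the masking routine. A minimal sketch, with a hypothetical `dynamic_mask` function rather than jiant's actual implementation, of how passing an explicitly seeded RNG makes such a test deterministic:

```python
import random

def dynamic_mask(tokens, mask_prob=0.15, rng=None):
    """Replace each token with [MASK] independently with probability mask_prob."""
    rng = rng or random.Random()
    return ["[MASK]" if rng.random() < mask_prob else t for t in tokens]

# With a fixed seed the masking pattern is reproducible, so a test
# asserting on the output cannot flake across runs.
out1 = dynamic_mask(list("abcdefgh"), rng=random.Random(0))
out2 = dynamic_mask(list("abcdefgh"), rng=random.Random(0))
```

Tests that instead rely on the global RNG state (or on a statistical property of a small sample) can pass or fail depending on run order, which matches the failed-then-passed behavior seen here.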

@zphang zphang changed the title [WIP] Updating to Transformers v2.6.0 Updating to Transformers v2.6.0 Apr 20, 2020
Contributor

@pyeres pyeres left a comment

Thanks for running the validations, @zphang. Looks good.

@zphang zphang merged commit dae9dc6 into master Apr 22, 2020
@zphang zphang deleted the transformers_260 branch April 22, 2020 02:33
@pyeres pyeres mentioned this pull request Apr 28, 2020
@jeswan jeswan added the jiant-v1-legacy Relevant to versions <= v1.3.2 label Sep 17, 2020
leo-liuzy pushed a commit to leo-liuzy/dynamic_jiant that referenced this pull request Nov 11, 2020
* Transformers v2.6.0

* requirements update
