
Conversation

Contributor

@JetRunner JetRunner commented Aug 14, 2020

This one solves it once and for all. What do you think? @sgugger @LysandreJik
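The diff itself is not shown in this conversation, but the branch name (hash_output_dir) suggests keying test output directories to a hash so they always get a stable, predictable name and can be cleaned up reliably. A minimal sketch of that idea (all names here are hypothetical, not the actual code from this PR):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def hashed_test_dir(test_id: str, root: Path) -> Path:
    """Map a test identifier to a stable output directory.

    The same test_id always hashes to the same directory name, so a
    leftover directory from a previous (possibly crashed) run can be
    found and removed deterministically instead of accumulating.
    """
    digest = hashlib.sha1(test_id.encode("utf-8")).hexdigest()[:12]
    path = root / f"test_output_{digest}"
    path.mkdir(parents=True, exist_ok=True)
    return path

root = Path(tempfile.gettempdir())
out_dir = hashed_test_dir("tests/test_trainer.py::test_save", root)
# ... run the test, writing artifacts into out_dir ...
shutil.rmtree(out_dir)  # the deterministic name makes cleanup reliable
```

This also explains the reviewer's point below: because the directory name is derived from the test identifier rather than typed by hand, copy/pasting a test setup cannot accidentally reuse or clobber another test's directory.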


codecov bot commented Aug 14, 2020

Codecov Report

Merging #6475 into master will decrease coverage by 1.13%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #6475      +/-   ##
==========================================
- Coverage   80.55%   79.42%   -1.13%     
==========================================
  Files         153      153              
  Lines       28001    28001              
==========================================
- Hits        22556    22241     -315     
- Misses       5445     5760     +315     
Impacted Files                               Coverage          Δ
src/transformers/modeling_tf_mobilebert.py   24.55% <0.00%>    (-72.36%) ⬇️
src/transformers/generation_tf_utils.py      84.21% <0.00%>    (-2.26%)  ⬇️
src/transformers/modeling_tf_utils.py        86.31% <0.00%>    (-0.98%)  ⬇️
src/transformers/pipelines.py                79.94% <0.00%>    (+0.25%)  ⬆️
src/transformers/modeling_tf_gpt2.py         95.01% <0.00%>    (+23.16%) ⬆️
src/transformers/tokenization_albert.py      87.50% <0.00%>    (+58.65%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 05810cd...646a7dc. Read the comment docs.

Member

@LysandreJik LysandreJik left a comment


There were probably simpler approaches, but this is less error-prone when copy/pasting, LGTM! Thanks for diving into it!

@JetRunner JetRunner merged commit eb613b5 into master Aug 14, 2020
@JetRunner JetRunner deleted the hash_output_dir branch August 14, 2020 07:34
@sgugger
Collaborator

sgugger commented Aug 14, 2020

Agreed, thanks for the fix!

sgugger added a commit that referenced this pull request Aug 14, 2020
* Generation doc

* MBartForConditionalGeneration (#6441)

* add MBartForConditionalGeneration

* style

* rebase and fixes

* add mbart test in TEST_FILES_WITH_NO_COMMON_TESTS

* fix docs

* don't ignore mbart

* doc

* fix mbart fairseq link

* put mbart before bart

* apply doc suggestions

* Use hash to clean the test dirs (#6475)

* Use hash to clean the test dirs

* Use hash to clean the test dirs

* Use hash to clean the test dirs

* fix

* [EncoderDecoder] Add Cross Attention for GPT2 (#6415)

* add cross attention layers for gpt2

* make gpt2 cross attention work

* finish bert2gpt2

* add explicit comments

* remove attention mask since not yet supported

* revert attn mask in pipeline

* Update src/transformers/modeling_gpt2.py

Co-authored-by: Sylvain Gugger <[email protected]>

* Update src/transformers/modeling_encoder_decoder.py

Co-authored-by: Sylvain Gugger <[email protected]>

Co-authored-by: Sylvain Gugger <[email protected]>

* Sort unique_no_split_tokens to make it deterministic (#6461)

* change unique_no_split_tokens's type to set

* use sorted list instead of set

* style

* Import accuracy_score (#6480)

* Apply suggestions from code review

Co-authored-by: Lysandre Debut <[email protected]>

* Address comments

* Styling

* Generation doc

* Apply suggestions from code review

Co-authored-by: Lysandre Debut <[email protected]>

* Address comments

* Styling

Co-authored-by: Suraj Patil <[email protected]>
Co-authored-by: Kevin Canwen Xu <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
Co-authored-by: Quentin Lhoest <[email protected]>
Co-authored-by: gijswijnholds <[email protected]>
Co-authored-by: Lysandre Debut <[email protected]>