Conversation

@AK391 AK391 commented Jan 11, 2022

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

ydshieh and others added 28 commits January 7, 2022 20:02
* fix doc example - TypeError: get_text_features() got an unexpected keyword argument 'token_type_ids'

* add token_type_ids param

Co-authored-by: ydshieh <[email protected]>
* Fix convert for newer megatron-lm models

* Save megatron-bert config in a proper way

* Fix code style
* up

* up

* up

* up

* up

* up

* improve

* up

* up

* Update src/transformers/trainer.py

* up

* up

* up
* Make OpenAIGPTTokenizer work with SpaCy 3.x

SpaCy 3.x introduced an API change to creating the tokenizer that
breaks OpenAIGPTTokenizer. The old API for creating the tokenizer in
SpaCy 2.x no longer works under SpaCy 3.x, but the new API for creating
the tokenizer in SpaCy 3.x DOES work under SpaCy 2.x. Switching to the
new API should allow OpenAIGPTTokenizer to work under both SpaCy 2.x and
SpaCy 3.x versions.

* Add is_spacy_available and is_ftfy_available methods to file utils

* Add spacy and ftfy unittest decorator to testing utils

* Add tests for OpenAIGPTTokenizer that require spacy and ftfy

* Modify CircleCI config to run tests that require spacy and ftfy

* Remove unneeded unittest decorators and reuse test code

* Run make fixup
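
The spaCy fix above comes down to reading the tokenizer off the pipeline object instead of calling the removed `Defaults.create_tokenizer` factory. A minimal sketch of the difference, based on spaCy's documented API rather than the PR diff itself:

```python
# A hedged sketch of the spaCy API difference described above; attribute names
# follow spaCy's documentation, not the PR diff itself.
from spacy.lang.en import English

nlp = English()

# spaCy 2.x style (removed in spaCy 3.x):
#   tokenizer = nlp.Defaults.create_tokenizer(nlp)

# spaCy 3.x style, which also works under spaCy 2.x:
tokenizer = nlp.tokenizer
print([token.text for token in tokenizer("OpenAIGPTTokenizer needs a word-level tokenizer.")])
```
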
* support the trocr small models

* resolve conflict

* Update docs/source/model_doc/trocr.mdx

Co-authored-by: NielsRogge <[email protected]>

* Update docs/source/model_doc/trocr.mdx

Co-authored-by: NielsRogge <[email protected]>

* Update docs/source/model_doc/trocr.mdx

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/trocr/processing_trocr.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/trocr/processing_trocr.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/trocr/processing_trocr.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/trocr/processing_trocr.py

Co-authored-by: NielsRogge <[email protected]>

* fix unexpected indent in processing_trocr.py

* Update src/transformers/models/trocr/processing_trocr.py

Co-authored-by: NielsRogge <[email protected]>

* update the docstring of processing_trocr

* remove extra space

Co-authored-by: NielsRogge <[email protected]>
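
For context, a hedged usage sketch for one of the small TrOCR checkpoints added here; the checkpoint name and image URL are assumptions taken from the TrOCR documentation and may change:

```python
# A hedged sketch of running one of the small TrOCR checkpoints; the checkpoint
# name and image URL are assumptions based on the TrOCR documentation.
import requests
from PIL import Image
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

processor = TrOCRProcessor.from_pretrained("microsoft/trocr-small-handwritten")
model = VisionEncoderDecoderModel.from_pretrained("microsoft/trocr-small-handwritten")

url = "https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```
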
…bute 'from_question_encoder_generator_pretrained' (#15076)

Co-authored-by: ydshieh <[email protected]>
It's better for, e.g., notebooks.
* fix doc examples

* remove double colons
* [performance doc] Power and Cooling

* more docs

* Update docs/source/performance.mdx

Co-authored-by: Sylvain Gugger <[email protected]>

* reword

Co-authored-by: Sylvain Gugger <[email protected]>
* Start the work on TFVisionEncoderDecoderModel

* Expose TFVisionEncoderDecoderModel

* fix import

* Add modeling_tf_vision_encoder_decoder to _ignore_modules in get_model_modules()

* reorder

* Apply the fix for checkpoint loading as in #14016

* remove attention_mask + fix VISION_DUMMY_INPUTS

* A minimal change to make TF generate() work for vision models as encoder in encoder-decoder setting

* fix wrong condition: shape_list(input_ids) == 2

* add tests

* use personal TFViTModel checkpoint (for now)

* Add equivalence tests + projection layer

* style

* make sure projection layer can run

* Add examples

* Apply suggestions from code review

Co-authored-by: Sylvain Gugger <[email protected]>

* Clean comments (need to work on TODOs for PyTorch models)

* Remove TF -> PT in check_pt_tf_equivalence for TFVisionEncoderDecoderModel

* fixes

* Revert changes in PT code.

* Update tests/test_modeling_tf_vision_encoder_decoder.py

Co-authored-by: Patrick von Platen <[email protected]>

* Add test_inference_coco_en for TF test

* fix quality

* fix name

* build doc

* add main_input_name

* Fix ckpt name in test

* fix diff between master and this PR

* fix doc

* fix style and quality

* fix missing doc

* fix labels handling

* Delete auto.rst

* Add the changes done in #14016

* fix prefix

* Apply suggestions from code review

Co-authored-by: Sylvain Gugger <[email protected]>

* make style

Co-authored-by: ydshieh <[email protected]>
Co-authored-by: Sylvain Gugger <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
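
A hedged sketch of how the new TF class can be composed from a vision encoder and a text decoder, mirroring the PyTorch `VisionEncoderDecoderModel` API; the checkpoint names are illustrative:

```python
# A hedged sketch of composing the new TF class from separate encoder/decoder
# checkpoints; names mirror the PyTorch API and are illustrative only.
import numpy as np
from PIL import Image
from transformers import GPT2Tokenizer, TFVisionEncoderDecoderModel, ViTFeatureExtractor

model = TFVisionEncoderDecoderModel.from_encoder_decoder_pretrained(
    "google/vit-base-patch16-224-in21k", "gpt2"
)
feature_extractor = ViTFeatureExtractor.from_pretrained("google/vit-base-patch16-224-in21k")
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# generate() needs decoder start / pad token ids; GPT-2 has no pad token,
# so its BOS token is reused here.
model.config.decoder_start_token_id = tokenizer.bos_token_id
model.config.pad_token_id = tokenizer.bos_token_id

image = Image.fromarray(np.zeros((224, 224, 3), dtype=np.uint8))
pixel_values = feature_extractor(images=image, return_tensors="tf").pixel_values
generated_ids = model.generate(pixel_values, max_length=16)
print(tokenizer.batch_decode(generated_ids.numpy(), skip_special_tokens=True))
```

Note that a freshly composed encoder-decoder has randomly initialized cross-attention, so the output is only meaningful after fine-tuning (or when loading a trained checkpoint such as the COCO captioning one used in the tests).
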
* Add test

* Add tests for the reported train loss
* Take gradient accumulation into account when defining samplers

* style
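
A minimal sketch of the idea behind the sampler fix (not the Trainer's actual code): with gradient accumulation enabled, the batch size a length-grouped sampler should reason about is the effective one.

```python
# Minimal sketch, not the Trainer's actual code: a sampler that groups samples
# of similar length should see the *effective* batch size, i.e. the per-device
# batch size multiplied by the number of accumulation steps.
per_device_train_batch_size = 8
gradient_accumulation_steps = 4

effective_batch_size = per_device_train_batch_size * gradient_accumulation_steps
print(effective_batch_size)  # 32 -- what the length-grouped sampler should use
```
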
* Add IBertOnnxConfig and tests

* add all the supported features for IBERT and remove outputs in IbertOnnxConfig

* use OnnxConfig

* fix codestyle

* remove serialization.rst

* codestyle
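
A hedged sketch of exporting an I-BERT checkpoint through the new ONNX config, using the `transformers.onnx` export API of this release; the module path and checkpoint name are assumptions:

```python
# A hedged sketch of using the new IBertOnnxConfig with the transformers.onnx
# export API; the import path and checkpoint name are assumptions.
from pathlib import Path

from transformers import AutoModel, AutoTokenizer
from transformers.models.ibert.configuration_ibert import IBertOnnxConfig
from transformers.onnx import export

model_name = "kssteven/ibert-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

onnx_config = IBertOnnxConfig(model.config, task="default")
onnx_inputs, onnx_outputs = export(
    tokenizer, model, onnx_config, onnx_config.default_onnx_opset, Path("ibert.onnx")
)
print(onnx_inputs, onnx_outputs)
```
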
It solves the problem that metric_key_prefix is different from the Trainer's.
* Initial commit

* Config and modelling changes

Added Nystromformer-specific attributes to config and removed all decoder functionality from modelling.

* Modelling and test changes

Added Nystrom approximation and removed decoder tests.

* Code quality fixes

* Modeling changes and conversion script

Initial commits to conversion script, modeling changes.

* Minor modeling changes and conversion script

* Modeling changes

* Correct modeling, add tests and documentation

* Code refactor

* Remove tokenizers

* Code refactor

* Update __init__.py

* Fix bugs

* Update src/transformers/__init__.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/__init__.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/__init__.py

Co-authored-by: NielsRogge <[email protected]>

* Update docs/source/model_doc/nystromformer.mdx

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/configuration_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/configuration_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/configuration_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/configuration_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/convert_nystromformer_original_pytorch_checkpoint_to_pytorch.py

Co-authored-by: NielsRogge <[email protected]>

* Update src/transformers/models/nystromformer/configuration_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update modeling and test_modeling

* Code refactor

* .rst to .mdx

* doc changes

* Doc changes

* Update modeling_nystromformer.py

* Doc changes

* Fix copies

* Apply suggestions from code review

Co-authored-by: NielsRogge <[email protected]>

* Apply suggestions from code review

Co-authored-by: NielsRogge <[email protected]>

* Update configuration_nystromformer.py

* Fix copies

* Update tests/test_modeling_nystromformer.py

Co-authored-by: NielsRogge <[email protected]>

* Update test_modeling_nystromformer.py

* Apply suggestions from code review

Co-authored-by: Lysandre Debut <[email protected]>

* Fix code style

* Update modeling_nystromformer.py

* Update modeling_nystromformer.py

* Fix code style

* Reformat modeling file

* Update modeling_nystromformer.py

* Modify NystromformerForMultipleChoice

* Fix code quality

* Apply suggestions from code review

Co-authored-by: Sylvain Gugger <[email protected]>

* Code style changes and torch.no_grad()

* make style

* Apply suggestions from code review

Co-authored-by: NielsRogge <[email protected]>
Co-authored-by: Lysandre Debut <[email protected]>
Co-authored-by: Sylvain Gugger <[email protected]>
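
A hedged usage sketch for the new Nystromformer model; the `uw-madison/nystromformer-512` checkpoint name is an assumption taken from the model documentation:

```python
# A hedged usage sketch for the new Nystromformer model; the checkpoint name
# is an assumption taken from the model documentation.
import torch
from transformers import AutoTokenizer, NystromformerModel

tokenizer = AutoTokenizer.from_pretrained("uw-madison/nystromformer-512")
model = NystromformerModel.from_pretrained("uw-madison/nystromformer-512")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```
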
@AK391 AK391 merged commit 5d7c433 into AK391:master Jan 11, 2022