Upgrade transformers to v4.50.3
#13905
Conversation
Signed-off-by: Harry Mellor <[email protected]>
This pull request has merge conflicts that must be resolved before it can be merged.
Has this been resolved? We want to upgrade to v4.50 for the Gemma 3 release.

Not yet, we're expecting a release sometime next week.

Can we also bump the lm-format-enforcer version for V0?

V1 and V0 do not have different requirements.
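For context, a version bump like the one discussed above is a one-line change in the shared requirements file. The file name and exact pins below are illustrative, not this PR's actual diff:

```text
# requirements/common.txt (illustrative file name)
transformers >= 4.50.3   # the version this PR upgrades to
lm-format-enforcer       # bump discussed above; exact bound not specified in the thread
```

Since V1 and V0 share one requirements set, a single bump covers both.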
Changed the title: "transformers to v4.49.0" → "transformers to v4.50.0"
Oh, I thought it wasn't bumped. Please disregard that message, my bad 😃

As mentioned before, let's just skip those tests where the HF repo can't keep up with the latest transformers version.

What is the preferred method of skipping?
Let's add a new field (the tests for …)

Ok, I can make this change and use it for models which are not currently compatible with vLLM because they're outdated. For these ones it's only the …
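As an illustrative sketch only, the "new field" discussed above could gate each model test on the minimum transformers version its HF repo needs. The field name `min_transformers_version` and the registry shape are assumptions here, not vLLM's actual API:

```python
from typing import Optional

def parse_version(v: str) -> tuple:
    """Turn a version string like '4.50.3' into (4, 50, 3) for comparison."""
    return tuple(int(part) for part in v.split("."))

def should_skip(installed: str, required: Optional[str]) -> bool:
    """Skip when the model requires a newer transformers than is installed."""
    if required is None:
        return False
    return parse_version(installed) < parse_version(required)

# A hypothetical registry entry declaring its requirement:
model_info = {"min_transformers_version": "4.50.3"}

print(should_skip("4.49.0", model_info["min_transformers_version"]))  # True
print(should_skip("4.50.3", model_info["min_transformers_version"]))  # False
```

In a real test suite this predicate would feed a `pytest.mark.skipif`, so outdated models skip cleanly instead of failing on import.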
Yeah, they do work with vLLM. I think the other models would also work with vLLM if not for the outdated imports...

Yeah, the outdated imports are the only blocker.
The two further issues I found have been added to the description. Fix PRs have already been merged in transformers, we'd just need another patch release.

Changed the title: "transformers to v4.50.2" → "transformers to v4.50.3"

This pull request has merge conflicts that must be resolved before it can be merged.
Supersedes #13602
Depends on:
- Replace `LogitsWarper` with `LogitsProcessor`: noamgat/lm-format-enforcer#159
- `src/transformers/image_utils.py`: huggingface/transformers#36435

Models which are not compatible with the latest version of Transformers: