[Bugfix] Modify modelscope api usage in transformer_utils #13807
Conversation
Signed-off-by: Shanshan Shen <[email protected]>
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.
For those who might care: please update your modelscope with
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve: vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve: vllm-project/vllm#13807

Backport: #272

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
### What this PR does / why we need it?
Pin modelscope<1.23.0 to match vLLM v0.7.3. vllm-project/vllm#13807 is not included in v0.7.3, so we pin it on the v0.7.3-dev branch.

### Does this PR introduce _any_ user-facing change?
N/A

### How was this patch tested?
CI passed with existing tests.

Signed-off-by: MengqingCao <[email protected]>
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve: vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
Signed-off-by: angazenn <[email protected]>
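The pin applied in these backports could be expressed in a requirements file like this (illustrative; the actual file name and location in each repo may differ):

```
modelscope<1.23.0
```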
### What this PR does / why we need it?
Add support for V1 Engine on v0.7.3.

### Does this PR introduce _any_ user-facing change?
Find more details at #295. Also, due to a bug in `vllm v0.7.3` when using modelscope, you should use a lower version of `modelscope`. Find more details at vllm-project/vllm#13807. This works:

```bash
pip install modelscope==1.21.1
```

### How was this patch tested?
Find more details at #295.

Signed-off-by: shen-shanshan <[email protected]>
Co-authored-by: didongli182 <[email protected]>
This issue has been resolved since v0.8.0. For vLLM v0.7.3:
…ct#13807)

Signed-off-by: Louis Ulmer <[email protected]>
What this PR does?

Modify the modelscope API usage in `transformer_utils`. Find more details at this issue in modelscope.

The `_try_login()` helper has been removed; now you just need the code below. If the token is `None`, this method will not throw; if the token is invalid, it will throw an error. This is easier to use.
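The token semantics described above can be sketched as follows. This is a minimal hypothetical stand-in, not modelscope's actual implementation; the `login` function and its validity check are invented purely to illustrate the "None is a no-op, invalid token raises" behavior:

```python
# Hypothetical stand-in for the login behavior described in this PR.
# NOT modelscope's real code: the validity check below is invented
# for illustration only.
def login(token):
    """No-op when token is None; raise when the token is invalid."""
    if token is None:
        return False  # skip login silently, as described above
    if not isinstance(token, str) or not token:
        raise ValueError("invalid modelscope token")
    # A real implementation would authenticate against the hub here.
    return True
```

With these semantics, callers no longer need a separate `_try_login()` wrapper: passing `None` is safe, and a bad token surfaces as an error at the call site.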