
Conversation

@shen-shanshan
Contributor

@shen-shanshan shen-shanshan commented Feb 25, 2025

What this PR does?

Modify the modelscope API usage in `transformer_utils`.

Find more details at this issue in modelscope.

The `_try_login()` helper has been removed; now you only need the code below:

```python
from modelscope.hub.api import HubApi

api = HubApi()
api.login(token)
```

If `token` is `None`, this method does not raise; if the token is invalid, it raises an error.

This is easier to use.
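The contract described above (a `None` token is silently ignored, an invalid token raises) can be sketched with a hypothetical stand-in class. This is illustrative only, not the real `modelscope.hub.api.HubApi`, and the `ms-` token prefix is an assumption for the sake of the example:

```python
# Hypothetical stand-in illustrating the login contract described above;
# the real class is modelscope.hub.api.HubApi.
class FakeHubApi:
    def login(self, token):
        if token is None:
            # A missing token is silently ignored (anonymous access).
            return
        if not token.startswith("ms-"):  # assumed token format, for illustration
            raise ValueError("invalid ModelScope token")
        self._token = token

api = FakeHubApi()
api.login(None)         # no error: None tokens are skipped
api.login("ms-abc123")  # a well-formed token is accepted
```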

Signed-off-by: Shanshan Shen <[email protected]>
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which executes a small, essential subset of CI tests to catch errors quickly. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the `ready` label to the PR or enable auto-merge.

🚀

@Isotr0py Isotr0py enabled auto-merge (squash) February 25, 2025 07:07
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Feb 25, 2025
@simon-mo simon-mo merged commit 2d87d7d into vllm-project:main Feb 25, 2025
37 of 42 checks passed
@ShangmingCai
Contributor

For those who might care: this fix only works with the latest version of modelscope. If you encounter an error like:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/root/miniconda3/envs/test1/lib/python3.10/site-packages/modelscope/hub/api.py", line 133, in login
    raise_for_http_status(r)
  File "/root/miniconda3/envs/test1/lib/python3.10/site-packages/modelscope/hub/errors.py", line 205, in raise_for_http_status
    raise HTTPError(http_error_msg, response=rsp)
requests.exceptions.HTTPError: 400 Client Error: Bad Request, Request id: 0c4264741c954c5da8b46a559b9ac5a5 for url: https://www.modelscope.cn/api/v1/login, body: b'{"AccessToken": null}'
```

please update modelscope with `pip3 install modelscope -U` to solve this problem.
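If you are unsure whether your installed modelscope is new enough, a plain version-tuple comparison is sufficient. This is a sketch: `parse_version` is a simplified stand-in for `packaging.version.parse` (it assumes plain `major.minor.patch` strings), and the `1.23.0` threshold is inferred from the `modelscope<1.23.0` pins discussed later in this thread:

```python
# Sketch: decide whether an installed version string meets a minimum.
def parse_version(v):
    """Simplified stand-in for packaging.version.parse (major.minor.patch only)."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed, minimum="1.23.0"):
    """Return True if `installed` is older than `minimum`."""
    return parse_version(installed) < parse_version(minimum)

# modelscope 1.22.3 predates the API change, so it would hit the error above.
print(needs_upgrade("1.22.3"))  # True
print(needs_upgrade("1.23.0"))  # False
```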

wangxiyuan pushed a commit to vllm-project/vllm-ascend that referenced this pull request Mar 9, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve:
vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
wangxiyuan pushed a commit to vllm-project/vllm-ascend that referenced this pull request Mar 9, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve:
vllm-project/vllm#13807

Backport: #272

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
wangxiyuan pushed a commit to vllm-project/vllm-ascend that referenced this pull request Mar 11, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 match vLLM 0.7.3

vllm-project/vllm#13807 is not included in
v0.7.3, thus we just pin it to branch v0.7.3-dev

### Does this PR introduce _any_ user-facing change?
N/A

### How was this patch tested?
CI passed with existing test.

Signed-off-by: MengqingCao <[email protected]>
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Mar 18, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve:
vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
Signed-off-by: angazenn <[email protected]>
wangxiyuan pushed a commit to vllm-project/vllm-ascend that referenced this pull request Mar 26, 2025
### What this PR does / why we need it?

Add support for V1 Engine on v0.7.3.

### Does this PR introduce _any_ user-facing change?

Find more details at
#295.

Plus, due to a bug of `vllm v0.7.3` when using modelscope, you should
use a lower version of `modelscope`. Find more details at
vllm-project/vllm#13807.

This can work:

```bash
pip install modelscope==1.21.1
```

### How was this patch tested?

Find more details at
#295.

Signed-off-by: shen-shanshan <[email protected]>
Co-authored-by: didongli182 <[email protected]>
@Yikun
Collaborator

Yikun commented Apr 2, 2025

This issue has been resolved since v0.8.0. For vLLM v0.7.3, use: `pip install "modelscope<1.23.0"`
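The compatibility rule above can be summarized as a small helper. This is a sketch only (`modelscope_constraint` is a hypothetical name, and the version parsing assumes plain `major.minor.patch` strings with an optional `v` prefix):

```python
def _ver(v):
    """Parse 'v0.7.3' or '0.7.3' into a comparable tuple (simplified)."""
    return tuple(int(p) for p in v.lstrip("v").split("."))

def modelscope_constraint(vllm_version):
    """Return the modelscope pip requirement for a given vLLM version.

    Per the discussion above: vLLM >= 0.8.0 works with current modelscope;
    vLLM v0.7.3 (and earlier) needs modelscope < 1.23.0.
    """
    if _ver(vllm_version) >= (0, 8, 0):
        return "modelscope"
    return "modelscope<1.23.0"

print(modelscope_constraint("v0.7.3"))  # modelscope<1.23.0
print(modelscope_constraint("0.8.0"))   # modelscope
```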

lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
ttanzhiqiang pushed a commit to ttanzhiqiang/vllm-ascend that referenced this pull request Apr 27, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve:
vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Oct 21, 2025
### What this PR does / why we need it?
Pin modelscope<1.23.0 on vLLM v0.7.3 to resolve:
vllm-project/vllm#13807

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI passed

Signed-off-by: Yikun Jiang <[email protected]>