
Conversation

@MengqingCao (Collaborator) commented Mar 11, 2025

What this PR does / why we need it?

Pin modelscope<1.23.0 to match vLLM 0.7.3.

vllm-project/vllm#13807 is not included in v0.7.3, so we pin modelscope on the v0.7.3-dev branch.

Does this PR introduce any user-facing change?

N/A

How was this patch tested?

CI passed with existing test.

@wangxiyuan (Collaborator)
With change 5804c6c, does the error still exist?

@MengqingCao (Collaborator, author)
> With change 5804c6c, does the error still exist?

Yes, here is part of the error log from https://github.com/vllm-project/vllm-ascend/actions/runs/13781939634/job/38541606410?pr=292:

  File "/usr/local/python3.10/lib/python3.10/site-packages/vllm/transformers_utils/config.py", line 130, in lookup_files
    return modelscope_list_repo_files(repo_id,
  File "/usr/local/python3.10/lib/python3.10/site-packages/vllm/transformers_utils/utils.py", line 32, in modelscope_list_repo_files
    from modelscope.utils.hf_util import _try_login
ImportError: cannot import name '_try_login' from 'modelscope.utils.hf_util'
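The traceback shows that the installed modelscope release no longer exports `_try_login`, while vLLM 0.7.3 still imports it, hence the `modelscope<1.23.0` pin. A minimal sketch of the version check the pin encodes (assuming plain `X.Y.Z` version strings; this is illustrative, not vLLM's or modelscope's actual code):

```python
# Sketch: does an installed modelscope version satisfy the
# "modelscope<1.23.0" pin that vLLM 0.7.3 needs?

def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '1.22.3' into an int tuple."""
    return tuple(int(part) for part in v.split("."))

def satisfies_pin(installed: str, upper_bound: str = "1.23.0") -> bool:
    """True if installed < upper_bound (tuple comparison is lexicographic)."""
    return parse_version(installed) < parse_version(upper_bound)

# 1.22.x is below the pin; 1.23.0 is the first version that breaks the import.
print(satisfies_pin("1.22.3"))  # True  -> compatible with vLLM 0.7.3
print(satisfies_pin("1.23.0"))  # False -> would hit the ImportError above
```

In real tooling one would use `packaging.version.Version` rather than a hand-rolled parser, but the comparison is the same.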

@wangxiyuan (Collaborator) commented Mar 11, 2025

So why not pin the modelscope version in CI as well?
https://github.com/vllm-project/vllm-ascend/blob/v0.7.3-dev/requirements-dev.txt#L2
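The suggested fix is a one-line constraint in the dev requirements file. An illustrative excerpt (the exact contents of `requirements-dev.txt` live in the linked repo; this line only shows the pin's shape):

```
# requirements-dev.txt (illustrative excerpt)
modelscope<1.23.0
```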

@MengqingCao MengqingCao changed the title [CI] Patch modelscope list repo files to fix CI [CI][v0.7.3] Pin modelscope<1.23.0 match vLLM 0.7.3 Mar 11, 2025
@MengqingCao (Collaborator, author)
> so why not pin the modelscope version in CI as well? https://github.com/vllm-project/vllm-ascend/blob/v0.7.3-dev/requirements-dev.txt#L2

Got it, done.

@wangxiyuan wangxiyuan merged commit f8dd4f5 into vllm-project:v0.7.3-dev Mar 11, 2025
6 checks passed