Conversation

@paulyu12 (Collaborator) commented May 30, 2025

What this PR does / why we need it?

Because the EOL vLLM v0.7.3 lacks vllm-project/vllm#13166, launching Qwen3+LoRA on vllm-ascend v0.7.3 raises the error `"Qwen3ForCausalLM" object has no attribute "embedding_modules"`.

Instead, we modify qwen3.py so that Qwen3+LoRA works on vllm-ascend v0.7.3.
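The missing attribute comes from vLLM's LoRA support convention, where the model class exposes class-level attributes (`packed_modules_mapping`, `supported_lora_modules`, `embedding_modules`, `embedding_padding_modules`) that the LoRA manager reads. A minimal standalone sketch of the idea, with the vLLM machinery omitted and the exact module mappings shown only as illustrative values (they follow the pattern used for Qwen2-style models, not necessarily the exact contents of this PR):

```python
# Sketch: vLLM's LoRA plumbing looks up class-level attributes such as
# `embedding_modules` on the model class. On vLLM v0.7.3,
# Qwen3ForCausalLM lacks them, which triggers the AttributeError
# quoted in this PR. Adding the attributes resolves the lookup.

class Qwen3ForCausalLM:  # stand-in for the real vLLM model class
    # Fused projections and the submodules they pack together
    # (illustrative values following the Qwen2-style convention).
    packed_modules_mapping = {
        "qkv_proj": ["q_proj", "k_proj", "v_proj"],
        "gate_up_proj": ["gate_proj", "up_proj"],
    }
    # Module names that LoRA adapters may target.
    supported_lora_modules = [
        "qkv_proj", "o_proj", "gate_up_proj", "down_proj",
    ]
    # No LoRA-targetable embedding modules in this sketch.
    embedding_modules = {}
    embedding_padding_modules = []


# Without the attributes above, this access is what fails with
# '"Qwen3ForCausalLM" object has no attribute "embedding_modules"'.
model = Qwen3ForCausalLM()
print(model.embedding_modules)  # -> {}
```

The attributes are class-level rather than instance-level because vLLM inspects them before (and independently of) model instantiation when deciding whether a model supports LoRA.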

Does this PR introduce any user-facing change?

No.

How was this patch tested?

@paulyu12 paulyu12 changed the title [V0.7.3][Qwen3] Make v0.7.3 support qwen3 [V0.7.3][Qwen3] Make v0.7.3 support Qwen3+LoRA May 30, 2025
@paulyu12 paulyu12 changed the title [V0.7.3][Qwen3] Make v0.7.3 support Qwen3+LoRA [V0.7.3][LoRA][Qwen3] Make v0.7.3 support Qwen3+LoRA May 30, 2025
paulyu added 3 commits May 30, 2025 23:33
Signed-off-by: paulyu <[email protected]>
Signed-off-by: paulyu <[email protected]>
Signed-off-by: paulyu <[email protected]>
@wangxiyuan (Collaborator) commented:
Thanks for the update! We can merge this first. We don't have a detailed plan for the post2 release yet, so it may need more time.

@wangxiyuan wangxiyuan merged commit b69d41d into vllm-project:v0.7.3-dev Jun 3, 2025
11 checks passed
@paulyu12 paulyu12 deleted the v0.7.3-dev branch October 27, 2025 02:31