
Conversation

@Potabk Potabk (Collaborator) commented Nov 28, 2025

What this PR does / why we need it?

  1. fix Update rope_scaling to rope_parameters in preparation for Transformers v5 vllm#28542
    The model structures affected by this change are:

    • Qwen2.5-VL (some patches still remain)
    • Qwen2-VL
    • Qwen2
    • DeepSeek series
    • Qwen-MoE series
  2. fix Revert "[Redo] #26368 (#28771)" vllm#29121
    The output token type has changed from a NumPy array to list[list[int]].

  3. fix [Core] Deprecate xformers vllm#29262
    The xformers backend for multimodal models has been deprecated.

  4. fix [Attention] Remove imports from vllm/attention/__init__.py vllm#29342

  5. fix [Core] Refactor padding logic and pad for CUDA graphs before attention metadata building  vllm#28579

  6. fix [Feature] Prefill Context Parallel (PCP) basic support vllm#28718

  7. fix [Config] Clean up SchedulerConfig initialization vllm#28665

  8. fix [Frontend][torch.compile] CompilationConfig Overhaul (#20283): Set up -O infrastructure vllm#26847
    vLLM introduced optimization levels (-O); some default configuration values have changed, and the --enforce-eager parameter has been deprecated.

Does this PR introduce any user-facing change?

How was this patch tested?

@gemini-code-assist (Contributor)

Note

Gemini is unable to generate a review for this pull request due to the file types involved not being currently supported.

@github-actions

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing, smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by other future PRs.
  • Write the commit message by fulfilling the PR description to help reviewers and future developers understand.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

@Potabk Potabk added the ready (read for review) and ready-for-test (start test by label for PR) labels Nov 28, 2025
@Potabk Potabk changed the title [Main] Upgrade vllm commit to 2025_11_20 [Main] Upgrade vllm commit to 2025_11_25 Nov 29, 2025
@Potabk Potabk marked this pull request as draft November 29, 2025 06:50
Signed-off-by: wangli <[email protected]>
@Potabk Potabk changed the title [Main] Upgrade vllm commit to 2025_11_25 [Main] Upgrade vllm commit to 2025_12_01 Dec 1, 2025
Signed-off-by: wangli <[email protected]>
@github-actions bot commented Dec 1, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.

Signed-off-by: wangli <[email protected]>
@wangxiyuan (Collaborator)

Replaced by #4608.

@wangxiyuan wangxiyuan closed this Dec 1, 2025