
Conversation

@scydas (Contributor) commented Nov 27, 2025

Update CPU PyTorch to 2.9.0

Purpose

Update CPU PyTorch to 2.9.0

Resolves #29579

Test Plan

CI

Test Result


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only fastcheck CI runs, which covers a small but essential subset of CI tests to quickly catch errors.

You can ask your reviewers to trigger select CI tests on top of fastcheck CI.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

If you have any questions, please reach out to us on Slack at https://slack.vllm.ai.

🚀

@mergify mergify bot added the ci/build label Nov 27, 2025

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the PyTorch version to 2.9.0 for CPU builds in the project's requirements files. While the core change is correct, it seems some related dependencies were missed. Specifically, torchaudio and torchvision for the ppc64le platform in requirements/cpu.txt need to be updated to versions compatible with PyTorch 2.9.0. Furthermore, I've noticed that some Dockerfiles (e.g., docker/Dockerfile.cpu, docker/Dockerfile.s390x) contain hardcoded references to older PyTorch versions. These should also be updated to ensure consistency across all build and test environments. Addressing these points will help prevent potential dependency conflicts and build failures.
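For example, a quick grep over the build files could surface the remaining hardcoded pins this review mentions; a minimal sketch (the paths and pattern are assumptions and may also match unrelated strings):

grep -rn '2\.8\.0' docker/ requirements/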

torch==2.9.0+cpu; platform_machine == "x86_64" or platform_machine == "s390x"
torch==2.9.0; platform_system == "Darwin"
- torch==2.8.0; platform_machine == "ppc64le" or platform_machine == "aarch64"
+ torch==2.9.0; platform_machine == "ppc64le" or platform_machine == "aarch64"

Severity: high

While you've updated the torch version for ppc64le to 2.9.0, the corresponding torchaudio and torchvision versions for this platform are still pinned to older versions on lines 16 and 20, respectively. To maintain compatibility with PyTorch 2.9.0, these should be updated. Based on the CUDA requirements and PyTorch release information, torchaudio should be version 2.9.0 and torchvision should be 0.24.0.
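For illustration, the ppc64le pins might end up like this after the bump; a minimal sketch assuming the same per-platform environment markers used elsewhere in this file:

torchaudio==2.9.0; platform_machine == "ppc64le"
torchvision==0.24.0; platform_machine == "ppc64le"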


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment on lines +12 to 15
torch==2.9.0; platform_machine == "ppc64le" or platform_machine == "aarch64"

# required for the image processor of minicpm-o-2_6, this must be updated alongside torch
torchaudio; platform_machine != "ppc64le" and platform_machine != "s390x"


P1: Align torchaudio pin with torch bump on ppc64le

On ppc64le we now pin torch==2.9.0, but torchaudio==2.8.0 remains unchanged for the same architecture. Torchaudio 2.8 requires torch 2.8.*, so pip will refuse to resolve this requirements file on ppc64le after the torch upgrade. Please bump the ppc64le torchaudio pin (or keep torch at 2.8) to restore a consistent, installable set.

Useful? React with 👍 / 👎.
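One way to sanity-check that the file still resolves as a whole after the bump is a resolver-only install on a ppc64le host or container; a sketch, assuming pip 22.2+ for the --dry-run flag:

pip install --dry-run -r requirements/cpu.txt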


Good point, I think we should address this

@scydas scydas force-pushed the update-cpu-pytorch branch 4 times, most recently from 19aae0f to ec27b5b Compare November 27, 2025 07:51
@scydas scydas force-pushed the update-cpu-pytorch branch from ec27b5b to ea8d4c2 Compare November 27, 2025 08:16
@scydas scydas changed the title Update CPU PyTorch to 2.9.0 [CPU]Update CPU PyTorch to 2.9.0 Nov 27, 2025
@bigPYJ1151 (Member)

Please also remove the version guard in the test image:

sed -i 's/^torch==.*/torch==2.8.0/g' requirements/cpu-test.in && \
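For context, this sed rewrites any torch pin in requirements/cpu-test.in back to 2.8.0 while the test image is built, so leaving it in place would silently undo this bump. A hedged way to confirm the new version actually lands in the rebuilt test image (the image tag here is hypothetical):

docker run --rm vllm-cpu-test:latest python3 -c "import torch; print(torch.__version__)"  # expect 2.9.0+cpu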

@scydas scydas requested a review from bigPYJ1151 as a code owner November 27, 2025 09:18
@scydas (Contributor, Author) commented Nov 27, 2025

@bigPYJ1151 Thanks, I have removed the version guard in the test image.


@hickeyma hickeyma left a comment


Should it be updated to the latest version 2.9.1?

@bigPYJ1151 (Member)

https://buildkite.com/vllm/fastcheck/builds/44879#019ac51f-0ec7-47a2-b491-d75e4f786cda
Test passed.
For 2.9.1, I prefer to update it once the other backends are ready.

@bigPYJ1151 bigPYJ1151 enabled auto-merge (squash) November 27, 2025 14:13
@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Nov 27, 2025
torch==2.9.0+cpu; platform_machine == "x86_64" or platform_machine == "s390x"
torch==2.9.0; platform_system == "Darwin"
- torch==2.8.0; platform_machine == "ppc64le" or platform_machine == "aarch64"
+ torch==2.9.0; platform_machine == "ppc64le" or platform_machine == "aarch64"

@fadara01 fadara01 Nov 27, 2025


Could we hold off until we check that torch 2.9 works smoothly on Arm?
I'm hoping to get that checked today.

If you need this merged now, please keep the 2.8.0 torch version for AArch64 and I'll update it later.
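For reference, a minimal sketch of how the marker could be split so that only AArch64 stays on 2.8.0 while ppc64le moves ahead (markers assumed from the snippet above):

torch==2.9.0; platform_machine == "ppc64le"
torch==2.8.0; platform_machine == "aarch64"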

setuptools-scm>=8
--extra-index-url https://download.pytorch.org/whl/cpu
- torch==2.8.0+cpu; platform_machine == "x86_64" or platform_machine == "s390x"
+ torch==2.9.0+cpu; platform_machine == "x86_64" or platform_machine == "s390x"

Is there a reason why we update to 2.9.0 and not 2.9.1?


Ahh, I see @bigPYJ1151's message above saying that the other backends are not ready.
Do we know what the issue with 2.9.1 is?

@fadara01 (Contributor)

Arm CI passes with torch 2.9 - we're happy with this change: https://buildkite.com/vllm/ci/builds/40955/steps/canvas?sid=019ac660-dd86-4b9d-b5e9-62dc15691725

@bigPYJ1151 bigPYJ1151 merged commit 35657bc into vllm-project:main Nov 28, 2025
15 checks passed

DarkLight1337 added a commit to DarkLight1337/vllm that referenced this pull request Nov 28, 2025
@DarkLight1337 (Member)

Reverting in #29647

@DarkLight1337 (Member)

Please open a new version of this PR that passes the test.

vllm-bot pushed a commit that referenced this pull request Nov 28, 2025
kitaekatt pushed a commit to kitaekatt/vllm that referenced this pull request Dec 1, 2025
kitaekatt pushed a commit to kitaekatt/vllm that referenced this pull request Dec 1, 2025
amd-hhashemi pushed a commit to amd-hhashemi/vllm that referenced this pull request Dec 2, 2025
amd-hhashemi pushed a commit to amd-hhashemi/vllm that referenced this pull request Dec 2, 2025
charlotte12l pushed a commit to charlotte12l/vllm that referenced this pull request Dec 5, 2025
charlotte12l pushed a commit to charlotte12l/vllm that referenced this pull request Dec 5, 2025
Zhathw pushed a commit to Zhathw/vllm that referenced this pull request Dec 6, 2025
Zhathw pushed a commit to Zhathw/vllm that referenced this pull request Dec 6, 2025
charlotte12l pushed a commit to charlotte12l/vllm that referenced this pull request Dec 9, 2025
charlotte12l pushed a commit to charlotte12l/vllm that referenced this pull request Dec 9, 2025
Somoku pushed a commit to Somoku/vllm that referenced this pull request Dec 15, 2025
Somoku pushed a commit to Somoku/vllm that referenced this pull request Dec 15, 2025

Labels

ci/build, ready (ONLY add when PR is ready to merge/full CI is needed)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Feature]: Update CPU backend torch to 2.9

5 participants