@csahithi csahithi commented Sep 2, 2025

Purpose

Publish nightly builds to DockerHub and retain only the builds from the last 14 days.

cc: @kushanam @sgodithi1

Test Plan

Test Result



@mergify mergify bot added the ci/build label Sep 2, 2025

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a new Buildkite step for publishing nightly multi-arch images to DockerHub, along with a script for cleaning up older images. The overall approach is good, but I've identified a critical issue in the cleanup script. The script incorrectly sorts tags alphabetically instead of chronologically, which could lead to the deletion of recent builds. My review provides a correction to ensure tags are sorted by their update timestamp, thus ensuring the oldest builds are removed as intended.
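The correction the review describes can be sketched as a pure function over the DockerHub v2 tag objects (which carry a `last_updated` ISO-8601 timestamp). This is a minimal sketch, not the PR's actual script; the tag names and retention window are illustrative:

```python
from datetime import datetime, timedelta, timezone

def tags_to_delete(tags, keep_days=14, now=None):
    """Given DockerHub tag objects (dicts with 'name' and an ISO-8601
    'last_updated'), return the names of tags older than keep_days.
    Sorting alphabetically can mis-order date-stamped tag names
    (e.g. 'nightly-2025-9-2' sorts after 'nightly-2025-10-1'), so the
    selection is based on the update timestamp instead."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=keep_days)
    old = [t for t in tags
           if datetime.fromisoformat(t["last_updated"].replace("Z", "+00:00")) < cutoff]
    # Oldest first, so the deletion order is deterministic.
    old.sort(key=lambda t: t["last_updated"])
    return [t["name"] for t in old]
```

In the real cleanup script the `tags` list would come from paginating `https://hub.docker.com/v2/repositories/vllm/vllm-openai/tags`, and each returned name would be deleted via the tags API.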


@nWEIdia nWEIdia left a comment


LGTM.

Nit:
The manifest publishing workflow seems to be a bit duplicative of
https://github.com/vllm-project/vllm/blob/e0653f6c0b9f331af0877e7c7abc99a85efc3982/.buildkite/scripts/annotate-release.sh

The existing workflow may already support publishing to vllm/vllm-openai:latest.
See: https://github.com/vllm-project/vllm/blob/e0653f6c0b9f331af0877e7c7abc99a85efc3982/.buildkite/scripts/annotate-release.sh#L26C1-L27C65

Is it possible to re-use the annotate-release script?

@csahithi

csahithi commented Sep 2, 2025

@nWEIdia The annotate-release script is used to publish specific versions of the x86 builds to DockerHub, as you can see here: https://hub.docker.com/r/vllm/vllm-openai/tags, and it does not appear to be scheduled. In this PR, since we want two weeks' worth of nightly builds on DockerHub, I'm directly fetching the multi-arch manifest from AWS and publishing it to DockerHub as nightly. Please let me know if you have comments on this.
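One way to publish a multi-arch manifest from ECR to DockerHub without pulling each per-arch image is `docker buildx imagetools create`, which copies a manifest list as-is. A sketch of building that invocation (the ECR repo path comes from the annotate-release script; the `nightly` destination tag follows this PR's description, and the exact command shape is an assumption, not necessarily what the PR's script runs):

```python
def imagetools_create_cmd(commit, dst_tag="nightly"):
    """Build the argv for copying the multi-arch manifest for `commit`
    from the public ECR release repo to DockerHub under `dst_tag`.
    `docker buildx imagetools create` preserves all architectures in
    the manifest list, unlike a plain `docker pull`/`docker tag`,
    which resolves to the pulling host's architecture."""
    src = f"public.ecr.aws/q9t5s3a7/vllm-release-repo:{commit}"
    dst = f"vllm/vllm-openai:{dst_tag}"
    return ["docker", "buildx", "imagetools", "create", "--tag", dst, src]
```

In a Buildkite step this would be run via `subprocess.run(imagetools_create_cmd(os.environ["BUILDKITE_COMMIT"]), check=True)` after `docker login`.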

@nWEIdia

nWEIdia commented Sep 2, 2025

docker pull public.ecr.aws/q9t5s3a7/vllm-release-repo:${BUILDKITE_COMMIT}
docker tag public.ecr.aws/q9t5s3a7/vllm-release-repo:${BUILDKITE_COMMIT} vllm/vllm-openai
docker tag vllm/vllm-openai vllm/vllm-openai:latest
docker tag vllm/vllm-openai vllm/vllm-openai:v${RELEASE_VERSION}
The above snippet in the annotate-release script should already support pushing the ARM container to vllm/vllm-openai.

I like the newly added functionality of a nightly tag that keeps only 14 commits, but I wasn't sure about this PR introducing a second write interface to vllm/vllm-openai, one with delete capability.

e.g. if an "official" release has happened and a commit has been tagged, how do you handle the 14-commit deletion? Would your workflow delete the official release tag?
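One common way to address this concern is to restrict the cleanup pass to a dedicated tag prefix, so release tags are never deletion candidates. A hypothetical filter (the `nightly-` prefix is an assumption for illustration, not necessarily what this PR's script uses):

```python
def cleanup_candidates(tag_names, prefix="nightly-"):
    """Return only the tags the nightly retention pass is allowed to
    touch. Release tags such as 'latest' or 'v0.10.1' never match the
    prefix, so an official release can't be deleted by the 14-day
    cleanup even if it shares the repository with the nightlies."""
    return [n for n in tag_names if n.startswith(prefix)]
```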

@nWEIdia

nWEIdia commented Sep 2, 2025

cc @simon-mo, who might have already envisioned how the vllm/vllm-openai nightly tags would look and how we are going to maintain nightly docker images and their tags.

@csahithi

csahithi commented Sep 2, 2025

I agree; deferring to @simon-mo for more comments on this.

@nWEIdia

nWEIdia commented Sep 5, 2025

Hi @simon-mo @youkaichao Is there anything remaining that you want us to check before merging this PR?

@mergify

mergify bot commented Sep 8, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @csahithi.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Sep 8, 2025
@simon-mo

simon-mo commented Sep 8, 2025

Let me create a nightly schedule for the pipeline and test it out

@mergify mergify bot removed the needs-rebase label Sep 8, 2025
Signed-off-by: simon-mo <[email protected]>
@simon-mo

simon-mo commented Sep 8, 2025

@simon-mo simon-mo enabled auto-merge (squash) September 8, 2025 23:44
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Sep 8, 2025
@simon-mo simon-mo merged commit 6910b56 into vllm-project:main Sep 9, 2025
19 of 21 checks passed
@nWEIdia

nWEIdia commented Sep 9, 2025

FYI, cross-linking the nightly release job: https://buildkite.com/vllm/release/builds/8011/steps/canvas

@nWEIdia

nWEIdia commented Sep 9, 2025

Thanks @simon-mo for merging! We need additional help: it looks like we did not configure DockerHub authentication for vllm/vllm-openai. Has the push to vllm/vllm-openai always been done manually, or is there a CI setup with token authentication to do that?

eicherseiji pushed a commit to eicherseiji/vllm that referenced this pull request Sep 9, 2025
Signed-off-by: Sahithi Chigurupati <[email protected]>
Signed-off-by: Simon Mo <[email protected]>
Signed-off-by: simon-mo <[email protected]>
Co-authored-by: Simon Mo <[email protected]>
skyloevil pushed a commit to skyloevil/vllm that referenced this pull request Sep 13, 2025
Signed-off-by: Sahithi Chigurupati <[email protected]>
Signed-off-by: Simon Mo <[email protected]>
Signed-off-by: simon-mo <[email protected]>
Co-authored-by: Simon Mo <[email protected]>
@csahithi csahithi changed the title [CI] Add nightly multiarch manifests to dockerhub [CI] Add nightly builds to dockerhub Sep 16, 2025
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Sahithi Chigurupati <[email protected]>
Signed-off-by: Simon Mo <[email protected]>
Signed-off-by: simon-mo <[email protected]>
Co-authored-by: Simon Mo <[email protected]>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
Signed-off-by: Sahithi Chigurupati <[email protected]>
Signed-off-by: Simon Mo <[email protected]>
Signed-off-by: simon-mo <[email protected]>
Co-authored-by: Simon Mo <[email protected]>
Signed-off-by: xuebwang-amd <[email protected]>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
Signed-off-by: Sahithi Chigurupati <[email protected]>
Signed-off-by: Simon Mo <[email protected]>
Signed-off-by: simon-mo <[email protected]>
Co-authored-by: Simon Mo <[email protected]>
Signed-off-by: xuebwang-amd <[email protected]>
