[chore] Bump vLLM Image Tags #1733
Conversation
✅ Deploy Preview for gateway-api-inference-extension ready!
@Frapschen did you run the quickstart guide with these GPU and CPU versions to make sure it works?
@nirrozenbaum I can confirm that the CPU one works fine for me, tested against the vllm-llama3-8b-instruct-cpu-7555494db4-bvpd4 pod (pod manifest and curl test output attached to the comment).
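For context, the quickstart's smoke test is an OpenAI-style completions request sent through the inference gateway. Below is a minimal sketch of that kind of curl test; the gateway name, port, and model name are assumptions based on the quickstart, not the exact command or output from this comment:

```bash
# Resolve the address exposed by the inference gateway (gateway name is illustrative).
IP=$(kubectl get gateway/inference-gateway -o jsonpath='{.status.addresses[0].value}')
PORT=80

# Send an OpenAI-compatible completions request; a 200 response with generated text
# indicates the vLLM CPU deployment behind the gateway is serving correctly.
curl -i "${IP}:${PORT}/v1/completions" \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "food-review",
    "prompt": "Write as if you were a critic: San Francisco",
    "max_tokens": 100,
    "temperature": 0
  }'
```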
@Frapschen any update on the GPU image?
/lgtm

I think the vllm image is now quite old and we should upgrade.
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: ahg-g, Frapschen

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Details: Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing `/approve` in a comment.
What type of PR is this?
/kind documentation
What this PR does / why we need it:
This PR bumps the two vLLM image versions (GPU and CPU) used in the quickstart guide:
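The exact tags being bumped are not reproduced here; as a rough sketch of the shape of the change (the file paths, image names, and tags below are assumptions for illustration, not the actual values changed in this PR):

```bash
# Locate the current vLLM image tags in the quickstart deployment manifests (paths are assumed).
grep -rn "image:" config/manifests/vllm/gpu-deployment.yaml config/manifests/vllm/cpu-deployment.yaml

# Example edit for the GPU deployment: swap the tag on the upstream vLLM image
# ("<new-tag>" is a placeholder, not the version introduced by this PR).
sed -i 's|vllm/vllm-openai:.*|vllm/vllm-openai:<new-tag>|' config/manifests/vllm/gpu-deployment.yaml
```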
Which issue(s) this PR fixes:
Fixes #1722
Does this PR introduce a user-facing change?: