This repository was archived by the owner on Sep 4, 2025. It is now read-only.
forked from vllm-project/vllm
add ubi dockerfile #27
Merged
Conversation
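The PR title refers to a UBI (Red Hat Universal Base Image) based Dockerfile. As context only, here is a hedged, minimal sketch of what such an image build might look like; the base image tag, install step, and entrypoint are assumptions for illustration, not the actual contents of this PR:

```dockerfile
# Hypothetical sketch of a UBI-based vLLM image; not the PR's Dockerfile.
# Base image tag and install method are assumptions.
FROM registry.access.redhat.com/ubi9/python-311

WORKDIR /workspace

# Install vLLM from PyPI; a production Dockerfile would likely build
# from source with pinned dependencies.
RUN pip install --no-cache-dir vllm

# Serve the OpenAI-compatible API by default.
ENTRYPOINT ["python", "-m", "vllm.entrypoints.openai.api_server"]
```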
z103cb approved these changes on May 21, 2024
dtrifiro pushed a commit that referenced this pull request on May 23, 2024
This fixes a mistake: I had seen usages of `.labels` that `**`-unpack a dictionary
into kwargs, but I accidentally passed the raw dictionary as a value
instead of using keyword arguments 🤦. This caused metrics to show e.g.
`method="{'method': 'prefill'}"` instead of `method=prefill`
Signed-off-by: Joe Runde <[email protected]>
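The bug described in the commit message can be reproduced with a small stand-in for a Prometheus-style metric object (this `Metric` class is a hypothetical sketch, not the actual vllm or prometheus_client code):

```python
# Minimal stand-in for a Prometheus-style metric whose .labels()
# expects keyword arguments. Illustrates the dict-vs-kwargs bug
# described in the commit message above; not the actual vllm code.

class Metric:
    """Hypothetical metric object with a .labels(**kwargs) interface."""

    def labels(self, *args, **kwargs):
        # If a raw dict is passed positionally, it gets stringified
        # into a single label value -- the bug being fixed.
        if args:
            kwargs = {"method": str(args[0])}
        return {k: str(v) for k, v in kwargs.items()}


metric = Metric()
d = {"method": "prefill"}

# Buggy call: the dict itself becomes the label value.
bad = metric.labels(d)     # method="{'method': 'prefill'}"

# Fixed call: ** unpacks the dict into keyword arguments.
good = metric.labels(**d)  # method="prefill"
```

The fix is purely the `**` unpacking at the call site; the metric object itself is unchanged.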
Force-pushed 091b95d to b2cc74a
z103cb approved these changes on May 31, 2024
z103cb left a comment
/lgtm
/approve
[APPROVALNOTIFIER] This PR is APPROVED

This pull request has been approved by: dtrifiro, z103cb

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing `/approve` in a comment.

/retest
Force-pushed b2cc74a to ab34377
New changes are detected. LGTM label has been removed.
Author

/retest-required
Xaenalt pushed a commit that referenced this pull request on Sep 18, 2024
Xaenalt pushed a commit that referenced this pull request on Sep 18, 2024
* Bucketing/Warmup WIP
* Cleanup
* Revert "Fix model_output_idx on HPU (#27)". This reverts commit 90dfa92.
* Rework selected_token_indices fix to also work with block_size padding
* Simple prompt attention POC
* Remove cumsum
* MQA/GQA support for simple prompt_attention
* Cleanup
* Fix typo
* Restore profiling runs
prarit pushed a commit to prarit/vllm that referenced this pull request on Oct 18, 2024
* [Kernel] Enable custom AR on ROCm
* Install amdsmi in Docker in preparation for custom all reduce (cherry picked from commit f6cfb9bf31e9feeefbdedecf2165f80dd0564b75)
* Fix for yapf
* Linting and small fixes to vLLM syntax (cherry picked from commit 2cf8103bfb0afce59b28a06c5bbe905983c42728)

Co-authored-by: Matthew Wong <[email protected]>