
Conversation

@mgoin commented Sep 22, 2025

Purpose

The "Quantization Test" has been broken for a few days due to FA2 being chosen for fp8 kv cache on SM80 and SM89 after this PR removed V0 fallback #25033

(screenshot: the failing "Quantization Test" step in CI)

https://buildkite.com/vllm/ci/builds/31806/steps/canvas?jid=0199716c-38f4-4409-affd-a1c35d55bc0e

[2025-09-22T14:14:38Z] (EngineCore_DP0 pid=12418) NotImplementedError: FlashAttention does not support fp8 kv-cache on this device.
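To illustrate the failure mode, here is a minimal sketch of the kind of version gate that raises this error; the function name and signature are assumptions for illustration, not vLLM's actual code. Below SM90 only FA2 is available, and FA2 has no fp8 KV-cache support, while FA3 on SM90+ does.

```python
# Minimal sketch (assumed, not vLLM's actual code) of the gate behind the
# error above: below SM90 only FA2 can be picked, and FA2 cannot read or
# write an fp8 KV cache, while FA3 (SM90+) can.
def check_flash_attn_kv_cache_dtype(fa_version: int, kv_cache_dtype: str) -> None:
    """Raise if this FlashAttention version cannot handle the KV-cache dtype."""
    if kv_cache_dtype.startswith("fp8") and fa_version < 3:
        raise NotImplementedError(
            "FlashAttention does not support fp8 kv-cache on this device.")
```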

Test Plan

pytest -s -v "tests/quantization/test_fp8.py::test_kv_cache_model_load_and_run[False-neuralmagic/Meta-Llama-3-8B-Instruct-FP8-KV]"

Test Result

The test above passes on L40s (SM89).


@mgoin changed the title from "Fix fp8 kv cache on <SM90" to "[CI Failure] Fix fp8 kv cache on <SM90" on Sep 22, 2025
@gemini-code-assist bot left a comment


Code Review

This pull request aims to fix an issue where FlashAttention was incorrectly chosen for the fp8 kv-cache on devices with compute capability less than 9.0, leading to a NotImplementedError. The change correctly modifies the attention backend selection logic to fall back to the Triton backend in this scenario. My main concern is that a pre-check in is_kv_cache_dtype_supported might still fail, because it uses simplified logic that doesn't account for this new fallback, potentially preventing the fix from being effective. I've left a specific comment detailing this critical issue, which should be addressed to ensure the fix is complete.
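As a rough illustration of the fallback the review describes, here is a hedged sketch of a selection routine that routes fp8 KV caches below SM90 to Triton; the function and backend names are assumptions for illustration, not vLLM's exact identifiers.

```python
# Hedged sketch of the backend-selection fallback described above; the
# function and backend names are illustrative, not vLLM's exact identifiers.
def select_attn_backend(kv_cache_dtype: str,
                        device_capability: tuple[int, int]) -> str:
    fp8_kv_cache = kv_cache_dtype.startswith("fp8")
    if fp8_kv_cache and device_capability < (9, 0):
        # Below SM90 only FA2 is available, and FA2 cannot handle an
        # fp8 KV cache, so fall back to the Triton attention backend.
        return "TRITON_ATTN"
    return "FLASH_ATTN"


# e.g. on an L40s (SM89) with an fp8 KV cache:
assert select_attn_backend("fp8", (8, 9)) == "TRITON_ATTN"
```

The reviewer's caveat amounts to this: any earlier is_kv_cache_dtype_supported pre-check must agree with this fallback, or it will reject fp8 on these devices before the fallback ever runs.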


@chatgpt-codex-connector bot left a comment


Codex Review: Here are some suggestions.

Reply with @codex fix comments to fix any unresolved comments.


Signed-off-by: mgoin <[email protected]>
@mgoin added the ready (ONLY add when PR is ready to merge/full CI is needed) and ci-failure (Issue about an unexpected test failure in CI) labels on Sep 22, 2025
@simon-mo enabled auto-merge (squash) September 22, 2025 17:41
@simon-mo merged commit 239ef0c into vllm-project:main Sep 22, 2025
55 of 56 checks passed
@mgoin deleted the fix-fp8-kv-cache-sm80 branch September 22, 2025 18:36
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
charlifu pushed a commit to ROCm/vllm that referenced this pull request Sep 25, 2025
yewentao256 pushed a commit that referenced this pull request Oct 3, 2025
gjc0824 pushed a commit to gjc0824/vllm that referenced this pull request Oct 10, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
choprahetarth pushed a commit to Tandemn-Labs/vllm that referenced this pull request Oct 11, 2025
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
rtourgeman pushed a commit to rtourgeman/vllm that referenced this pull request Nov 10, 2025

Labels

ci-failure: Issue about an unexpected test failure in CI
ready: ONLY add when PR is ready to merge/full CI is needed

Projects

Status: Done
