@MatthewBonanni MatthewBonanni commented Nov 3, 2025

In vllm-project/vllm#27683, the V1 Test attention (B200) CI job is failing because the `export VLLM_DISABLE_FLASHINFER_PREFILL=1` command in test-pipeline.yaml is ignored when specific tests are extracted. This PR ensures that `export` commands are still run even when specific tests are extracted.

Failing:

commands:
  - export VLLM_DISABLE_FLASHINFER_PREFILL=1
  - pytest -v -s v1/attention

Workaround:

commands:
  - VLLM_DISABLE_FLASHINFER_PREFILL=1 pytest -v -s v1/attention
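The workaround relies on standard POSIX shell semantics: an assignment prefixed to a command is exported only into that command's environment, so it works even when each pipeline command runs in isolation. A minimal sketch of the difference (the variable name is taken from the snippets above; the `sh -c` child process stands in for the extracted test command):

```shell
#!/bin/sh
# Inline prefix: the variable is visible to this one command only.
VLLM_DISABLE_FLASHINFER_PREFILL=1 sh -c 'echo "inline: ${VLLM_DISABLE_FLASHINFER_PREFILL:-unset}"'
# After that command finishes, the variable is not set in the current shell,
# which is why it must be attached directly to the pytest invocation.
echo "after: ${VLLM_DISABLE_FLASHINFER_PREFILL:-unset}"
```

This prints `inline: 1` followed by `after: unset`, mirroring why a separate `export` line is lost when only the `pytest` command is extracted and run.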

Signed-off-by: Matthew Bonanni <[email protected]>