
[Test] Add full test for Qwen3-Omni-30B-A3B-Instruct #720

Merged
hsliuustc0106 merged 15 commits into vllm-project:main from yenuo26:full_test
Jan 16, 2026

Conversation

@yenuo26 (Contributor) commented Jan 9, 2026


Purpose

This PR adds a full e2e test for Qwen3-Omni-30B-A3B-Instruct.
For the design and plan, please refer to #723.

Test Plan

python -m pytest -sv tests/e2e/online_serving/test_qwen3_omni_expansion.py::test_text_to_text_001
python -m pytest -sv tests/e2e/online_serving/test_qwen3_omni_expansion.py::test_text_to_text_audio_001

Test Result

2 passed, 2 warnings in 299.74s (0:04:59)

Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft.


wangyu31577 and others added 9 commits January 8, 2026 21:22
@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 3023809d85


Comment on lines +10 to +14 of tests/conftest.py
import psutil
import pytest
import torch
import whisper
import yaml


P2: Avoid import-time dependency on whisper/psutil

Because tests/conftest.py is imported for every test run, the top-level import whisper/import psutil will raise ImportError in any environment that doesn’t preinstall those packages, causing test collection to fail even when the new e2e tests aren’t selected. The repo’s pyproject.toml doesn’t declare these dependencies, so a default dev/CI setup is likely to hit this; consider moving the imports inside the helper functions/tests that need them or adding the packages to test/dev extras.
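
A minimal sketch of the lazy-import alternative the review points at, assuming the heavy dependency is only needed inside a helper; the helper name is illustrative, not the repo's actual code, and pytest.importorskip is used here as one option (a plain local import would equally avoid the collection-time failure):

import pytest


def transcribe_audio(audio_path, model_name="base"):
    # Import whisper lazily, skipping the test (rather than breaking
    # collection) when the optional dependency is not installed.
    whisper = pytest.importorskip("whisper")
    model = whisper.load_model(model_name)
    return model.transcribe(audio_path)["text"]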


Collaborator


@yenuo26 please pay attention to the AI review; it could help us improve efficiency and avoid obvious mistakes.

Contributor Author


> @yenuo26 please pay attention to the AI review; it could help us improve efficiency and avoid obvious mistakes.

get

Comment on lines +198 to +199
output_path = f"{yaml_path.split('.')[0]}_{int(time.time())}.yaml"
with open(output_path, "w", encoding="utf-8") as f:


P2: Preserve dotted paths when writing modified YAML

Building the output filename with yaml_path.split('.')[0] drops everything after the first dot in the entire path, so a path like /tmp/.config/stage.yaml or config.v1.yaml will be rewritten into a different directory/name (e.g., /tmp/_<ts>.yaml), which can overwrite the wrong file or fail to be found later. Using Path(yaml_path).with_name(Path(yaml_path).stem + f"_{ts}").with_suffix(".yaml") (or similar) preserves the original directory and base name.
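
A minimal sketch of a Path-based construction that keeps the original directory and the full base name (the helper name is hypothetical):

import time
from pathlib import Path


def timestamped_yaml_path(yaml_path):
    # Only the final path component is renamed, so "/tmp/.config/stage.yaml"
    # becomes "/tmp/.config/stage_<ts>.yaml", and "config.v1.yaml" keeps its
    # "config.v1" base instead of being truncated at the first dot.
    p = Path(yaml_path)
    return str(p.with_name(f"{p.stem}_{int(time.time())}{p.suffix}"))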


yenuo26 changed the title from "Add full test for Qwen3-Omni-30B-A3B-Instruct" to "[Test] Add full test for Qwen3-Omni-30B-A3B-Instruct" on Jan 9, 2026
@tzhouam (Collaborator) commented Jan 15, 2026

Should we consider a better way to manage variables such as concurrency size and message type? Hardcoding them inside the function hurts maintainability and increases the risk of gaps or duplication in our test coverage.

@yenuo26 (Contributor, Author) commented Jan 16, 2026

> Should we consider a better way to manage variables such as concurrency size and message type? Hardcoding them inside the function hurts maintainability and increases the risk of gaps or duplication in our test coverage.

I have added the following two functions to solve this problem.

def get_prompt(prompt_type="text_only"):
    prompts = {
        "text_only": "What is the capital of China?",
        "mix": "What is recited in the audio? What is in this image? Describe the video briefly.",
    }
    return prompts.get(prompt_type, prompts["text_only"])


def get_max_batch_size(size_type="few"):
    batch_sizes = {"few": 5, "medium": 100, "large": 256}
    return batch_sizes.get(size_type, 5)
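
A hypothetical use of these helpers in a parametrized test (the test name and parameter combinations below are assumptions for illustration, not the merged code):

import pytest


@pytest.mark.parametrize(
    "prompt_type, size_type",
    [("text_only", "few"), ("mix", "medium")],
)
def test_expansion_variants(prompt_type, size_type):
    # get_prompt and get_max_batch_size are the helpers defined above.
    prompt = get_prompt(prompt_type)
    max_batch_size = get_max_batch_size(size_type)
    # The body would issue up to max_batch_size concurrent requests built
    # from `prompt` and assert on the responses; elided here.
    ...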

@tzhouam (Collaborator) left a comment


LGTM

tzhouam added the "ready label to trigger buildkite CI" label on Jan 16, 2026
wangyu31577 added 2 commits January 16, 2026 18:30
pyproject.toml Outdated
"pytest-cov>=4.0.0",
"mypy==1.11.1",
"pre-commit==4.0.1",
"openai-whisper"
Collaborator


any version?

Contributor Author


added

@hsliuustc0106 (Collaborator) left a comment


lgtm, thx

hsliuustc0106 merged commit 4e23bff into vllm-project:main on Jan 16, 2026
7 checks passed
erfgss pushed a commit to erfgss/vllm-omni that referenced this pull request Jan 19, 2026
with1015 pushed a commit to with1015/vllm-omni that referenced this pull request Jan 20, 2026

Labels

ready label to trigger buildkite CI

4 participants