[Doc] Format profiling doc #993

Merged

hsliuustc0106 merged 7 commits into vllm-project:main from lishunyang12:doc_patch on Jan 28, 2026

Conversation

@lishunyang12
Contributor

Purpose

This PR formats the profiling page to provide better guidelines and clearer instructions.
@hsliuustc0106

Test Plan

Test Result


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing a test command.
  • The test results, such as pasting a before/after results comparison or e2e results.
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft.


Signed-off-by: lishunyang <[email protected]>

Copilot AI left a comment

Pull request overview

Formats and updates the profiling documentation to provide clearer guidance for profiling vLLM-Omni omni-modality and diffusion workflows.

Changes:

  • Updates terminology and section headings for omni-modality profiling.
  • Renames model examples (Qwen2.5-Omni / Qwen3-Omni) and restructures diffusion profiling into its own section.
  • Removes the async/online profiling section and updates the external vLLM profiling guide link.
Comments suppressed due to low confidence (1)

docs/contributing/profiling.md:90

  • In this diffusion profiling section, the heading uses sentence case and the CLI example is fenced as python even though it’s a shell command block. Please switch the fence to bash (and consider using Title Case for the heading to match the rest of the document).
> ### 3. Profiling diffusion models
>
> Diffusion profiling is End-to-End, capturing encoding, denoising loops, and decoding.
>
> **CLI Usage:**
> ```python

> As of now, asynchronous (online) profiling is not fully supported in vLLM-Omni. While start_profile() and stop_profile() methods exist, they are only reliable in offline inference scripts (e.g., the provided end2end.py examples). Do not use them in server-mode or streaming scenarios; traces may be incomplete or fail to flush.
>
> **Online Inference (Async)**
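For context, the offline pattern the quoted passage describes would look roughly like the sketch below. This is a minimal, hedged illustration assuming a vLLM-style entry point: `VLLM_TORCH_PROFILER_DIR` and the `start_profile()`/`stop_profile()` calls come from upstream vLLM, while the wrapper class, constructor arguments, output directory, and model name used here are assumptions; the authoritative usage lives in the end2end.py examples mentioned above.

```python
import os

from vllm import LLM, SamplingParams  # vLLM-Omni may expose its own wrapper instead

# VLLM_TORCH_PROFILER_DIR (an upstream vLLM setting) tells the torch profiler
# where to write traces; set it before the engine is constructed.
os.environ["VLLM_TORCH_PROFILER_DIR"] = "./profiler_traces"

# The model identifier is an example only; use whatever the end2end.py example loads.
llm = LLM(model="Qwen/Qwen2.5-Omni-7B")

llm.start_profile()   # begin capturing a trace
outputs = llm.generate(
    ["Describe the weather in one sentence."],
    SamplingParams(max_tokens=32),
)
llm.stop_profile()    # flush the trace before the script exits
```

Per the quoted warning, this pattern is only reliable in offline scripts of this shape; in server or streaming mode the trace may not flush cleanly.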

@gcanlin
Collaborator

I recall that the Omni pipeline supports profiling in AsyncOmni.

@lishunyang12
Contributor Author

AsyncOmni's methods support profiling, but this has not been validated in the examples. We will update it in a separate PR, given that online serving profiling is less common than offline profiling.

Collaborator

> online serving profiling is less common than offline profiling.

I don't think so.

@david6666666 added the ready label (label to trigger buildkite CI) on Jan 28, 2026
@david6666666 enabled auto-merge (squash) on January 28, 2026 at 12:13
@hsliuustc0106 merged commit b11d436 into vllm-project:main on Jan 28, 2026
4 of 5 checks passed
dongbo910220 pushed a commit to dongbo910220/vllm-omni that referenced this pull request on Feb 1, 2026
Signed-off-by: lishunyang <[email protected]>
Signed-off-by: Hongsheng Liu <[email protected]>
Co-authored-by: Hongsheng Liu <[email protected]>
Co-authored-by: Copilot <[email protected]>

Labels

ready (label to trigger buildkite CI)

4 participants