[Docs] Add qwen image missing doc in user guide #111
Conversation
Signed-off-by: samithuang <[email protected]>
💡 Codex Review
Here are some automated review suggestions for this pull request.
```python
from vllm_omni.diffusion.worker.gpu_worker import GPUWorker, WorkerProc

__all__ = ["GPUWorker", "WorkerProc"]
```
Avoid importing GPUWorker in package init
The new vllm_omni.diffusion.worker.__init__ eagerly imports GPUWorker/WorkerProc, which in turn pull in vllm, torch, and other heavy dependencies. MkDocs’ api-autonav plugin (see mkdocs.yml’s modules: ["vllm_omni"]) imports each discovered package to render API docs; with only the [docs] extras installed (they don’t install vllm), a simple python -c "import vllm_omni.diffusion.worker" now raises ModuleNotFoundError and will cause mkdocs build to fail. Previously this subpackage had no __init__, so it wasn’t imported during docs generation. Consider making the import lazy or excluding this package from the docs scan to keep documentation builds working in environments without vllm.
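One way to keep the package importable without vllm installed is a PEP 562 module-level `__getattr__`, which defers the heavy import until `GPUWorker`/`WorkerProc` is actually accessed. A minimal sketch of the pattern (the `demo_worker` package below is a hypothetical stand-in built on disk just to demonstrate the behavior; the real change would go in the worker `__init__.py`):

```python
# Demonstrate lazy imports via PEP 562 module __getattr__: importing the
# package does NOT import the heavy submodule; attribute access does.
import importlib
import sys
import tempfile
from pathlib import Path

# Contents the real vllm_omni/diffusion/worker/__init__.py could use.
INIT_SRC = '''
import importlib

__all__ = ["GPUWorker", "WorkerProc"]

def __getattr__(name):
    if name in __all__:
        # gpu_worker (and its vllm/torch deps) loads only on first access.
        module = importlib.import_module(".gpu_worker", __name__)
        return getattr(module, name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
'''

# Stand-in for the heavy submodule.
SUBMODULE_SRC = '''
class GPUWorker: ...
class WorkerProc: ...
'''

tmp = Path(tempfile.mkdtemp())
pkg = tmp / "demo_worker"
pkg.mkdir()
(pkg / "__init__.py").write_text(INIT_SRC)
(pkg / "gpu_worker.py").write_text(SUBMODULE_SRC)

sys.path.insert(0, str(tmp))
demo = importlib.import_module("demo_worker")

# Importing the package did not import the submodule...
assert "demo_worker.gpu_worker" not in sys.modules
# ...but attribute access triggers the lazy import.
worker_cls = demo.GPUWorker
assert "demo_worker.gpu_worker" in sys.modules
```

With this pattern, `mkdocs build` in a docs-only environment can import the package without pulling in vllm, as long as the docs generator does not touch the lazy attributes.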
```diff
@@ -0,0 +1,66 @@
+# Offline Inference Example of vLLM-Omni for Qwen-Image
```
This is duplicated with examples/offline_inference/qwen_image/README.md.
It looks like omni has 2 docs for every example now
Yes, but currently we need to add the doc to the docs folder manually so that it displays on Read the Docs, similar to
https://github.com/vllm-project/vllm-omni/blob/main/docs/user_guide/examples/offline_inference/qwen2_5_omni.md
and
https://github.com/vllm-project/vllm-omni/blob/main/examples/offline_inference/qwen2_5_omni/README.md
we should address the docs duplication issue in another PR. cc @Gaohan123
```diff
@@ -0,0 +1,3 @@
+# SPDX-License-Identifier: Apache-2.0
```
This is to address the warning in the mkdocs build:

```
WARNING - api-autonav: Skipping implicit namespace package (without an __init__.py file) at /home/docs/checkouts/readthedocs.org/user_builds/vllm-omni/checkouts/latest/vllm_omni/diffusion/models/qwen_image. Set 'on_implicit_namespace_package' to 'skip' to omit it without warning.
```
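As the warning itself notes, an alternative to adding an `__init__.py` is silencing implicit namespace packages at the plugin level. A sketch of what that could look like in mkdocs.yml (assuming the api-autonav plugin accepts this option as the warning text indicates; exact placement may differ):

```yaml
plugins:
  - api-autonav:
      modules: ["vllm_omni"]
      # Per the warning: omit implicit namespace packages without warning.
      on_implicit_namespace_package: skip
```

Adding the `__init__.py` is the safer choice here, since it also makes the package an explicit regular package rather than hiding it from the API docs.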
Don't merge before #113 lands
```yaml
- examples/README.md
- Offline Inference:
    - Qwen2.5-Omni: user_guide/examples/offline_inference/qwen2_5_omni.md
    - Qwen2.5-Image: user_guide/examples/offline_inference/qwen_image.md
```
Suggested change:

```diff
-    - Qwen2.5-Image: user_guide/examples/offline_inference/qwen_image.md
+    - Qwen-Image: user_guide/examples/offline_inference/qwen_image.md
```


Purpose
Add the missing Qwen-Image doc to the user guide.
Test Plan
Test Result
https://vllm-omni.readthedocs.io/en/latest/