[Feat] Rename AsyncOmniLLM -> AsyncOmni#103

Merged
Gaohan123 merged 2 commits into vllm-project:main from congw729:rename_asyncomni
Nov 30, 2025

Conversation


congw729 (Contributor) commented Nov 29, 2025

Purpose

Rename: AsyncOmniLLM -> AsyncOmni and async_omni_llm.py -> async_omni.py

This refactoring simplifies the naming convention by removing the redundant "LLM" suffix from the class name and shortening the module filename.

Changes:

  • Class name: AsyncOmniLLM -> AsyncOmni
  • Module file: vllm_omni/entrypoints/async_omni_llm.py -> vllm_omni/entrypoints/async_omni.py
  • Module path: vllm_omni.entrypoints.async_omni_llm -> vllm_omni.entrypoints.async_omni
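For downstream code that has to run against both pre- and post-rename releases, one option is a guarded import. This is only a sketch, not part of this PR; the final `None` fallback exists solely so the snippet runs in an environment where vllm_omni is not installed:

```python
# Try the new module path first, then fall back to the old one.
try:
    from vllm_omni.entrypoints.async_omni import AsyncOmni  # post-rename path
except ImportError:
    try:
        # Pre-rename path: alias the old class name to the new one.
        from vllm_omni.entrypoints.async_omni_llm import AsyncOmniLLM as AsyncOmni
    except ImportError:
        # Neither version installed (e.g. in this standalone sketch).
        AsyncOmni = None
```

After this block, code can use `AsyncOmni` uniformly regardless of which release is installed.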

Scope of modifications:

  • Core class definition and docstrings
  • All import statements across the codebase
  • Public API exports in __init__.py
  • Function names and variable names containing async_omni_llm fragments
  • Documentation (API references, design docs, configuration guides)
  • Example code and README files
  • Test fixtures and mock paths
  • Build/packaging metadata

Impact:

  • Breaking change for external code importing AsyncOmniLLM or using the old module path
  • All internal references have been updated consistently
  • No functional changes; purely a naming refactoring
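Because the rename is breaking with no transitional alias in this PR, a project that wants a softer migration could ship a deprecation shim under the old name. The sketch below is hypothetical: the `AsyncOmni` class here is a placeholder stand-in (the real one lives in vllm_omni.entrypoints.async_omni), and `deprecated_alias` is an illustrative helper, not vllm_omni API:

```python
import warnings


class AsyncOmni:
    """Placeholder for the renamed class; stands in for the real
    vllm_omni.entrypoints.async_omni.AsyncOmni, which is not imported here."""

    def __init__(self, model: str = "dummy") -> None:
        self.model = model


def deprecated_alias(new_cls, old_name):
    """Return a subclass that behaves like new_cls but warns on construction."""

    class _Alias(new_cls):
        def __init__(self, *args, **kwargs):
            warnings.warn(
                f"{old_name} is deprecated; use {new_cls.__name__} instead",
                DeprecationWarning,
                stacklevel=2,
            )
            super().__init__(*args, **kwargs)

    # Keep the old name visible in reprs and tracebacks.
    _Alias.__name__ = old_name
    _Alias.__qualname__ = old_name
    return _Alias


# Old name keeps working, but emits a DeprecationWarning when instantiated.
AsyncOmniLLM = deprecated_alias(AsyncOmni, "AsyncOmniLLM")
```

With such a shim exported from the package's `__init__.py`, `from vllm_omni import AsyncOmniLLM` would keep working for one release cycle while steering users to `AsyncOmni`.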

Test Plan

  • Build and install the wheel:
        ./scripts/build_wheel.sh --create-venv
        uv venv --python 3.12 --seed .venv-release-test
        source .venv-release-test/bin/activate
        uv pip install --reinstall "vllm==0.11.0" && uv pip install dist/vllm_omni-0.11.0rc1-py3-none-any.whl
        python -c "import vllm_omni; from vllm_omni import Omni; print(vllm_omni.__version__)"
  • pytest:
        pytest tests/test_omni_llm.py
  • mkdocs:
        uv pip install -e ".[docs]"
        mkdocs build
        mkdocs serve
  • Manually verified examples/online_serving/readme.md

Test Result

All passed.
Documentation built in 11.80 seconds



@congw729 congw729 changed the title Rename AsyncOmniLLM -> AsyncOmni [WIP] Rename AsyncOmniLLM -> AsyncOmni Nov 29, 2025
@congw729 congw729 marked this pull request as draft November 29, 2025 04:16
@congw729 congw729 marked this pull request as ready for review November 29, 2025 06:13
@congw729 congw729 changed the title [WIP] Rename AsyncOmniLLM -> AsyncOmni [Feat] Rename AsyncOmniLLM -> AsyncOmni Nov 29, 2025
@congw729 congw729 requested a review from Gaohan123 November 29, 2025 06:20
congw729 (Contributor, Author)

This PR is ready for merge. @Gaohan123 Please take a look.

hsliuustc0106 (Collaborator)

Please resolve the conflicts.

hsliuustc0106 (Collaborator) left a review comment

Resolve the conflict and get it merged ASAP.

congw729 (Contributor, Author)

Done, please have a look.

Signed-off-by: WANG Cong <[email protected]>
Gaohan123 (Collaborator) left a review comment

Please fix them.

Gaohan123 (Collaborator) left a review comment

LGTM. Comments left to modify later.

@Gaohan123 Gaohan123 merged commit 3427fb4 into vllm-project:main Nov 30, 2025
3 checks passed
princepride pushed a commit to princepride/vllm-omni that referenced this pull request Jan 10, 2026