[Misc] Align error handling with upstream vLLM v0.14.0#1122
Conversation
- Remove TODO comments about vllm-specific validation errors
- Pass exception objects directly to error response methods (str(e) -> e)
- Ensures proper error response formatting with vLLM v0.14.0

Signed-off-by: anna <[email protected]>
Pull request overview
This PR aligns error handling with upstream vLLM v0.14.0 by removing outdated TODO comments and passing exception objects directly to error handlers instead of converting them to strings first.
Changes:
- Removed 4 TODO comments about vLLM-specific validation errors
- Updated error handlers to pass exception objects directly (`e` instead of `str(e)`)
```diff
 except ValueError as e:
-    # TODO: Use a vllm-specific Validation Error
-    return self.create_error_response(str(e))
+    return self.create_error_response(e)
```
Inconsistent error handling: While this PR updates some error handlers to pass exception objects directly (lines 289, 317, 1287, 1312), there are similar error handlers in the same file that still use `str(e)`:

- Line 634: `self.create_streaming_error_response(str(e))` for RuntimeError
- Line 646: `self.create_streaming_error_response(str(e))` for Exception
- Line 1485: `self.create_error_response(str(e))` for RuntimeError
- Line 1570: `self.create_error_response(str(e))` for RuntimeError

For consistency with the upstream vLLM v0.14.0 alignment mentioned in the PR description, these should also be updated to pass the exception object directly (removing the `str()` wrapper).
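The practical difference between the two calling conventions can be sketched with a minimal, hypothetical helper. This is not vLLM's actual implementation; the function name merely mirrors the methods discussed above, and the type-to-status mapping is an assumption chosen for illustration:

```python
from http import HTTPStatus


def create_error_response(err):
    """Hypothetical sketch: accept either a plain string or an exception
    object, mirroring the str(e) -> e change discussed in this PR."""
    if isinstance(err, Exception):
        # An exception object still carries its type, which can drive
        # the error type and HTTP status in the response body.
        message = str(err)
        err_type = type(err).__name__
        status = (
            HTTPStatus.BAD_REQUEST
            if isinstance(err, ValueError)
            else HTTPStatus.INTERNAL_SERVER_ERROR
        )
    else:
        # A bare string has lost the exception type, so the helper can
        # only fall back to a generic error type and status code.
        message = err
        err_type = "BadRequestError"
        status = HTTPStatus.BAD_REQUEST
    return {"error": {"message": message, "type": err_type, "code": status.value}}


resp = create_error_response(ValueError("max_tokens must be positive"))
```

Passing the exception object preserves its type information, which is why converting to a string at the call site (`str(e)`) discards detail the response formatter could otherwise use.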
@hsliuustc0106 Thanks for letting me know! I've updated the remaining call sites. a85dad9
Update remaining callers to pass exception objects directly instead of converting to string, consistent with the updated function signatures.

Signed-off-by: anna <[email protected]>
Force-pushed from f897f30 to a85dad9.
Pull request overview
Copilot reviewed 2 out of 2 changed files in this pull request and generated no new comments.
LGTM
Purpose
Align error handling with upstream vLLM v0.14.0.
When vLLM was updated to v0.14.0 in #878, the corresponding error handling improvements were not applied to OmniOpenAIServingChat. This PR completes the synchronization by removing the outdated TODO comments and passing exception objects directly to the error response methods (str(e) → e).
Test Plan
Start the vllm-omni server and send requests that trigger validation errors:
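As a sketch, such a request could be sent with a small Python script. The endpoint URL, port, model name, and payload below are assumptions for illustration, not values taken from the PR:

```python
import json
import urllib.error
import urllib.request

# Hypothetical local server address; adjust to your deployment.
URL = "http://localhost:8000/v1/chat/completions"

# max_tokens of -1 is an intentionally invalid value, chosen to
# exercise a server-side validation error path.
payload = {
    "model": "some-model",
    "messages": [{"role": "user", "content": "hi"}],
    "max_tokens": -1,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(resp.status, resp.read().decode())
except urllib.error.URLError as e:
    # HTTPError (a subclass) carries the server's status and body;
    # a plain URLError means the server is unreachable.
    if isinstance(e, urllib.error.HTTPError):
        print(e.code, e.read().decode())
    else:
        print("server unreachable:", e.reason)
```

With the fix applied, a request like this should come back as an HTTP 400 with a JSON error body rather than a generic 500.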
Test Result
Error responses are properly formatted with correct HTTP status codes:
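For reference, an OpenAI-compatible validation error body typically has the shape below. The message text and values here are illustrative assumptions, not copied from the PR's actual test output:

```python
# Illustrative OpenAI-style error body; field values are assumptions.
expected = {
    "error": {
        "message": "max_tokens must be at least 1, got -1",
        "type": "BadRequestError",
        "param": None,
        "code": 400,
    }
}
```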