
Fix: Safe handling for multimodal_config to avoid 'NoneType' object has no attribute '__dict__' error #227

Merged
hsliuustc0106 merged 1 commit into vllm-project:main from qibaoyuan:main on Dec 7, 2025

Conversation

@qibaoyuan (Contributor) commented on Dec 7, 2025


Purpose

This PR fixes an issue where multimodal_config may be None, causing:

AttributeError: 'NoneType' object has no attribute '__dict__'

The original implementation assumed that multimodal_config is always an object with a __dict__ attribute, but in real configurations it may be:

None

A dict

An object with attributes

This PR adds a safe, concise one-line implementation that supports all cases without changing existing behavior.
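The PR description does not quote the one-liner itself; a minimal sketch of such a safe normalization (the helper name is hypothetical, not the actual vllm-omni call site) could look like:

```python
def normalize_multimodal_config(multimodal_config):
    """Normalize multimodal_config to a plain dict.

    Handles the three shapes listed above: None, an existing dict,
    and an arbitrary config object carrying settings as attributes.
    (Hypothetical helper; the real change lives inside vllm-omni.)
    """
    if multimodal_config is None:
        return {}
    if isinstance(multimodal_config, dict):
        return multimodal_config
    # Plain Python objects expose their attributes via vars()/__dict__,
    # which is what the original code accessed unconditionally.
    return vars(multimodal_config)
```

Branching on None and dict before touching `__dict__` is what removes the AttributeError, since calling `vars()` on None (or on a dict instance) would raise.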

Test Plan

Initialize Omni model with:

multimodal_config = None

multimodal_config = {}

multimodal_config = {"foo": "bar"}

custom multimodal config objects

Test Result

All cases load successfully.



Commit message: Fix: Safe handling for multimodal_config to avoid 'NoneType' object has no attribute '__dict__' error

Signed-off-by: Baoyuan Qi <[email protected]>
@hsliuustc0106 (Collaborator) left a comment:

lgtm, thx

@hsliuustc0106 hsliuustc0106 merged commit cfe7ad4 into vllm-project:main Dec 7, 2025
4 checks passed
LawJarp-A pushed a commit to LawJarp-A/vllm-omni that referenced this pull request Dec 12, 2025
faaany pushed a commit to faaany/vllm-omni that referenced this pull request Dec 19, 2025
princepride pushed a commit to princepride/vllm-omni that referenced this pull request Jan 10, 2026