Environment Information
Windows 10 & Linux / venv: lazyllm
Reproduction Steps
SenseNova-V6-5-Turbo is recognized as a VLM (Vision-Language Model) elsewhere in LazyLLM, while OnlineChatModule assumes by default that it is a pure LLM (language-only model). The mismatch between the two type classifications triggers the assertion error (see the example code below).
Expected Behavior
OnlineChatModule should detect that SenseNova-V6-5-Turbo is a VLM-type model.
Actual Behavior
Instead, the model is detected as an LLM type inside OnlineChatModule but as a VLM type inside OnlineChatModuleBase, and this inconsistency ultimately raises the assertion error.
Screenshots / Logs
Example code:
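A minimal sketch of the failing call, assuming SenseNova credentials are already configured through LazyLLM's environment variables (the exact variable name in the comment is an assumption):

```python
import lazyllm

# Assumes SenseNova credentials are set in the environment,
# e.g. LAZYLLM_SENSENOVA_API_KEY (variable name is an assumption).
# SenseNova-V6-5-Turbo is a VLM, but OnlineChatModule classifies it as a plain LLM;
# the conflicting classification in OnlineChatModuleBase triggers the assertion error.
chat = lazyllm.OnlineChatModule(source="sensenova", model="SenseNova-V6-5-Turbo")
print(chat("Hello"))
```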

Error:
