[Bug]: an error occurred while passing the type of vlm #856

@chenzhex

Description

Environment Information

Windows 10 & Linux / venv: lazyllm

Reproduction Steps

SenseNova-V6-5-Turbo is recognized as a VLM (vision-language model) by LazyLLM, while OnlineChatModule assumes by default that it is a plain LLM (text-only language model). The mismatch between these two type determinations triggers the assertion error.

Expected Behavior

OnlineChatModule should detect that SenseNova-V6-5-Turbo is a VLM.

Actual Behavior

In reality, the model is detected as an LLM inside OnlineChatModule but as a VLM inside OnlineChatModuleBase, which ultimately leads to the assertion error.
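The mismatch described above can be sketched in isolation. This is a hypothetical, simplified model of the behavior, not LazyLLM's actual code: the names `KNOWN_VLM_MODELS`, `detect_model_type`, `choose_module_type`, and `type_hint` are all illustrative stand-ins for whatever the base class and wrapper really use.

```python
# Illustrative sketch of the reported type mismatch (not LazyLLM source).
# The base class recognizes certain model names as VLMs, while the
# wrapper defaults to "llm" and asserts the two determinations agree.

KNOWN_VLM_MODELS = {"SenseNova-V6-5-Turbo"}  # assumed base-class knowledge


def detect_model_type(model_name: str) -> str:
    """What the base class (OnlineChatModuleBase) effectively does:
    classify the model by its name."""
    return "vlm" if model_name in KNOWN_VLM_MODELS else "llm"


def choose_module_type(model_name: str, type_hint: str = "llm") -> str:
    """What the wrapper (OnlineChatModule) effectively does: it assumes
    'llm' by default and asserts that the hint matches the detected type."""
    detected = detect_model_type(model_name)
    assert type_hint == detected, (
        f"type mismatch: wrapper assumed {type_hint!r}, "
        f"base class detected {detected!r}"
    )
    return detected
```

Under this sketch, `choose_module_type("SenseNova-V6-5-Turbo")` raises `AssertionError` because the wrapper's default `"llm"` disagrees with the detected `"vlm"`; the expected fix is for the wrapper to run the same detection as the base class instead of assuming `"llm"`.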

Screenshots / Logs

Example code: [screenshot]

Error: [screenshot]

Metadata


Labels

bug: Something isn't working
