[Bug]: AttributeError: 'EmbeddingChatRequest' object has no attribute 'mm_processor_kwargs' #15048

@hxyghostor

Description

Your current environment

When I deploy the dse-qwen2-2b-mrl-v1 embedding model, the server raises AttributeError: 'EmbeddingChatRequest' object has no attribute 'mm_processor_kwargs':
File "/home/tiger/.local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/applications.py", line 112, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
raise exc
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
await self.app(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/routing.py", line 714, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/routing.py", line 734, in app
await route.handle(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/home/tiger/.local/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
response = await f(request)
^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/vllm/entrypoints/utils.py", line 56, in wrapper
return handler_task.result()
^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 475, in create_embedding
generator = await handler.create_embedding(request, raw_request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/vllm/entrypoints/openai/serving_embedding.py", line 118, in create_embedding
) = await self._preprocess_chat(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/vllm/entrypoints/openai/serving_engine.py", line 454, in _preprocess_chat
if request.mm_processor_kwargs is not None:
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/tiger/.local/lib/python3.11/site-packages/pydantic/main.py", line 891, in __getattr__
raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'EmbeddingChatRequest' object has no attribute 'mm_processor_kwargs'
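The traceback above comes down to a single unconditional attribute access. As a minimal sketch (plain classes standing in for vLLM's pydantic models, names of my own invention), the chat-completion request type declares mm_processor_kwargs while the embedding request type does not, so the shared preprocessing helper fails only for embedding requests; a getattr() with a default would tolerate both:

```python
class ChatCompletionRequestSketch:
    def __init__(self):
        # Chat-completion requests declare the multimodal kwargs field...
        self.mm_processor_kwargs = None


class EmbeddingChatRequestSketch:
    def __init__(self):
        # ...but the embedding request never defines it.
        self.encoding_format = "float"


def preprocess_chat_sketch(request):
    # Mirrors the unconditional access at serving_engine.py line 454.
    if request.mm_processor_kwargs is not None:
        return request.mm_processor_kwargs
    return {}


def preprocess_chat_defensive(request):
    # A getattr() with a default tolerates request types without the field.
    return getattr(request, "mm_processor_kwargs", None) or {}


req = EmbeddingChatRequestSketch()
try:
    preprocess_chat_sketch(req)  # raises AttributeError, as in the report
except AttributeError as e:
    print(e)
print(preprocess_chat_defensive(req))  # {}
```

This is only an illustration of the failure mode, not vLLM's actual fix; the real models are pydantic BaseModels, which raise the same AttributeError for undeclared fields (pydantic/main.py, __getattr__).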

🐛 Describe the bug

Server command:
vllm serve /dse-qwen2-2b-mrl-v1 --port $PORT0 --host 0.0.0.0 --max-model-len 4096 --task embed --chat-template /server_vllm/template_dse_qwen2_vl.jinja

Client code:

import base64

import requests

api_url = f"http://localhost:{vllm_port}/v1/embeddings"

image_path = "/server_vllm/image_a7ec7349586043edb90e8601daa39286.png"
with open(image_path, "rb") as f:
    encoded_image = base64.b64encode(f.read())
encoded_image_text = encoded_image.decode("utf-8")
base64_qwen = f"data:image;base64,{encoded_image_text}"

data = {
    "model": "/dse-qwen2-2b-mrl-v1",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {
                        "url": base64_qwen,
                        "min_pixels": 42828,
                        "max_pixels": 2562828,
                    },
                },
                {"type": "text", "text": "..."},
            ],
        },
    ],
    "encoding_format": "float",
}

response = requests.post(api_url, json=data)
response.raise_for_status()
chat_response = response.json()
print("Chat response:", chat_response)
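As a sanity check (my sketch, not from the original report): the crash occurs inside _preprocess_chat, which only runs for "messages"-style requests, so a plain OpenAI-style "input" payload should take the completion preprocessing path instead and can confirm the server itself is healthy:

```python
import json


def build_input_style_payload(model: str, text: str) -> dict:
    """Build an OpenAI-style /v1/embeddings payload using "input" rather
    than "messages", avoiding the chat preprocessing path entirely."""
    return {
        "model": model,
        "input": text,
        "encoding_format": "float",
    }


payload = build_input_style_payload("/dse-qwen2-2b-mrl-v1", "hello world")
print(json.dumps(payload))

# With the server running, send it with:
#   requests.post(f"http://localhost:{vllm_port}/v1/embeddings", json=payload)
```

If this text-only request succeeds while the multimodal one fails, that narrows the bug to the EmbeddingChatRequest chat path shown in the traceback.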

I followed https://github.com/vllm-project/vllm/blob/main/examples/online_serving/openai_chat_embedding_client_for_multimodal.py and https://docs.vllm.ai/en/stable/serving/multimodal_inputs.html#multimodal-inputs, but I have no idea what happened.

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
