Feature Request: Tool Support
Problem
Hi Vitalii,
We are in production now. While testing the system with native function calls (tool support), I found that they do not work. I dug deeper into your code and saw that the tools keyword is filtered out of requests, and the response handling was changed so that it no longer supports the tool feature.
Is there a reason why you removed tool support from lm-proxy?
Best regards,
Daniel
Details
Request Side (Client → Proxy → LLM)
The `ChatCompletionRequest` class in `base_types.py` currently contains these fields:
- `model`, `messages`, `stream`, `max_tokens`, `temperature`, `top_p`, `n`, `stop`
- `presence_penalty`, `frequency_penalty`, `user`
Missing fields (a sketch of an extended model follows this list):
- `tools` - List of tools available for the LLM
- `tool_choice` - Controls how the LLM should use tools
- Other OpenAI-compatible parameters (`logit_bias`, `logprobs`, `seed`, etc.)
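For illustration, here is a minimal sketch of how the request model could be extended, assuming `ChatCompletionRequest` is a Pydantic model; the field types are my guesses based on the OpenAI API spec, not the actual lm-proxy code:

```python
# Hypothetical sketch of an extended ChatCompletionRequest in base_types.py.
# Assumes a Pydantic BaseModel; the real base class in lm-proxy may differ.
from typing import Any, Dict, List, Optional, Union

from pydantic import BaseModel


class ChatCompletionRequest(BaseModel):
    # Existing fields (as listed above)
    model: str
    messages: List[Dict[str, Any]]
    stream: bool = False
    max_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    n: Optional[int] = None
    stop: Optional[Union[str, List[str]]] = None
    presence_penalty: Optional[float] = None
    frequency_penalty: Optional[float] = None
    user: Optional[str] = None

    # Missing fields this issue asks for
    tools: Optional[List[Dict[str, Any]]] = None
    tool_choice: Optional[Union[str, Dict[str, Any]]] = None

    # Other OpenAI-compatible parameters
    logit_bias: Optional[Dict[str, float]] = None
    logprobs: Optional[bool] = None
    seed: Optional[int] = None
```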
Response Side (LLM → Proxy → Client)
The response from the LLM contains tool call information in the `__dict__` of the `LLMResponse` object (e.g., `tool_calls`), but this is currently not forwarded to the client correctly.
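A minimal sketch of the fix on the response side, assuming a hypothetical helper that builds the client-facing assistant message; the helper name and the `content` attribute are illustrative, not actual lm-proxy code:

```python
# Hypothetical sketch: copy tool_calls (if present) from the upstream
# LLMResponse into the message returned to the client.
def build_client_message(llm_response) -> dict:
    message = {
        "role": "assistant",
        "content": getattr(llm_response, "content", None),
    }
    # Per the description above, tool call data lives in the object's __dict__
    tool_calls = llm_response.__dict__.get("tool_calls")
    if tool_calls is not None:
        message["tool_calls"] = tool_calls
    return message
```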
Expected Behavior
- Request: All OpenAI-compatible parameters (including `tools`, `tool_choice`) must be forwarded unchanged to the LLM (see the example request below)
- Response: Tool call information must be forwarded to the client in the response
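For reference, this is the kind of request the proxy should pass through unchanged, here sent with the official openai Python SDK; the `base_url`, `api_key`, and `model` values are placeholders for wherever lm-proxy is deployed:

```python
# Example OpenAI-compatible request exercising tool support through the proxy.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="dummy")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    tool_choice="auto",
)

# With correct forwarding, the assistant message carries the tool calls
# chosen by the model instead of None.
print(response.choices[0].message.tool_calls)
```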
Reference
OpenAI Chat Completions API: https://platform.openai.com/docs/api-reference/chat