tool support #41

@danny70437

Description

Feature Request: Tool Support

Problem

Hi Vitalii,

We are now running the system in production. While testing native function calls (tool support), I found that they don't work. I dug deeper into your code and saw that the tools keyword is filtered out of the request; the response is also rewritten in a way that drops the tool-call data.

Is there a reason you removed tool support from lm-proxy?

Best regards,
Daniel

Details

Request Side (Client → Proxy → LLM)

The ChatCompletionRequest class in base_types.py currently contains these fields:

  • model, messages, stream, max_tokens, temperature, top_p, n, stop
  • presence_penalty, frequency_penalty, user

Missing fields:

  • tools - List of tools available for the LLM
  • tool_choice - Controls how the LLM should use tools
  • Other OpenAI-compatible parameters (logit_bias, logprobs, seed, etc.)
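For illustration, a minimal sketch of what the extended request model could look like, assuming ChatCompletionRequest is a Pydantic (v2) model; the actual base class and defaults in base_types.py may differ:

```python
from typing import Any, Dict, List, Optional, Union
from pydantic import BaseModel, ConfigDict

class ChatCompletionRequest(BaseModel):
    # Existing fields, as listed above
    model: str
    messages: List[Dict[str, Any]]
    stream: bool = False
    max_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    n: Optional[int] = None
    stop: Optional[Union[str, List[str]]] = None
    presence_penalty: Optional[float] = None
    frequency_penalty: Optional[float] = None
    user: Optional[str] = None

    # Fields this issue asks for
    tools: Optional[List[Dict[str, Any]]] = None
    tool_choice: Optional[Union[str, Dict[str, Any]]] = None

    # Other OpenAI-compatible parameters
    logit_bias: Optional[Dict[str, float]] = None
    logprobs: Optional[bool] = None
    seed: Optional[int] = None

    # Pass unknown OpenAI parameters through instead of rejecting them;
    # protected_namespaces=() silences Pydantic's warning about the `model` field.
    model_config = ConfigDict(extra="allow", protected_namespaces=())
```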

Response Side (LLM → Proxy → Client)

The response from the LLM contains tool-call information (e.g., tool_calls) in the __dict__ of the LLMResponse object, but this is currently not forwarded correctly to the client.
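As one possible shape of the fix, a hypothetical sketch of the response-mapping step; build_client_message and the content attribute are illustrative assumptions, and only tool_calls comes from the issue:

```python
from typing import Any, Dict

def build_client_message(llm_response: Any) -> Dict[str, Any]:
    """Hypothetical helper: map the upstream LLMResponse to the OpenAI-style
    assistant message returned to the client."""
    message: Dict[str, Any] = {
        "role": "assistant",
        "content": getattr(llm_response, "content", None),  # attribute name assumed
    }
    # tool_calls sits in the response object's __dict__ but is dropped today;
    # forward it verbatim when present so the client can execute the calls.
    tool_calls = llm_response.__dict__.get("tool_calls")
    if tool_calls is not None:
        message["tool_calls"] = tool_calls
    return message
```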

Expected Behavior

  1. Request: All OpenAI-compatible parameters (including tools and tool_choice) must be forwarded to the LLM unchanged
  2. Response: Tool call information must be forwarded to the client in the response (see the end-to-end example after this list)
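For verification, a hypothetical end-to-end check using the official openai Python SDK. The base URL, route, and model name are assumptions about a local lm-proxy deployment; adjust them to your setup:

```python
# Hypothetical end-to-end check: send a request with tools through the proxy
# and verify that tool calls survive the round trip. base_url and model are
# assumptions, not lm-proxy defaults.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="dummy")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # illustrative tool, not part of lm-proxy
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
    tool_choice="auto",
)

# With both fixes in place, this should print the model's tool calls
# instead of None.
print(resp.choices[0].message.tool_calls)
```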

Reference

OpenAI Chat Completions API: https://platform.openai.com/docs/api-reference/chat
