sidebar_label: Limitations of Responses API
sidebar_position: 1
---

## Issues
This document outlines known limitations and inconsistencies between Llama Stack's Responses API and OpenAI's Responses API. The comparison reflects OpenAI's API as of October 6, 2025 (OpenAI client version `openai==1.107`).
See the OpenAI [changelog](https://platform.openai.com/docs/changelog) for details of any new functionality added since that date. Links to issues are included so readers can check status, post comments, and subscribe for updates on the limitations that interest them. We would also welcome feedback on any use cases you try that do not work, to help prioritize the pieces left to implement.
Please open new issues in the [meta-llama/llama-stack](https://github.com/meta-llama/llama-stack) GitHub repository with details of anything that does not work and does not already have an open issue.
### Instructions

**Status:** Partial Implementation + Work in Progress

Streaming functionality for the Responses API is partially implemented and works to some extent, but some of the streaming response objects needed for full compatibility are still missing.

OpenAI's platform supports [templated prompts using a structured language](https://platform.openai.com/docs/guides/text?api-mode=responses#reusable-prompts). These templates can be stored server-side for organizational sharing. This feature is under development for Llama Stack.
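As a sketch of what this looks like on OpenAI's side, a stored prompt is referenced by ID rather than inlined. The prompt ID and variable name below are hypothetical placeholders, not real identifiers:

```python
# Sketch of an OpenAI Responses request body that references a server-stored
# prompt template. The prompt ID and variable name are hypothetical
# placeholders; Llama Stack does not support this shape yet.
request_body = {
    "model": "gpt-4o",
    "prompt": {
        "id": "pmpt_abc123",                    # hypothetical stored-prompt ID
        "variables": {"customer_name": "Ada"},  # fills template placeholders
    },
}

# Until stored prompts land in Llama Stack, the working equivalent is to
# inline the text via the `instructions` parameter instead:
fallback_body = {
    "model": "gpt-4o",
    "instructions": "Greet the customer politely by name.",
    "input": "The customer's name is Ada.",
}

print(sorted(request_body["prompt"]))  # → ['id', 'variables']
```

The fallback illustrates that only the server-side storage and templating are missing; plain `instructions` text works today.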
---
### Web-search tool compatibility

**Status:** Partial Implementation
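For reference, enabling the hosted web-search tool in a Responses request takes the following shape. This is a sketch based on OpenAI's documented tool types; which variants Llama Stack accepts may differ:

```python
# Sketch: attaching the hosted web-search tool to a Responses request.
# The tool type string follows OpenAI's documentation; partial support in
# Llama Stack means some variants or options may not work.
request_body = {
    "model": "gpt-4o",
    "input": "What changed in the latest OpenAI client release?",
    "tools": [
        {"type": "web_search"},  # hosted web-search tool
    ],
}

tool_types = [t["type"] for t in request_body["tools"]]
print(tool_types)  # → ['web_search']
```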
---

### Tool choice

In OpenAI's API, the `tool_choice` parameter allows you to set restrictions or requirements on which tools the model calls.
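A sketch of the main `tool_choice` settings, following OpenAI's documented shapes; the `get_weather` function below is a hypothetical example:

```python
# Sketch of `tool_choice` values in a Responses request. "auto" lets the
# model decide, while naming a specific function forces that exact tool.
# The get_weather function below is a hypothetical example.
base = {
    "model": "gpt-4o",
    "input": "What's the weather in Lima?",
    "tools": [{
        "type": "function",
        "name": "get_weather",  # hypothetical example function
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }],
}

auto_call = {**base, "tool_choice": "auto"}
forced_call = {**base, "tool_choice": {"type": "function", "name": "get_weather"}}

print(forced_call["tool_choice"]["name"])  # → get_weather
```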
### Safety identifiers

**Status:** Not Implemented

OpenAI's platform allows account holders to track agentic users via a safety identifier passed with each request. When requests violate moderation or safety rules, account holders are alerted and automated actions can be taken. This capability is not currently available in Llama Stack.
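On OpenAI's side this is the `safety_identifier` request parameter. A sketch, assuming that parameter name; hashing the username is one privacy-preserving way to derive a stable identifier, and Llama Stack does not act on it today:

```python
import hashlib

# Sketch: attaching a stable, privacy-preserving safety identifier to a
# Responses request. Hashing the username avoids sending raw account names.
# Llama Stack does not currently honor this parameter.
def request_for_user(username: str, text: str) -> dict:
    user_hash = hashlib.sha256(username.encode()).hexdigest()
    return {
        "model": "gpt-4o",
        "input": text,
        "safety_identifier": user_hash,  # stable per end user
    }

body = request_for_user("ada@example.com", "Hello!")
print(len(body["safety_identifier"]))  # → 64
```

The same user always maps to the same identifier, which is what lets the provider correlate abusive traffic to an account.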
---
### Connectors

**Status:** Not Implemented

Connectors are MCP servers maintained and managed by the Responses API provider. OpenAI has documented their connectors at [https://platform.openai.com/docs/guides/tools-connectors-mcp](https://platform.openai.com/docs/guides/tools-connectors-mcp).

**Open Questions:**

- Should Llama Stack include built-in support for some, all, or none of OpenAI's connectors?
- Should there be a mechanism for administrators to add custom connectors via `config.yaml` or an API?
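For context, OpenAI exposes connectors through its MCP tool type. A sketch of the documented shape; the connector ID follows OpenAI's connector docs, and the OAuth token is a placeholder:

```python
# Sketch: using a provider-managed connector via the MCP tool type, following
# the shape in OpenAI's connector documentation. The authorization token is
# a placeholder; Llama Stack has no equivalent mechanism yet.
request_body = {
    "model": "gpt-4o",
    "input": "Summarize my most recent document.",
    "tools": [{
        "type": "mcp",
        "server_label": "google_drive",
        "connector_id": "connector_googledrive",  # per OpenAI's connector docs
        "authorization": "<oauth-access-token>",  # placeholder credential
        "require_approval": "never",
    }],
}

print(request_body["tools"][0]["type"])  # → mcp
```

Because a connector is just a managed MCP server, the open questions above reduce to who hosts and credentials these entries: the stack, the administrator, or the end user.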
---
It also enables users to get logprobs for alternative tokens.

### Max Tool Calls

The Responses API can accept a `max_tool_calls` parameter that limits the number of tool calls that may be executed for a given response. This feature needs full implementation and documentation.
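A sketch of capping tool calls with `max_tool_calls`, following the parameter shape in OpenAI's documentation; the `lookup` function tool is a hypothetical example:

```python
# Sketch: limiting how many tool calls a single response may execute.
# `max_tool_calls` caps the total across all listed tools. The lookup
# function here is a hypothetical example.
request_body = {
    "model": "gpt-4o",
    "input": "Look up the top three results and compare them.",
    "tools": [{
        "type": "function",
        "name": "lookup",  # hypothetical example function
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    }],
    "max_tool_calls": 3,  # stop after at most three tool invocations
}

print(request_body["max_tool_calls"])  # → 3
```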
---
### Max Output Tokens

**Status:** Not Implemented
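For reference, the OpenAI-side parameter is `max_output_tokens`. A sketch of a request plus a check for whether the cap was hit; the `response` dict below is a mocked example of the documented shape, not output from a real call:

```python
# Sketch: bounding response length with `max_output_tokens`, and checking
# whether the response stopped at that cap. The `response` dict is a mocked
# example of the documented shape, not real API output.
request_body = {
    "model": "gpt-4o",
    "input": "Write a one-line summary of MCP.",
    "max_output_tokens": 64,  # hard cap on generated tokens
}

# Mocked response illustrating the `incomplete_details` field returned
# when generation stops early:
response = {
    "status": "incomplete",
    "incomplete_details": {"reason": "max_output_tokens"},
}

hit_cap = (
    response["status"] == "incomplete"
    and response["incomplete_details"]["reason"] == "max_output_tokens"
)
print(hit_cap)  # → True
```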
The return object from a call to Responses includes a field indicating why a response is incomplete.