
Conversation

@iamemilio (Contributor)

What does this PR do?

Update the Responses limitations doc to track its status and gaps

@meta-cla bot added the CLA Signed label on Dec 15, 2025
@iamemilio force-pushed the responses_limitations branch 3 times, most recently from 3a9d560 to 050457b on December 15, 2025 16:21
@s-akhtar-baig (Contributor) left a comment


Hey @iamemilio, thank you for updating the document!

Can we please update the `include=["message.output_text.logprobs"]` entry? The relevant PR (#4261) was merged last week.

@iamemilio force-pushed the responses_limitations branch from a477a77 to 21ef0fb on December 17, 2025 15:38

This change adds all the following parameters to the input and output objects for responses and chat completions in Llama Stack, for full API compatibility.
It also implements `message.output_text.logprobs`, allowing users to get logprobs outputs from their inference requests. *Note* that the other fields are
just stubs for now, and will not work when invoked since Llama Stack does not yet support built-in tools.
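As a rough illustration of what opting into logprobs looks like from the client side (the model id and input text below are placeholders, not taken from this PR), a Responses request body using the `include` parameter might be shaped like:

```python
# Sketch of a Responses API request body that opts into logprobs output.
# The model id and prompt are placeholders, not from this PR.
payload = {
    "model": "llama3.2-3b-instruct",
    "input": "What is the capital of France?",
    # Ask the server to attach token logprobs to the output text.
    "include": ["message.output_text.logprobs"],
}
```

With an OpenAI-compatible client pointed at a Llama Stack server, a payload like this would be sent via something like `client.responses.create(**payload)`; other `include` values are accepted by the API but, per this PR, are stubs for now.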

Thanks, Emilio!

Just wondering whether it would make sense to set the status of this section to "partial implementation" or something along those lines. What do you think?


Yes, I agree with @s-akhtar-baig. In particular, I would like to see the parts that are still missing listed in the open issues section, plus a note here saying that `message.output_text.logprobs` now works even though the other values for this parameter do not.

@iamemilio (Author)

Let me instead make it clear that those values are now accepted as API fields but are not implemented, and link back to the issue that is still pending.
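A minimal sketch of the approach described here (the helper name and the stub value names are illustrative assumptions, not the actual Llama Stack code): accept every documented `include` value for API compatibility, but only act on the one that is implemented.

```python
# Hypothetical validator for the `include` parameter: every documented value
# is accepted at the API boundary, but only logprobs is implemented so far.
ACCEPTED_INCLUDE_VALUES = {
    "message.output_text.logprobs",
    "file_search_call.results",         # accepted, stub only (illustrative)
    "web_search_call.action.sources",   # accepted, stub only (illustrative)
}
IMPLEMENTED_INCLUDE_VALUES = {"message.output_text.logprobs"}

def check_include(values):
    """Reject unknown values; return the subset that will actually take effect."""
    for v in values:
        if v not in ACCEPTED_INCLUDE_VALUES:
            raise ValueError(f"unknown include value: {v}")
    return [v for v in values if v in IMPLEMENTED_INCLUDE_VALUES]
```

This keeps requests with stubbed values from failing validation while making it explicit that only the implemented value has any effect.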

@jwm4 (Contributor) left a comment

Overall this looks very helpful. I have a few editorial comments.



> The type of the web search tool. One of `web_search` or `web_search_2025_08_26`.
**Issue:** [#3321](https://github.com/llamastack/llama-stack/issues/3321)
Llama Stack now supports both `web_search` and `web_search_2025_08_26` types, matching OpenAI's API. For backward compatibility, Llama Stack also supports `web_search_preview` and `web_search_preview_2025_03_11` types.
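The alias handling described above could be sketched as follows (the mapping is an illustration based on this doc text, not the actual Llama Stack implementation; in particular, which canonical type each `*_preview*` alias resolves to is an assumption):

```python
# Hypothetical mapping from accepted web search tool types to a canonical
# type. The two `*_preview*` names are backward-compatible aliases.
WEB_SEARCH_TYPE_ALIASES = {
    "web_search": "web_search",
    "web_search_2025_08_26": "web_search_2025_08_26",
    "web_search_preview": "web_search",               # legacy alias (assumed)
    "web_search_preview_2025_03_11": "web_search",    # legacy alias (assumed)
}

def resolve_web_search_type(tool_type: str) -> str:
    """Resolve an incoming tool type to its canonical form, or reject it."""
    try:
        return WEB_SEARCH_TYPE_ALIASES[tool_type]
    except KeyError:
        raise ValueError(f"unsupported web search tool type: {tool_type}")
```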
This seems confusing now. I think it would make sense to split this into the parts that are still open (filters and user location) that stay here, and the parts that are now done (the aliases) in the Resolved Issues section, which I think requires some rewriting of both.

@jaideepr97 (Contributor)

Thanks for the PR, @iamemilio!
I think we can also mention that connectors will be supported via the Responses API soon, once #4263 lands.

@s-akhtar-baig (Contributor) left a comment


Looks good to me, thanks, @iamemilio!

@jwm4 (Contributor) left a comment


All my concerns have been addressed. This looks good to me now. Thanks!
