fix: update responses limitations doc to track latest state #4392
base: main
Conversation
Force-pushed from 3a9d560 to 050457b
Force-pushed from 050457b to 25e7e10
s-akhtar-baig left a comment
Hey @iamemilio, thank you for updating the document!
Can we please update the `include["message.output_text.logprobs"]` section? The relevant PR (#4261) was merged last week.
Force-pushed from a477a77 to 21ef0fb
> This change adds all the following parameters to the input and output objects for Responses and Chat Completions in Llama Stack for full API compatibility. It also implements `message.output_text.logprobs`, allowing users to get logprobs outputs from their inference requests. *Note* that the other fields are just stubs for now, and will not work when invoked since Llama Stack does not yet support built-in tools.
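For context, a request that opts into logprobs via the `include` parameter might look like the following sketch. It only builds the JSON payload rather than calling a server; the field names follow the OpenAI-compatible Responses API, and the model name is a placeholder, not a real deployment.

```python
import json

# Hypothetical payload for an OpenAI-compatible /v1/responses request.
# "message.output_text.logprobs" is the include value discussed above;
# "example-model" is a placeholder model name.
payload = {
    "model": "example-model",
    "input": "What is the capital of France?",
    "include": ["message.output_text.logprobs"],
}

# Serialize to the JSON body a client would POST to the server.
body = json.dumps(payload)
print(body)
```

With this `include` value set, a server that implements the feature would attach per-token logprobs to the `output_text` parts of the response message.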
Thanks, Emilio!
Just wondering whether it would make sense to set the status of this section to "partial implementation" or something along those lines. What do you think?
Yes, I agree with @s-akhtar-baig. In particular, I would like to see the parts that are still missing listed in the open issues section, plus a note here saying that `message.output_text.logprobs` is now working even though the other values for this parameter are not.
Let me instead make it clear that those values are now accepted as API fields, but are not implemented. I can then link it back to the issue that is still pending.
jwm4 left a comment
Overall this looks very helpful. I have a few editorial comments.
> The type of the web search tool. One of `web_search` or `web_search_2025_08_26`.
>
> **Issue:** [#3321](https://github.com/llamastack/llama-stack/issues/3321)
>
> Llama Stack now supports both `web_search` and `web_search_2025_08_26` types, matching OpenAI's API. For backward compatibility, Llama Stack also supports the `web_search_preview` and `web_search_preview_2025_03_11` types.
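As a sketch of what the alias handling described in the quoted text implies, the following normalizes the legacy preview type names to the current ones. The mapping table and function are illustrative assumptions, not code taken from Llama Stack.

```python
# Canonical web search tool types plus the backward-compatible aliases
# mentioned above. The alias-to-canonical mapping is an assumption for
# illustration; Llama Stack's internal handling may differ.
WEB_SEARCH_TOOL_TYPES = {
    "web_search": "web_search",
    "web_search_2025_08_26": "web_search_2025_08_26",
    "web_search_preview": "web_search",                # legacy alias (assumed)
    "web_search_preview_2025_03_11": "web_search",     # legacy alias (assumed)
}

def normalize_tool_type(tool_type: str) -> str:
    """Return the canonical type for a (possibly legacy) web search tool type."""
    try:
        return WEB_SEARCH_TOOL_TYPES[tool_type]
    except KeyError:
        raise ValueError(f"unsupported web search tool type: {tool_type}")

print(normalize_tool_type("web_search_preview"))
```

Splitting the doc as suggested would mean the alias support above lands under Resolved Issues, while unsupported sub-fields such as filters and user location stay listed as open gaps.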
This seems confusing now. I think it would make sense to split this: the parts that are still open (filters and user location) stay here, while the parts that are now done (the aliases) move to the Resolved Issues section. That requires some rewriting of both.
Thanks for the PR @iamemilio
s-akhtar-baig left a comment
Looks good to me, thanks, @iamemilio!
jwm4 left a comment
All my concerns have been addressed. This looks good to me now. Thanks!
What does this PR do?
Update the Responses limitations doc to track its status and gaps.