A better way to handle Agent stream output for client display #774
KanchiShimono
started this conversation in
General
Replies: 1 comment
I'm having the same question.
0 replies
Context
I want to return the Agent’s (LLM’s) stream output to the client token by token for users of my chat UI.
From what I understand, the most straightforward way to achieve this with the A2A protocol is to include the tokens in `TaskStatusUpdateEvent.status.message` in response to a `message/stream` request. However, the current Python and TypeScript SDK implementations have the following issues:

- `Task.history` records the history token by token, making it cumbersome to reconstruct conversation history across multiple turns.
- Each token-level status update is persisted to the `TaskStore`, which leads to heavy write load when using a database-backed `TaskStore`.

I believe it's preferable to store conversation history as a single message per complete Agent response.
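To illustrate the first issue, here is a minimal sketch of the token-by-token accumulation. The `Message` and `Task` classes below are simplified stand-ins for the A2A SDK types, not the real implementations; the point is only that one logical agent response becomes many history entries:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for the A2A types (names are assumptions,
# not the actual SDK classes).
@dataclass
class Message:
    role: str
    text: str

@dataclass
class Task:
    history: list = field(default_factory=list)

def stream_response(task: Task, tokens: list[str]) -> None:
    # Mirrors the behavior described above: each streamed token arrives in a
    # status-update message that is appended individually to Task.history.
    for token in tokens:
        task.history.append(Message(role="agent", text=token))

task = Task()
stream_response(task, ["Hel", "lo", ", ", "wor", "ld"])

# One logical agent response has become five history entries; reconstructing
# the turn requires concatenating them back together.
assert len(task.history) == 5
assert "".join(m.text for m in task.history) == "Hello, world"
```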
Question
Is there a better way to stream Agent output for client display, or are there plans for a dedicated streaming event type (e.g. `MessageDelta`) that does not update task state?
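The flow such an event would enable can be sketched as follows. `MessageDelta`, `send`, and `store_write` are all hypothetical names for illustration, assuming deltas are pushed to the client live while the `TaskStore` receives a single write per complete response:

```python
from dataclasses import dataclass

# Hypothetical event carrying a token to the client without touching task state.
@dataclass
class MessageDelta:
    task_id: str
    text: str

def stream_and_persist(task_id, tokens, send, store_write):
    buffer = []
    for token in tokens:
        # Client sees tokens as they arrive...
        send(MessageDelta(task_id=task_id, text=token))
        buffer.append(token)
    # ...but the store gets one write per complete response, not one per token.
    store_write(task_id, "".join(buffer))

sent, writes = [], []
stream_and_persist("t1", ["Hi", "!"], sent.append,
                   lambda tid, msg: writes.append((tid, msg)))
assert len(sent) == 2                 # two deltas reached the client
assert writes == [("t1", "Hi!")]      # one aggregated write hit the store
```

This keeps the history one-message-per-turn while preserving token-level latency for the UI.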