
Conversation

@jldec commented Aug 24, 2025

Trying out AI SDK v5 in this PR, using PR 391 from the Cloudflare Agents SDK.

Testing with a locally installed agents build from the v5-migration branch (see the file: link in package.json).

  • basic support with the Chat Agents SDK (works fine with OpenAI; works with Cloudflare Workers AI using cloudflare/ai#269)
  • support for tools and subagents with the Chat Agents SDK (works now 🎉)
  • render tool responses (currently shown as JSON)
  • capture subagent responses

@jldec commented Aug 24, 2025

Note - can't use Cloudflare Workers AI models yet - see cloudflare/ai#173

failure message

Uncaught TypeError: ReadableStreamDefaultController enqueue: Cannot enqueue into a stream that is already closed.

Screenshot 2025-08-24 at 18 52 16

Root cause is in the type:'error' chunk:

Screenshot 2025-08-24 at 18 28 18
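
For context, a minimal sketch of how this TypeError arises (illustrative only, not the agents code): once a ReadableStream controller has been closed, any later enqueue throws.

```ts
// Minimal reproduction of the client-side TypeError (illustrative only).
// Once the controller is closed -- e.g. after a type:'error' chunk -- any
// further enqueue throws.
new ReadableStream<string>({
  start(controller) {
    controller.enqueue("data: first chunk\n\n");
    controller.close(); // stream closed early
    controller.enqueue("data: late chunk\n\n");
    // TypeError: ReadableStreamDefaultController enqueue:
    //   Cannot enqueue into a stream that is already closed.
  },
});
```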

@jldec commented Aug 25, 2025

Known problems - the first two possibly caused by an incorrect implementation in this PR 😝

  • tool call responses show up temporarily but are not persisted (they disappear on reload)
  • subagent calls that use this.saveMessages() return immediately instead of waiting for the subagent response to finish like before; suspect an issue with _drainStream()
  • installing MCP tools results in errors (the fix is s/parameters/inputSchema/ here; see the sketch after this list)
    Invalid schema for function 'tool_xxx on server
    ReadableStreamDefaultController enqueue: Cannot enqueue into a stream that is already closed. on client (caused by type:'error' in the data closing the stream)
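
As a reference for the s/parameters/inputSchema/ rename, a hedged sketch of an AI SDK v5 tool definition. The tool and schema below are made up for illustration; the actual MCP tool wiring lives in the agents SDK.

```ts
import { tool } from "ai";
import { z } from "zod";

// AI SDK v5 expects `inputSchema` where v4 used `parameters`.
const getWeather = tool({
  description: "Get the weather for a city",
  // v4: parameters: z.object({ city: z.string() }),
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => `It is sunny in ${city}`,
});
```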

@threepointone commented

cc @whoiskatrin

@jldec commented Aug 26, 2025

@whoiskatrin after the fixes in cloudflare/agents@72df075, the MCP tools are working, but now response messages appear to be persisted as raw SSE data. Was that intentional?

e.g. when running guides/human-in-the-loop

Screenshot 2025-08-26 at 14 09 16

@jldec commented Aug 26, 2025

type:'error' message data from the server still produces that client-side error during the translation of WebSocket response messages back into the fetch response stream.

E.g. if there is a problem with the model

Screenshot 2025-08-26 at 14 15 46
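
A minimal sketch of the kind of guard this points at, assuming the client bridges WebSocket messages into a fetch ReadableStream. All names here are illustrative, not the agents SDK internals: once a type:'error' chunk arrives, error the stream and stop enqueueing, so later messages never hit an already-closed controller.

```ts
// Hedged sketch: bridging WebSocket messages into a fetch response stream.
function bridgeSocketToResponse(socket: WebSocket): Response {
  let done = false;
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      socket.addEventListener("message", (event) => {
        if (done) return; // drop anything that arrives after close/error
        const chunk = JSON.parse(event.data as string);
        if (chunk.type === "error") {
          done = true;
          // `errorText` is an assumed field name for illustration
          controller.error(new Error(chunk.errorText ?? "upstream error"));
          return;
        }
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
      });
      socket.addEventListener("close", () => {
        if (!done) {
          done = true;
          controller.close();
        }
      });
    },
  });
  return new Response(stream, { headers: { "content-type": "text/event-stream" } });
}
```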

@jldec commented Aug 26, 2025

Many of the issues above have been resolved in recent PR commits.
Current outstanding issues:

  • tool call responses and error responses are returned as one big "text"-type message with the raw SSE data stream as the text value (instead of UIMessage types). See screenshots below.

  • streaming text responses arrive as text chunks over WebSockets, but do not trigger streaming updates in the UI

  • calling await this.saveMessages() still returns without waiting for the full response stream from onChatMessage to be drained (see the sketch after the screenshots)

Screenshot 2025-08-26 at 18 44 10 Screenshot 2025-08-26 at 18 40 50
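
For the third item above, a sketch of what "draining" means here before saveMessages() resolves. This is illustrative only; the real _drainStream() lives in the agents SDK.

```ts
// Consume the whole response stream and resolve only when it is finished.
async function drainStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text; // resolves only after the stream has been fully consumed
}
```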

@jldec commented Aug 27, 2025

Response streaming to the client is working much better now - thanks @whoiskatrin 🙏

The only outstanding issue I see is that error responses are not being caught in the data.error path on the client:
Screenshot 2025-08-27 at 07 27 36

TODO (on this end)

  • render messages properly (not as JSON) for text and tool calls
  • saveMessages() now returns the response stream, which is great. Figure out a way to stream that back from the subagent newMessage() call using streaming tool call updates.

@whoiskatrin commented

@jldec could you check again please, just in case - I just pushed some changes a minute ago

@jldec commented Aug 27, 2025

@whoiskatrin now I'm seeing tool call and error responses as type:"text" blobs with the multi-line SSE data stream on the client (like earlier).

Screenshot 2025-08-27 at 09 11 12

Also, client-side streaming UI stopped working

@jldec commented Aug 27, 2025

Update as of cloudflare/agents#391 (comment)

  • streaming is working
  • a type:'error' message triggers a client-side throw
  • responses are not being persisted (they show up in the client, but disappear after a reload from the server)

@jldec commented Aug 28, 2025

As of cloudflare/agents@ec0ca29

Sending back just the text-deltas from the server side prevents tool call results from arriving at the client (see the sketch below the screenshot).

from guides/human-in-the-loop
Screenshot 2025-08-28 at 10 20 47
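
A hedged sketch of that failure mode; the relay function, chunk shape, and send callback below are illustrative, not the agents SDK implementation. If the relay forwards only text-delta chunks, tool result chunks never reach the client.

```ts
// Illustrative relay loop, not the agents SDK implementation.
// `chunks` stands in for the UI message stream; `send` for the WebSocket send.
async function relay(
  chunks: AsyncIterable<{ type: string }>,
  send: (data: string) => void
) {
  for await (const chunk of chunks) {
    // Forwarding only text deltas drops tool-call / tool-result chunks:
    // if (chunk.type !== "text-delta") continue;
    // Forwarding every chunk type keeps tool results flowing to the client.
    send(JSON.stringify(chunk));
  }
}
```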

@threepointone commented

Can you try workers-ai-provider@beta for Workers AI models and tell me if it works for you? Please and thank you!

@threepointone commented Sep 3, 2025

Sorry, I mean via npm i https://pkg.pr.new/cloudflare/ai/workers-ai-provider@269

@jldec commented Sep 3, 2025

@threepointone commented

Awesome! Released

@jldec commented Sep 10, 2025

Works - and agents v0.1.0 shipped today. Merging with JSON tool responses, and without capturing subagent responses.

jldec marked this pull request as ready for review on September 10, 2025
jldec merged commit e656d8d into main on September 10, 2025
jldec deleted the ai-sdk-v5 branch on September 10, 2025