
Conversation

@toubatbrian (Contributor) commented Nov 6, 2025

Implement AgentHandoffItem into chat context

changeset-bot commented Nov 6, 2025

⚠️ No Changeset found

Latest commit: d57c817

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


@simllll (Contributor) left a comment

What's the idea of adding it to the chat context and then filtering out again? Just curious about the reason behind this feature? 🤔

@toubatbrian (Contributor, Author)
Hey @simllll, those events will be recorded and sent to observability for traces. They are filtered out before being passed to the LLM, since the LLM does not support a "handoff" message type.
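A minimal TypeScript sketch of that filtering step (the item shapes, the `agent_handoff` tag, and the `toLLMItems` helper are illustrative assumptions, not the PR's actual API):

```typescript
// Illustrative item shapes; the real ChatContext types in agents-js differ.
type ChatItem =
  | { type: 'message'; role: 'user' | 'assistant' | 'system'; content: string }
  | { type: 'agent_handoff'; oldAgentId: string; newAgentId: string };

// Handoff items stay in the stored context (for traces/observability),
// but are stripped from the view handed to the LLM, which only
// understands message-like items.
function toLLMItems(items: ChatItem[]): ChatItem[] {
  return items.filter((item) => item.type !== 'agent_handoff');
}

const ctx: ChatItem[] = [
  { type: 'message', role: 'user', content: 'Book me a flight.' },
  { type: 'agent_handoff', oldAgentId: 'triage', newAgentId: 'booking' },
  { type: 'message', role: 'assistant', content: 'Sure, where to?' },
];

console.log(toLLMItems(ctx).length); // 3 items stored, 2 reach the LLM
```

The full context (handoffs included) stays available for tracing and export, while the LLM only ever sees message-like items.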


export interface AgentOptions<UserData> {
id?: string;
instructions: string;
@Shubhrakanti (Contributor) commented Nov 6, 2025


why is this optional?

@toubatbrian (Contributor, Author)

This also follows the Python agents framework:

class Agent:
    def __init__(
        self,
        *,
        instructions: str,
        id: str | None = None,
        chat_ctx: NotGivenOr[llm.ChatContext | None] = NOT_GIVEN,
        tools: list[llm.FunctionTool | llm.RawFunctionTool] | None = None,
        turn_detection: NotGivenOr[TurnDetectionMode | None] = NOT_GIVEN,
        stt: NotGivenOr[stt.STT | STTModels | str | None] = NOT_GIVEN,
        vad: NotGivenOr[vad.VAD | None] = NOT_GIVEN,
        llm: NotGivenOr[llm.LLM | llm.RealtimeModel | LLMModels | str | None] = NOT_GIVEN,
        tts: NotGivenOr[tts.TTS | TTSModels | str | None] = NOT_GIVEN,
        mcp_servers: NotGivenOr[list[mcp.MCPServer] | None] = NOT_GIVEN,
        allow_interruptions: NotGivenOr[bool] = NOT_GIVEN,
        min_consecutive_speech_delay: NotGivenOr[float] = NOT_GIVEN,
        use_tts_aligned_transcript: NotGivenOr[bool] = NOT_GIVEN,
        min_endpointing_delay: NotGivenOr[float] = NOT_GIVEN,
        max_endpointing_delay: NotGivenOr[float] = NOT_GIVEN,
    ) -> None:
        tools = tools or []
        if type(self) is Agent:
            self._id = "default_agent"
        else:
            self._id = id or misc.camel_to_snake_case(type(self).__name__)
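A hypothetical TypeScript analogue of that default-`id` logic (the `camelToSnakeCase` helper and the class shape are assumptions for illustration, not the actual agents-js code):

```typescript
// Assumed helper mirroring Python's misc.camel_to_snake_case.
function camelToSnakeCase(name: string): string {
  return name.replace(/([a-z0-9])([A-Z])/g, '$1_$2').toLowerCase();
}

class Agent {
  readonly id: string;

  constructor(opts: { id?: string; instructions: string }) {
    // Mirroring the Python snippet above: a bare Agent instance gets a
    // fixed id, while subclasses derive one from their class name when
    // no explicit id is given.
    if (new.target === Agent) {
      this.id = 'default_agent';
    } else {
      this.id = opts.id ?? camelToSnakeCase(new.target.name);
    }
  }
}

class BookingAgent extends Agent {}

console.log(new BookingAgent({ instructions: 'help book flights' }).id); // "booking_agent"
```

So `id` can stay optional: a subclass with no explicit id still gets a stable, readable default derived from its class name.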

@simllll (Contributor) commented Nov 6, 2025

@toubatbrian I see, thanks for the insights and the quick reply.

My thoughts: if we are talking about OpenTelemetry support or similar, is it really necessary to put it in the chat context at all? Would it be enough to "fake" this call or find some other way to make it traceable? (E.g. if we observe a special function for tracing, we could think about a flag or a second function just for logging purposes.)
Right now the chat context exists only for the LLM; if that is no longer true, it opens the door to more LLM-unrelated output.
Not saying this doesn't make sense, but it's a different approach from the one we have today.

@toubatbrian (Contributor, Author)

Hey @simllll, I see your point, and it totally makes sense! The main goal is to achieve as much parity as possible with the Python agent framework, and adding the handoff object to the chat context is what is currently implemented on the Python side.

The other reason for tying the handoff to the chat context is that we'll add support for exporting the context as JSON once a session finishes, so developers can run things like evals on it. Having the agent handoff info as part of the chat context is useful for that case.
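A sketch of what such an export could look like (the item shapes and the `exportContext` function are hypothetical; the PR only mentions the plan):

```typescript
// Hypothetical context-export sketch; the real exported shapes will differ.
interface HandoffItem {
  type: 'agent_handoff';
  oldAgentId: string;
  newAgentId: string;
}

interface MessageItem {
  type: 'message';
  role: string;
  content: string;
}

type ContextItem = HandoffItem | MessageItem;

// Handoff items stay in the export so offline tooling (evals, traces)
// can see exactly when control moved between agents.
function exportContext(items: ContextItem[]): string {
  return JSON.stringify({ items }, null, 2);
}

const exported = exportContext([
  { type: 'message', role: 'user', content: 'hi' },
  { type: 'agent_handoff', oldAgentId: 'triage', newAgentId: 'support' },
]);
```

Because the handoff is an ordinary chat item, the export needs no side channel: one serialization of the context captures both the conversation and the agent transitions.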

@toubatbrian toubatbrian changed the title brianyin/ajs-319-agenthandoff-chat-item Add AgentHandoff Chat Item Nov 12, 2025
@toubatbrian toubatbrian merged commit 209661f into brian/wait-python-1.3 Nov 13, 2025
1 check passed
@toubatbrian toubatbrian deleted the brian/handoff-chat-item branch November 13, 2025 14:19