
docs(readme,examples): Update readme example, format examples #71

Open

sinderpl wants to merge 2 commits into charmbracelet:main from sinderpl:chore-update-readme-example

Conversation

@sinderpl

  • I have read CONTRIBUTING.md.
  • I have created a discussion that was approved by a maintainer (for new features).

@sinderpl sinderpl changed the title Chore update readme example docs(readme,examples): Update readme example, format examples Nov 10, 2025
@sinderpl
Author

sinderpl commented Nov 10, 2025

cc @meowgorithm as per #60 (comment).
Let me know if you want me to take out the example changes and keep this PR to the readme only.

@kujtimiihoxha
Contributor

@meowgorithm do we want to keep the existing readme example or change it?

birdmanmandbir added a commit to tta-lab/fantasy that referenced this pull request May 6, 2026
Three changes to bring fantasy's Responses API streaming into parity
with OpenAI's official codex CLI (codex-rs/codex-api/src/sse/responses.rs):

1. **Defer activeReasoning cleanup to end-of-stream.** The previous
   implementation deleted the entry on `response.output_item.done`,
   which meant any subsequent event for the same reasoning item (e.g. a
   late delta or a duplicate done) was silently dropped. The official
   codex parser keeps items addressable until the full stream completes.

2. **Capture done.Item.Summary on output_item.done.** The streaming
   summary delta path may already populate state.metadata.Summary via
   reasoning_summary_text.delta events, but the done event carries the
   authoritative final list. Prefer it when non-empty so partial-summary
   streams are corrected to the final shape.

3. **Add response.reasoning_text.delta handler.** Some gpt-5.x reasoning
   variants stream reasoning via this event channel (raw reasoning text
   keyed by ItemID + ContentIndex) instead of, or in addition to,
   reasoning_summary_text.delta. The official codex parser handles both;
   fantasy previously only handled the summary path, dropping raw
   reasoning text for affected models.
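The three fixes above can be sketched as a single event switch. This is a hypothetical Go sketch, not fantasy's actual code: the type and function names (`reasoningState`, `parser`, `handle`) are invented for illustration, and only the SSE event-type strings come from the commit message.

```go
package main

import "fmt"

type reasoningState struct {
	Summary []string // final summary parts for the item
	Text    string   // raw reasoning text accumulated from deltas
}

type parser struct {
	// Fix 1: entries stay addressable until the stream completes;
	// nothing is deleted on response.output_item.done.
	active map[string]*reasoningState
}

func newParser() *parser {
	return &parser{active: map[string]*reasoningState{}}
}

func (p *parser) state(itemID string) *reasoningState {
	s, ok := p.active[itemID]
	if !ok {
		s = &reasoningState{}
		p.active[itemID] = s
	}
	return s
}

// handle dispatches one SSE event; summary is only set on output_item.done.
func (p *parser) handle(eventType, itemID, delta string, summary []string) {
	switch eventType {
	case "response.reasoning_summary_text.delta":
		// Streaming summary path (already handled before this commit).
		s := p.state(itemID)
		s.Summary = append(s.Summary, delta)
	case "response.reasoning_text.delta":
		// Fix 3: raw reasoning text arrives on a separate channel from the
		// summary deltas; accumulate it so it is not dropped.
		p.state(itemID).Text += delta
	case "response.output_item.done":
		// Fix 2: the done event carries the authoritative summary list, so
		// prefer it when non-empty. Fix 1: do NOT delete the entry here.
		if len(summary) > 0 {
			p.state(itemID).Summary = summary
		}
	case "response.completed":
		// End of stream: only now is it safe to clear per-item state.
		p.active = map[string]*reasoningState{}
	}
}

func main() {
	p := newParser()
	p.handle("response.reasoning_text.delta", "rs_1", "step one; ", nil)
	p.handle("response.output_item.done", "rs_1", "", []string{"final summary"})
	// A late delta after done is still captured, because the entry survives.
	p.handle("response.reasoning_text.delta", "rs_1", "step two", nil)
	fmt.Println(p.active["rs_1"].Text)       // step one; step two
	fmt.Println(p.active["rs_1"].Summary[0]) // final summary
}
```

Keeping cleanup in a single `response.completed` case also makes the lifetime of per-item state obvious: everything accumulated during the stream is readable until the stream ends.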

Background: empirical lenos session 35dd39ec (codex 5.4 multi-turn
audit) showed that turn 1 captured encrypted_content cleanly via the
existing output_item.done capture (PR charmbracelet#71's Fix 2), but
follow-up turns and gpt-5.5 high sessions (ab022528) showed
state.metadata.EncryptedContent stuck at empty even though the API was
streaming reasoning text. Investigation against the official codex CLI
source + multiple reverse-engineered backend proxies (MetaFARS/codex-relay,
hermes-agent issue #5732, satoriweb's protocol notes) confirmed:

- response.completed.output is unreliable on the Codex backend (can be
  empty even when output_item.done events delivered the data).
- The reasoning_text.delta event is a separate channel from
  reasoning_summary_text.delta; both must be handled to capture all
  thinking text emitted by gpt-5.x reasoning variants.
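The first finding suggests treating response.completed.output as a fallback rather than the source of truth. A minimal sketch under that assumption (`finalOutput` and its arguments are invented names, not fantasy's API):

```go
package main

import "fmt"

// finalOutput prefers items captured from output_item.done events, since
// response.completed.output can be empty on the Codex backend even when
// the items were already delivered during the stream.
func finalOutput(captured, completedOutput []string) []string {
	if len(captured) > 0 {
		return captured
	}
	return completedOutput
}

func main() {
	got := finalOutput([]string{"reasoning item", "message item"}, nil)
	fmt.Println(len(got)) // 2
}
```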

This commit reverts the Fix 3 attempt (commit 7ce6466 — re-emitting
ReasoningEnd from response.completed.output) which was based on the
incorrect assumption that completed.output is the source of truth.


3 participants