Internal API update in preparation for Tinker integration #226
Pull Request Overview
This PR updates internal APIs to support upcoming Tinker integration by enhancing token ID handling, logprobs tracing, and configuration flexibility.
Key changes:
- Added support for tracing and processing log probabilities from LLM responses
- Enhanced token ID extraction with null-safety checks and support for skipping empty token IDs
- Added `_add_return_token_ids` configuration parameter to LLMProxy for conditional callback registration
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| examples/.gitignore | Added Tinker-specific ignore patterns for logs and HTML reports |
| agentlightning/llm_proxy.py | Added _add_return_token_ids parameter to control token ID callback registration |
| agentlightning/instrumentation/agentops.py | Enhanced token ID extraction with null checks and added logprobs serialization |
| agentlightning/emitter/reward.py | Added TODO comment for tracer context improvement |
| agentlightning/algorithm/fast.py | Fixed exception handling order to check adapter existence before usage |
| agentlightning/adapter/triplet.py | Added span_to_triplet method and _skip_empty_token_spans parameter for filtering |
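To make the `_add_return_token_ids` change concrete, here is a minimal sketch of conditional callback registration. Everything except the parameter name `_add_return_token_ids` and the class name `LLMProxy` is hypothetical (the callback list, method names, and response shape are invented for illustration, not the actual agentlightning internals):

```python
from typing import Any, Callable, Dict, List


class LLMProxy:
    """Minimal sketch; the real class has many more responsibilities."""

    def __init__(self, _add_return_token_ids: bool = True):
        # Hypothetical: callbacks are plain callables collected in a list.
        self._callbacks: List[Callable[[Dict[str, Any]], Dict[str, Any]]] = []
        if _add_return_token_ids:
            # Register the token-id callback only when requested, so
            # integrations that handle token ids themselves can opt out.
            self._callbacks.append(self._return_token_ids_callback)

    def _return_token_ids_callback(self, response: Dict[str, Any]) -> Dict[str, Any]:
        # Hypothetical callback: copy token ids into a well-known key.
        token_ids = response.get("token_ids")
        if token_ids is not None:
            response["response_token_ids"] = list(token_ids)
        return response
```

The underscore prefix signals that this knob is internal and may change as the Tinker integration stabilizes.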
```python
filtered_llm_calls: List[Tuple[TraceTree, str]] = []
for llm_call, agent_name in llm_calls:
    triplet = self.span_to_triplet(llm_call.span, agent_name)
    # This is a hot-fix for Tinker+CrewAI, which has some anonymous requests outside the trained agent.
```
**Copilot AI** commented on Oct 27, 2025:
Corrected spelling of 'hot-fix' to 'hotfix'.
```diff
- # This is a hot-fix for Tinker+CrewAI, which has some anonymous requests outside the trained agent.
+ # This is a hotfix for Tinker+CrewAI, which has some anonymous requests outside the trained agent.
```
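The filtering loop around this comment could plausibly look like the following sketch. This is a hedged reconstruction, not the actual `triplet.py` code: `span_to_triplet` and `TraceTree` come from the diff above, while the `token_ids` attribute on the returned triplet and the standalone function shape are assumptions made for illustration (the real `_skip_empty_token_spans` parameter from the file summary is modeled here as a plain argument).

```python
from typing import Any, List, Optional, Tuple


def filter_llm_calls(
    adapter: Any,
    llm_calls: List[Tuple[Any, str]],
    skip_empty_token_spans: bool = True,
) -> List[Tuple[Any, str]]:
    """Keep only LLM calls whose spans yield usable triplets.

    Sketch of the hotfix idea: anonymous Tinker+CrewAI requests outside the
    trained agent produce spans without token ids, so they are dropped.
    """
    filtered: List[Tuple[Any, str]] = []
    for llm_call, agent_name in llm_calls:
        # Hypothetical: span_to_triplet returns None for spans it cannot convert.
        triplet = adapter.span_to_triplet(llm_call.span, agent_name)
        if triplet is None:
            continue
        # Skip spans that carry no token ids (empty or missing), when requested.
        if skip_empty_token_spans and not getattr(triplet, "token_ids", None):
            continue
        filtered.append((llm_call, agent_name))
    return filtered
```

The point of the opt-in flag is that some consumers may want every span, including those anonymous requests, so the filter should not be unconditional.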
```python
attributes["response_token_ids"] = list(first_choice.provider_specific_fields["token_ids"])

# log probability
# This is temporarily. We need a unified conventions for classifying and naming logprobs.
```
**Copilot AI** commented on Oct 27, 2025:
Corrected 'temporarily' to 'temporary' and 'conventions' to 'convention' for grammatical accuracy.
```diff
- # This is temporarily. We need a unified conventions for classifying and naming logprobs.
+ # This is temporary. We need a unified convention for classifying and naming logprobs.
```
|
/ci

✅ CI retrigger requested by @ultmaster. Fired

/ci

🚀 CI Watcher for correlation id-3452083391-mh9buyit triggered by comment 3452083391
✅ All runs completed.

🚀 CI Watcher for correlation id-3452075952-mh9bva2a triggered by comment 3452075952
✅ All runs completed.
This PR includes several hot fixes in preparation for the Tinker example.
- `_add_return_token_ids` in LLMProxy.

Most API updates are very experimental (and underscored). They exist only to make the Tinker integration work quickly.
We may need to rethink: