chore(wren-ai-service): update deps #1892
Conversation
Walkthrough
Updated wren-ai-service/pyproject.toml to remove the litellm "proxy" extra, keeping the version at ^1.75.2.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes
Actionable comments posted: 0
🧹 Nitpick comments (1)
wren-ai-service/pyproject.toml (1)
36-36: Action: confirm removal of the litellm[proxy] extra — repo search found no in-repo proxy usage, but verify ops/runtime artifacts.
Short summary:
- I searched the repo for litellm.proxy imports and CLI invocations — no occurrences found.
- External lookup shows litellm[proxy] adds many server/infra deps (e.g., fastapi, uvicorn, redis, boto3, httpx, websockets) and enables the OpenAI‑compatible proxy/management features.
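For concreteness, a minimal sketch of the two dependency forms in Poetry's pyproject.toml syntax — the extras form is standard Poetry notation but is an assumption about how the line previously read, not a quote from this diff:

```toml
[tool.poetry.dependencies]
# Current form after this PR: plain dependency, no extras
litellm = "^1.75.2"

# Prior form with the proxy extra (assumed shape, standard Poetry extras syntax);
# pulls in the extra server/infra deps (fastapi, uvicorn, redis, ...) transitively
# litellm = {version = "^1.75.2", extras = ["proxy"]}
```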
Files/places to double-check (recommended):
- wren-ai-service/pyproject.toml — verify the litellm line is intentionally without extras:
  litellm = "^1.75.2"
- Ops/runtime artifacts and build steps that may be outside source files or injected at build time:
- Dockerfiles / docker-compose.yml / container entrypoints
- Makefiles, shell scripts, README/deployment docs
- CI workflows (.github/workflows) and image build pipelines
- Kubernetes manifests or other deploy manifests
- Any runtime code that expects proxy features (key management, admin UI, OpenAI‑compatible gateway, caching/queue integrations).
Suggested actions:
- If the proxy features are required at runtime, re-add litellm[proxy] or explicitly list the needed packages (fastapi/uvicorn/redis/etc.) in pyproject (see the sketch after this list).
- If not required, no code change needed beyond confirming the ops artifacts noted above.
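If the extra does turn out to be needed, it can be restored in the form sketched above; the alternative of pinning the underlying packages explicitly might look roughly like this (package names and versions are illustrative assumptions, not read from litellm's extras definition or this repo's lockfile):

```toml
[tool.poetry.dependencies]
litellm = "^1.75.2"
# Proxy-related server deps declared directly instead of via the extra;
# versions here are placeholders — pin against litellm's actual requirements
fastapi = "^0.115.0"
uvicorn = "^0.30.0"
redis = "^5.0.0"
```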
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
wren-ai-service/poetry.lock is excluded by !**/*.lock
📒 Files selected for processing (1)
wren-ai-service/pyproject.toml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Analyze (go)