fix no max_tokens in payload when the vision model name does not contain 'vision' #5304
Dogtiti merged 1 commit into ChatGPTNextWeb:main
Conversation
@dustookk is attempting to deploy a commit to the NextChat Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough
The recent modifications to app/client/platforms/openai.ts simplify the condition under which max_tokens is included in vision model requests.
Your build has completed!
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- app/client/platforms/openai.ts (1 hunks)
Additional comments not posted (1)
app/client/platforms/openai.ts (1)
193-194: LGTM! Simplified condition for max_tokens.
The removal of the additional condition ensures that max_tokens is set for all vision models, improving flexibility and aligning with the PR objectives.
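For context, a minimal before/after sketch of the guard under review; the surrounding payload-building code and the exact body of the block are assumptions inferred from the review comment, not a verbatim quote of the diff:

```ts
// Before: max_tokens was only added when the model name also contained
// "preview", so vision-capable models without that suffix never received it.
if (visionModel && modelConfig.model.includes("preview")) {
  requestPayload["max_tokens"] = Math.max(modelConfig.max_tokens, 4000);
}

// After: any detected vision model gets max_tokens in the payload.
if (visionModel) {
  requestPayload["max_tokens"] = Math.max(modelConfig.max_tokens, 4000);
}
```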
💻 变更类型 | Change Type
🔀 变更说明 | Description of Change
Fix no 'max_tokens' in the payload when the vision model name does not contain 'vision'
📝 补充信息 | Additional Information
The condition && modelConfig.model.includes("preview") prevents regular vision models from entering the if block, so max_tokens is never set. The bug observed locally: after uploading an image, OpenAI's reply was cut off mid-stream. Removing the && clause restores normal behavior, as the sketch below illustrates.
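A small runnable illustration of the failure mode described above; the model name "gpt-4o" and the standalone visionModel flag are hypothetical examples for demonstration, not values taken from the diff:

```ts
// Hypothetical example: a vision-capable model whose name contains
// neither "vision" nor "preview".
const model = "gpt-4o";
const visionModel = true; // assume vision capability was detected elsewhere

// Old guard: evaluates to false for "gpt-4o", so max_tokens was never set.
const oldGuard = visionModel && model.includes("preview");

// New guard after the fix: true for any detected vision model.
const newGuard = visionModel;

console.log({ oldGuard, newGuard }); // { oldGuard: false, newGuard: true }
```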
Summary by CodeRabbit
New Features
- Improved handling of the max_tokens parameter in vision model requests, enhancing compatibility across various model configurations.

Bug Fixes

- Ensured max_tokens is applied more consistently.